trawl is a high-performance, on-premise, AI-powered research assistant with a terminal interface. Inspired by modern search assistants like Perplexity, it brings deep-dive research capabilities directly to your command line.
- Streaming Responses: Real-time markdown streaming with live spinners and status updates in the terminal.
- DeepSearch Mode: Analyze complex queries by automatically splitting them into 3 verticals, researching 45+ sources simultaneously, and synthesizing the findings into a comprehensive long-form response.
- Stable Research UI: A structured terminal layout using Rich to show research progress, sources, and media links without flickering.
- Visual Research Insights: Automatic detection and display of relevant images and videos discovered during research.
- Integrated Configuration: Manage your provider (Gemini/Ollama), API keys, and models directly from the CLI.
- Source Citations: Clean display of research sources with clickable links (where supported by the terminal).
- Persistent Threads: Full chat history support powered by PostgreSQL and SQLAlchemy.
- Premium TUI: A sleek, customizable terminal interface with both Light and Dark themes.
- FastAPI Backend: Robust, asynchronous backend architecture for multi-step research.
- Vector Search: pgvector-powered semantic search for efficient document retrieval.
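To make the DeepSearch flow described above concrete, here is a minimal, hypothetical sketch of the fan-out/synthesize pattern it implies: split a query into three verticals, research them concurrently, and merge the findings. The function names below are illustrative placeholders, not trawl's actual API:

```python
import asyncio

async def research_vertical(vertical: str) -> str:
    # Placeholder for fetching and summarizing sources for one vertical.
    await asyncio.sleep(0)  # stands in for real network I/O
    return f"findings for {vertical}"

async def deep_search(query: str) -> str:
    # 1. Split the query into three verticals (a real system might ask an LLM).
    angles = ("overview", "technical details", "recent developments")
    verticals = [f"{query} ({angle})" for angle in angles]
    # 2. Research all verticals concurrently.
    findings = await asyncio.gather(*(research_vertical(v) for v in verticals))
    # 3. Synthesize into one long-form answer (here: simple concatenation).
    return "\n\n".join(findings)

if __name__ == "__main__":
    print(asyncio.run(deep_search("How do bees fly?")))
```

The key design point is step 2: `asyncio.gather` lets all verticals proceed at once, so total latency is bounded by the slowest vertical rather than the sum of all three.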
Here's what's coming next to make trawl even more powerful. Contributions towards any of these are especially welcome!
| Status | Feature | Description |
|---|---|---|
| [ ] | More LLM Providers | Support for Anthropic, OpenAI, Mistral, and other popular providers beyond Gemini and Ollama. |
| [x] | Research Modes | Multiple depth levels (General search and DeepSearch) for comprehensive research. |
| [ ] | Academic Search | Specialized research mode preferring academic sources like arXiv, PubMed, and Google Scholar. |
| [ ] | Follow-up Question Suggestions | After each answer, trawl will surface related questions you can instantly continue the thread with. |
| [ ] | File Upload & Analysis | Drop in a PDF, DOCX, or folder and ask questions against your local files — combined with live web search. |
- Frontend: Textual (Rich TUI Framework)
- Backend: FastAPI (Asynchronous Python Web Framework)
- Database: PostgreSQL with pgvector and SQLAlchemy ORM
- LLM Engine: Integration with Google Gemini / Ollama
- Search Engine: SearXNG
- Embeddings: Sentence Transformers
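Semantic retrieval with this stack boils down to ranking stored chunks by vector similarity between a query embedding and document embeddings. The toy sketch below uses tiny hand-written 3-dimensional vectors to show the idea; in trawl, the actual embeddings come from Sentence Transformers, and the nearest-neighbor ranking runs inside PostgreSQL via pgvector:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Cosine similarity: dot product normalized by vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hand-written stand-ins; real embeddings have hundreds of dimensions.
documents = {
    "bee flight dynamics": [0.9, 0.1, 0.0],
    "postgres tuning":     [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]

best = max(documents, key=lambda d: cosine_similarity(query, documents[d]))
print(best)  # "bee flight dynamics" — the vector closest in direction to the query
```

pgvector exposes the same ranking as a SQL distance operator over an indexed column, so retrieval stays inside the database instead of loading every embedding into Python.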
The fastest way to get trawl up and running is with Docker. This starts the backend, database, and search engine automatically.

```bash
# 1. Clone and enter the repository
git clone https://github.com/udaykumar-dhokia/trawl.git
cd trawl

# 2. Start the stack (this includes Ollama, PostgreSQL, and SearXNG)
docker compose up -d
```

Once the containers are running, you can use the trawl CLI to perform research or manage configuration.

```bash
# Run research through the Docker backend
docker exec -it trawl_backend uv run trawl research "How can we create AI Agents with LangChain?"

# View configuration inside Docker
docker exec -it trawl_backend uv run trawl config view
```

- Python 3.10+
- uv package manager
- PostgreSQL database
- SearXNG instance (optional, for web search)
```bash
git clone https://github.com/udaykumar-dhokia/trawl.git
cd trawl
uv sync
```

Copy the environment template:

```bash
cp .env.example .env
```

Edit `.env` with your database and API keys.
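For reference, a filled-in `.env` might look like the sketch below. The variable names here are hypothetical placeholders — the authoritative list of keys is whatever `.env.example` ships with:

```env
# Hypothetical values — copy the real keys from .env.example
DATABASE_URL=postgresql://user:password@localhost:5432/trawl
GOOGLE_API_KEY=your-gemini-api-key
```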
```bash
# Basic research query
trawl research "Your query here"

# Short flag
trawl research --q "Your query here"

# DeepSearch mode (more sources, 3-vertical analysis, comprehensive answer)
trawl research "Your query here" --deep
```

```bash
# View current settings
trawl config view

# Update a setting
trawl config set provider google
trawl config set google_api_key YOUR_API_KEY
```

```bash
# Start the interactive dashboard
trawl tui
```

```python
import asyncio

from trawl.services.invoke_chat import invoke_chat


async def main():
    async for chunk in invoke_chat(query="How do bees fly?"):
        print(chunk)


if __name__ == "__main__":
    asyncio.run(main())
```

```bash
# Run the development setup script
./scripts/setup-dev.sh

# Or manually
uv sync --dev
cp .env.example .env
```

```bash
# Run all tests
uv run pytest

# With coverage
uv run pytest --cov=trawl

# Run specific test
uv run pytest tests/test_specific.py
```

```bash
# Lint code
uv run ruff check .

# Format code
uv run ruff format .

# Type checking
uv run mypy .
```

```bash
# Build package
uv build

# Or using make
make build
```

```
trawl/
├── cli.py          # Command-line interface
├── main.py         # FastAPI application
├── tui_app.py      # Textual TUI interface
├── core/           # Core configuration and utilities
├── db/             # Database models and connections
├── models/         # SQLAlchemy models
├── schemas/        # Pydantic schemas
├── services/       # Business logic services
└── utils/          # Utility functions
tests/              # Test suite
docs/               # Documentation
examples/           # Usage examples
scripts/            # Development scripts
.github/            # GitHub Actions and templates
```
We welcome contributions! Please see our Contributing Guide for details.
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
See CHANGELOG.md for version history.
- 📖 Documentation
- 🐛 Issues
- 💬 Discussions
| Shortcut | Action |
|---|---|
| Ctrl + N | New Chat |
| Ctrl + R | Refresh Chat List |
| Escape | Blur / Exit Input |
| Ctrl + C | Quit Application |
Built with ❤️ by udthedeveloper
