This project is used on my profile page: iank.me. It is built with Mistral AI, OpenAI, Groq, LangChain, and the Chroma vector database. I'm still learning, and this is my first AI project.
- OpenAI
- MistralAI
- Groq
- Deepseek
- Ollama
- Huggingface
- Nvidia
- Redis (LLM Cache and setup)
- PostgreSQL (History)
- ChromaDB (Vector Database)
- MinIO (Object Storage)
Run via Poetry. First activate the virtual environment:

```shell
poetry env activate
```

Run the server:

```shell
poetry run http serve
```

or

```shell
python main.py serve
```
```shell
docker compose up
poetry install
cp env.example .env
```
Please set `APP_ENVIRONTMENT` to `local` if you are running in development mode.
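A minimal `.env` sketch for local development. The provider-key variable names below are illustrative assumptions — check `env.example` for the real keys:

```env
APP_ENVIRONTMENT=local
# Illustrative provider keys — confirm the actual names in env.example
OPENAI_API_KEY=...
MISTRAL_API_KEY=...
GROQ_API_KEY=...
```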
I'm using Alembic migrations via a custom Poetry command.

Create a migration:

```shell
poetry run db:migrate create "your table name"
```

Apply migrations (upgrade):

```shell
poetry run db:migrate up
```

Downgrade a migration:

```shell
poetry run db:migrate down "your revision"
```
Swagger docs: http://localhost:8081/docs

- username: admin
- password: admin
url:

```
localhost:8081/ex/v1/chat/ws/{client_id}
```

payload:

```jsonc
{
  "chat": "hello",
  "collection": "ocha_v2",
  "llm": "openai",        // optional
  "model": "gpt-4o-mini"  // optional
}
```
```mermaid
graph TB
    User((External User))

    subgraph "Ochabot System"
        subgraph "API Layer"
            FastAPI["API Server<br>(FastAPI)"]
            Router["Router<br>(FastAPI Router)"]
            WebSocket["WebSocket Handler<br>(FastAPI WebSocket)"]

            subgraph "API Components"
                ChatHandler["Chat Handler<br>(Python)"]
                UserHandler["User Handler<br>(Python)"]
                IngestHandler["Ingest Handler<br>(Python)"]
                PromptHandler["Prompt Handler<br>(Python)"]
                SetupHandler["Setup Handler<br>(Python)"]
                ClientHandler["Client Handler<br>(Python)"]
                LoginHandler["Login Handler<br>(Python)"]
            end
        end

        subgraph "LLM Services"
            LLMWrapper["LLM Wrapper<br>(Python)"]

            subgraph "LLM Providers"
                OpenAI["OpenAI Service<br>(OpenAI API)"]
                Mistral["Mistral Service<br>(Mistral API)"]
                Groq["Groq Service<br>(Groq API)"]
                DeepSeek["DeepSeek Service<br>(DeepSeek API)"]
                Ollama["Ollama Service<br>(Ollama API)"]
            end
        end

        subgraph "Data Storage"
            PostgreSQL[("PostgreSQL<br>(Primary Database)")]
            Redis[("Redis<br>(Cache)")]
            MinIO[("MinIO<br>(Object Storage)")]
            ChromaDB[("ChromaDB<br>(Vector Store)")]
        end

        subgraph "Core Services"
            DatabaseService["Database Service<br>(SQLAlchemy)"]
            VectorService["Vector Store Service<br>(LangChain)"]
            CacheService["Cache Service<br>(Redis Stack)"]
            StorageService["Storage Service<br>(MinIO)"]
        end
    end

    %% Connections
    User -->|"HTTP/WebSocket"| FastAPI
    FastAPI -->|"Routes"| Router
    Router -->|"Handles WebSocket"| WebSocket

    %% API Components connections
    Router --> ChatHandler
    Router --> UserHandler
    Router --> IngestHandler
    Router --> PromptHandler
    Router --> SetupHandler
    Router --> ClientHandler
    Router --> LoginHandler

    %% LLM Service connections
    ChatHandler --> LLMWrapper
    LLMWrapper --> OpenAI
    LLMWrapper --> Mistral
    LLMWrapper --> Groq
    LLMWrapper --> DeepSeek
    LLMWrapper --> Ollama

    %% Data Storage connections
    DatabaseService --> PostgreSQL
    CacheService --> Redis
    StorageService --> MinIO
    VectorService --> ChromaDB

    %% Service Usage connections
    ChatHandler --> DatabaseService
    UserHandler --> DatabaseService
    IngestHandler --> DatabaseService
    PromptHandler --> DatabaseService
    SetupHandler --> DatabaseService
    ClientHandler --> DatabaseService
    LoginHandler --> DatabaseService
    ChatHandler --> VectorService
    IngestHandler --> VectorService
    ChatHandler --> CacheService
    UserHandler --> CacheService
    IngestHandler --> StorageService
```