- Multi-Provider AI Support: Seamlessly switch between OpenAI, Anthropic, Google, Groq, DeepSeek, XAI, Azure, and OpenRouter
- Real-time Streaming: Live response streaming with delta updates and typing indicators
- Smart Chat Management: Automatic title generation, conversation history
- Modern UI/UX: Beautiful dark/light theme with responsive design and smooth animations
- Plugin Architecture: Extend functionality with Model Context Protocol (MCP) applications
- Pre-built Integrations: 15+ ready-to-use apps including:
- 📝 Productivity: Notion, Outline, ClickUp, GitHub
- 💾 Databases: MongoDB, with Atlas support
- 🌐 Web Services: Fetch, TMDB, Chess.com
- 💬 Communication: Email client with IMAP/SMTP
- 🧠 Memory: Persistent conversation memory
- 🔧 System: FurryOS (NixOS package manager), Gitea
- Kubernetes Native: Full containerization with auto-scaling capabilities
- User Authentication: OIDC integration with multi-user support
- Resource Management: Configurable CPU/memory limits and persistent storage
- Security: API key management, user isolation, and secure communication
Aigin follows a modern microservices architecture designed for scalability and maintainability:
```
┌─────────────────┐      ┌──────────────────┐      ┌─────────────────┐
│   Next.js App   │      │   Hono Server    │      │     MongoDB     │
│   (Frontend)    │◄──► │    (Backend)     │◄──► │   (Database)    │
└─────────────────┘      └──────────────────┘      └─────────────────┘
                                  │
                                  ▼
                         ┌──────────────────┐
                         │    Kubernetes    │
                         │    (MCP Apps)    │
                         └──────────────────┘
                                  │
                  ┌───────────────┴───────────────┐
                  ▼                               ▼
           ┌─────────────┐                 ┌─────────────┐
           │  MCP App 1  │       ...       │  MCP App N  │
           │    (Pod)    │                 │    (Pod)    │
           └─────────────┘                 └─────────────┘
```
- Framework: Next.js 15 with App Router
- Styling: shadcn/ui
- State Management: TanStack Query + tRPC for type-safe API calls
- UI Components: Radix UI primitives with custom styling
- Real-time: tRPC subscriptions for live chat updates
- Runtime: Node.js with Hono web framework
- API: tRPC for end-to-end type safety
- AI Integration: Vercel AI SDK with custom provider registry
- Authentication: OIDC with automatic user provisioning
- Database: MongoDB with Mongoose ODM
- Orchestration: Kubernetes for MCP app deployment
- Containerization: Docker with multi-stage builds
- Storage: Persistent volumes for stateful apps
- Networking: Service mesh with automatic discovery
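To make the orchestration layer concrete, here is a minimal sketch of how an MCP app definition could be turned into a Kubernetes Pod manifest. The `McpAppSpec` shape mirrors the app definition shown later in this README; the function name, labels, and resource limits are illustrative assumptions, not Aigin's actual implementation.

```typescript
// Hypothetical sketch: building a Kubernetes Pod manifest for an MCP app.
// Field names (slug, image, runCommand) mirror the app definition format
// shown later in this README; resource limits are illustrative defaults.
interface McpAppSpec {
  slug: string;
  image: string;
  runCommand: string;
  env: Record<string, string>;
}

function buildPodManifest(app: McpAppSpec, userId: string) {
  return {
    apiVersion: 'v1',
    kind: 'Pod',
    metadata: {
      // One pod per user per app keeps users isolated from each other.
      name: `mcp-${app.slug}-${userId}`,
      labels: { 'aigin/app': app.slug, 'aigin/user': userId },
    },
    spec: {
      containers: [
        {
          name: app.slug,
          image: app.image,
          command: [app.runCommand],
          env: Object.entries(app.env).map(([name, value]) => ({ name, value })),
          resources: {
            // Configurable CPU/memory limits (illustrative values).
            limits: { cpu: '500m', memory: '256Mi' },
          },
        },
      ],
    },
  };
}

const manifest = buildPodManifest(
  {
    slug: 'my-app',
    image: 'my-org/my-app:latest',
    runCommand: 'my-app-binary',
    env: { MY_APP_API_KEY: 'secret' },
  },
  'user-123',
);
```

Naming the pod per user and app is one way to get the user isolation mentioned above, since each user's integrations run in their own containers.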
Aigin supports 30+ flagship models across multiple providers:
| Provider | Models | Capabilities |
|---|---|---|
| OpenAI | GPT-4.1, GPT-4o, o3, o4-mini | Vision, Reasoning, Tools |
| Anthropic | Claude 3.5/3.7 Sonnet | Vision, Files, Analysis |
| Google | Gemini 2.5 Pro | Vision, Search, Reasoning |
| Meta | Llama 4 Scout/Maverick | Vision, Open Source |
| DeepSeek | v3, R1 | Reasoning, Cost-effective |
| XAI | Grok 3, Grok 3 Mini | Real-time, Reasoning |
| OpenRouter | 100+ models* | Unified access |
\*Note: for demo purposes, not all OpenRouter models are enabled.
- Reasoning Models: Support for o3, DeepSeek R1, Grok 3
- Vision Capabilities: Image analysis across multiple providers
- Tool Calling: Function calling with MCP integration
- Context Management: Smart context window handling
- Cost Optimization: Automatic model selection based on task
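The "automatic model selection based on task" idea above can be sketched as a simple routing function. The model ids come from the table above; the routing rules and the `Task` shape are illustrative assumptions, not Aigin's actual logic.

```typescript
// Hypothetical sketch of cost-aware model routing. The model ids follow a
// "provider:model" convention; the rules below are illustrative only.
type Task = {
  needsVision: boolean;
  needsReasoning: boolean;
  budget: 'low' | 'high';
};

function selectModel(task: Task): string {
  if (task.needsReasoning) {
    // Prefer a cost-effective reasoning model when budget is tight.
    return task.budget === 'low' ? 'deepseek:r1' : 'openai:o3';
  }
  if (task.needsVision) return 'openai:gpt-4o';
  return task.budget === 'low' ? 'deepseek:v3' : 'anthropic:claude-3-7-sonnet';
}
```

Centralizing the choice in one function keeps the rest of the chat pipeline model-agnostic: callers describe the task, not the model.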
| App | Type | Description |
|---|---|---|
| 📝 Notion | Productivity | Note-taking and organization |
| 📄 Outline | Documentation | Team knowledge base |
| ✅ ClickUp | Project Management | Task and project tracking |
| 🐙 GitHub | Development | Repository management |
| 🗄️ MongoDB | Database | NoSQL database operations |
| 🌐 Fetch | Web | HTTP requests and web scraping |
| 📧 Email | Communication | IMAP/SMTP email client |
| 🧠 Memory | AI Enhancement | Persistent conversation memory |
| 📦 FurryOS | System | NixOS package management |
1. Define your app in `server/src/constants/apps.ts`:

   ```typescript
   {
     type: 'container/stdio',
     slug: 'my-app',
     name: 'My Custom App',
     description: 'Custom integration',
     configuration: [
       {
         id: 'api_key',
         name: 'API Key',
         description: 'Your API key'
       }
     ],
     environment: [
       {
         variable: 'MY_APP_API_KEY',
         template: '{{api_key}}'
       }
     ],
     image: 'my-org/my-app:latest',
     runCommand: 'my-app-binary'
   }
   ```

2. Configure in UI: users can enable and configure apps through the settings panel.

3. Use in chats: reference the app with `@{app:my-app} your prompt here`.
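The `template: '{{api_key}}'` field above implies a substitution step that fills environment variables from the user's saved configuration. A minimal sketch of such a renderer (the helper name is hypothetical):

```typescript
// Hypothetical template renderer: replaces {{key}} placeholders with values
// from the user's app configuration, leaving unknown placeholders untouched.
function renderTemplate(template: string, config: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match: string, key: string) =>
    key in config ? config[key] : match,
  );
}

// MY_APP_API_KEY would receive the value the user entered for `api_key`.
const value = renderTemplate('{{api_key}}', { api_key: 'sk-123' });
```

Leaving unresolved placeholders intact makes missing configuration easy to spot when the container starts.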
CachedChat class (`server/src/ai/generation-manager.ts`):
- Manages real-time message streaming
- Handles tool calling and MCP integration
- Automatic title generation
- Database synchronization
Dynamic provider management with middleware:
- Provider Registration: Auto-discovery of available models
- Middleware System: Custom handling for different providers
- Model Wrapping: Unified interface across providers
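A registry with a unified `provider:model` id scheme and per-provider middleware might look like the sketch below. This is illustrative only, similar in spirit to the Vercel AI SDK's provider registry but not Aigin's actual code.

```typescript
// Hypothetical provider registry: resolves "provider:model" ids and applies
// optional per-provider middleware to wrap models behind a unified interface.
type Middleware = (modelId: string) => string;

class ProviderRegistry {
  private providers = new Map<string, { models: string[]; middleware?: Middleware }>();

  register(name: string, models: string[], middleware?: Middleware): void {
    this.providers.set(name, { models, middleware });
  }

  // Resolve e.g. "openai:gpt-4o", applying the provider's middleware if any.
  resolve(id: string): string {
    const [provider, model] = id.split(':');
    const entry = this.providers.get(provider);
    if (!entry || !entry.models.includes(model)) {
      throw new Error(`Unknown model: ${id}`);
    }
    return entry.middleware ? entry.middleware(model) : model;
  }
}

const registry = new ProviderRegistry();
registry.register('openai', ['gpt-4o', 'o3']);
registry.register('deepseek', ['r1'], (m) => `reasoning-wrapped:${m}`);
```

Middleware at registration time keeps provider quirks (reasoning wrappers, header tweaks) out of the chat pipeline entirely.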
tRPC Subscriptions:
- `chat.stream`: Live message updates
- `chat.getAll`: Chat history with real-time sync
- Event-driven architecture with user isolation
- Delta Updates: Character-by-character streaming
- Event System: Real-time chat status and notifications
- Auto-scroll: Smart scrolling with user control
- Typing Indicators: Visual feedback during generation
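The delta-update mechanism above boils down to accumulating streamed chunks into the message until a completion event arrives. A minimal sketch (the event shape is an assumption, not Aigin's actual wire format):

```typescript
// Hypothetical delta accumulator: streamed chunks are appended to the
// message content until a 'done' event marks the end of generation.
type DeltaEvent = { type: 'delta'; text: string } | { type: 'done' };

function applyDeltas(events: DeltaEvent[]): { content: string; complete: boolean } {
  let content = '';
  let complete = false;
  for (const ev of events) {
    if (ev.type === 'delta') {
      content += ev.text; // append each streamed chunk in order
    } else {
      complete = true; // 'done' ends the stream; typing indicator stops here
    }
  }
  return { content, complete };
}
```

On the client, the same fold runs incrementally as subscription events arrive, which is what drives the typing indicator and auto-scroll.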
- Smart Titles: AI-generated conversation titles
- Context Awareness: MCP apps provide rich context
- Model Selection: Automatic selection of the optimal model
- Error Handling: Graceful fallbacks and retries
- Usage Tracking: Model usage and performance metrics
- Resource Monitoring: Kubernetes pod health
- Error Reporting: Comprehensive logging system
This project is licensed under the MIT License - see the LICENSE file for details.
- Vercel AI SDK for AI integration
- Model Context Protocol for extensibility
- shadcn/ui for beautiful components
- tRPC for type-safe APIs
This document was mostly generated using AI heh