aginrocks/aigin

# 🤖 Aigin

Aigin Logo

Next-generation AI chat platform with extensive model support and powerful extensibility



## ✨ Features

### 🎯 Core Capabilities

- **Multi-Provider AI Support**: Seamlessly switch between OpenAI, Anthropic, Google, Groq, DeepSeek, xAI, Azure, and OpenRouter
- **Real-time Streaming**: Live response streaming with delta updates and typing indicators
- **Smart Chat Management**: Automatic title generation and conversation history
- **Modern UI/UX**: Dark/light theme with responsive design and smooth animations

### 🔧 Extensibility (MCP Integration)

- **Plugin Architecture**: Extend functionality with Model Context Protocol (MCP) applications
- **Pre-built Integrations**: 15+ ready-to-use apps, including:
  - 📝 **Productivity**: Notion, Outline, ClickUp, GitHub
  - 💾 **Databases**: MongoDB, with Atlas support
  - 🌐 **Web Services**: Fetch, TMDB, Chess.com
  - 💬 **Communication**: Email client with IMAP/SMTP
  - 🧠 **Memory**: Persistent conversation memory
  - 🔧 **System**: FurryOS (NixOS package manager), Gitea

πŸ›‘οΈ Enterprise Ready

  • Kubernetes Native: Full containerization with auto-scaling capabilities
  • User Authentication: OIDC integration with multi-user support
  • Resource Management: Configurable CPU/memory limits and persistent storage
  • Security: API key management, user isolation, and secure communication

πŸ—οΈ Architecture

Aigin follows a modern microservices architecture designed for scalability and maintainability:

```
┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│   Next.js App   │    │   Hono Server    │    │   MongoDB       │
│   (Frontend)    │◄──►│   (Backend)      │◄──►│   (Database)    │
└─────────────────┘    └──────────────────┘    └─────────────────┘
         │                      │
         │             ┌──────────────────┐
         │             │   Kubernetes     │
         │             │   (MCP Apps)     │
         └─────────────┼──────────────────┼───────────────
                       │                  │
               ┌───────▼────┐    ┌────────▼────┐
               │ MCP App 1  │    │ MCP App N   │
               │ (Pod)      │    │ (Pod)       │
               └────────────┘    └─────────────┘
```

### 🎨 Frontend (Client)

- **Framework**: Next.js 15 with App Router
- **Styling**: shadcn/ui (Tailwind-based components)
- **State Management**: TanStack Query + tRPC for type-safe API calls
- **UI Components**: Radix UI primitives with custom styling
- **Real-time**: tRPC subscriptions for live chat updates

### ⚡ Backend (Server)

- **Runtime**: Node.js with the Hono web framework
- **API**: tRPC for end-to-end type safety
- **AI Integration**: Vercel AI SDK with a custom provider registry
- **Authentication**: OIDC with automatic user provisioning
- **Database**: MongoDB with Mongoose ODM

### 🚀 Infrastructure

- **Orchestration**: Kubernetes for MCP app deployment
- **Containerization**: Docker with multi-stage builds
- **Storage**: Persistent volumes for stateful apps
- **Networking**: Service mesh with automatic discovery

πŸŽ›οΈ Supported AI Models

Aigin supports 30+ flagship models across multiple providers:

Provider Models Capabilities
OpenAI GPT-4.1, GPT-4o, o3, o4-mini Vision, Reasoning, Tools
Anthropic Claude 3.5/3.7 Sonnet Vision, Files, Analysis
Google Gemini 2.5 Pro Vision, Search, Reasoning
Meta Llama 4 Scout/Maverick Vision, Open Source
DeepSeek v3, R1 Reasoning, Cost-effective
XAI Grok 3, Grok 3 Mini Real-time, Reasoning
OpenRouter 100+ models* Unified access

NOTE: For demo purposes not all OpenRouter models are enabled

### 🧠 Advanced Features

- **Reasoning Models**: Support for o3, DeepSeek R1, and Grok 3
- **Vision Capabilities**: Image analysis across multiple providers
- **Tool Calling**: Function calling with MCP integration
- **Context Management**: Smart context window handling
- **Cost Optimization**: Automatic model selection based on task
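The cost-optimization point amounts to a routing decision. A minimal sketch of what task-based model selection could look like; this is hypothetical, not Aigin's actual logic, and the model keys and length threshold are illustrative:

```typescript
// Hypothetical sketch of cost-aware model routing. The threshold and
// model identifiers below are illustrative, not Aigin's real logic.
type Task = { prompt: string; needsReasoning: boolean };

function selectModel(task: Task): string {
  if (task.needsReasoning) return 'deepseek:r1'; // cost-effective reasoning
  if (task.prompt.length > 2000) return 'google:gemini-2.5-pro'; // long context
  return 'openai:gpt-4o'; // general-purpose default
}
```

A real router would also weigh per-token pricing, tool-calling support, and vision needs; the point is that the choice can be made automatically per request.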

## 🔌 MCP Applications

### Available Apps

| App        | Type               | Description                    |
| ---------- | ------------------ | ------------------------------ |
| 📝 Notion  | Productivity       | Note-taking and organization   |
| 📊 Outline | Documentation      | Team knowledge base            |
| ✅ ClickUp | Project Management | Task and project tracking      |
| 🐙 GitHub  | Development        | Repository management          |
| 🗄️ MongoDB | Database           | NoSQL database operations      |
| 🌐 Fetch   | Web                | HTTP requests and web scraping |
| 📧 Email   | Communication      | IMAP/SMTP email client         |
| 🧠 Memory  | AI Enhancement     | Persistent conversation memory |
| 📦 FurryOS | System             | NixOS package management       |

πŸ› οΈ Adding Custom Apps

  1. Define your app in server/src/constants/apps.ts:
{
  type: 'container/stdio',
  slug: 'my-app',
  name: 'My Custom App',
  description: 'Custom integration',
  configuration: [
    {
      id: 'api_key',
      name: 'API Key',
      description: 'Your API key'
    }
  ],
  environment: [
    {
      variable: 'MY_APP_API_KEY',
      template: '{{api_key}}'
    }
  ],
  image: 'my-org/my-app:latest',
  runCommand: 'my-app-binary'
}
  1. Configure in UI: Users can enable and configure apps through the settings panel

  2. Use in chats: Reference with @{app:my-app} your prompt here
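The `environment` entries pair a variable name with a `{{…}}` template over the user's configuration values. A minimal sketch of how that substitution could work when an app's pod is launched; the function and type names are illustrative, not Aigin's internals:

```typescript
// Hypothetical sketch: expand `{{id}}` placeholders in an app's
// environment templates using the user's saved configuration.
type EnvTemplate = { variable: string; template: string };

function expandEnvironment(
  env: EnvTemplate[],
  config: Record<string, string>,
): Record<string, string> {
  const result: Record<string, string> = {};
  for (const { variable, template } of env) {
    // Substitute every {{key}} placeholder with the configured value.
    result[variable] = template.replace(
      /\{\{(\w+)\}\}/g,
      (_match, key: string) => config[key] ?? '',
    );
  }
  return result;
}

// e.g. expandEnvironment(
//   [{ variable: 'MY_APP_API_KEY', template: '{{api_key}}' }],
//   { api_key: 'sk-123' },
// ) yields { MY_APP_API_KEY: 'sk-123' }
```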

## 🎨 Core Functions & Components

### 🔄 Chat Generation System

**`CachedChat` class** (`server/src/ai/generation-manager.ts`):

- Manages real-time message streaming
- Handles tool calling and MCP integration
- Generates chat titles automatically
- Synchronizes state with the database

### 🎯 Model Registry (`server/src/ai/registry.ts`)

Dynamic provider management with middleware:

- **Provider Registration**: Auto-discovery of available models
- **Middleware System**: Custom handling for different providers
- **Model Wrapping**: Unified interface across providers
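The registry pattern above can be sketched as a map keyed by `provider:model`, with per-provider middleware applied at registration time. This is a hand-rolled illustration under assumed shapes; the real `registry.ts` builds on the Vercel AI SDK's provider registry rather than this code:

```typescript
// Hypothetical sketch of a provider registry with middleware wrapping.
type Model = { id: string; generate: (prompt: string) => string };
type Middleware = (model: Model) => Model;

class ModelRegistry {
  private models = new Map<string, Model>();
  private middleware = new Map<string, Middleware>();

  // Register per-provider middleware (e.g. reasoning-token handling).
  use(provider: string, mw: Middleware): void {
    this.middleware.set(provider, mw);
  }

  // Wrap the model at registration so lookups get a uniform interface.
  register(provider: string, model: Model): void {
    const mw = this.middleware.get(provider);
    this.models.set(`${provider}:${model.id}`, mw ? mw(model) : model);
  }

  // Unified lookup across providers: "openai:gpt-4o", "xai:grok-3", ...
  get(key: string): Model {
    const model = this.models.get(key);
    if (!model) throw new Error(`Unknown model: ${key}`);
    return model;
  }
}
```

Callers never see provider differences: every `get()` returns the same `Model` shape, with provider quirks absorbed by the middleware.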

### 📡 Real-time Communication

tRPC subscriptions:

- `chat.stream`: Live message updates
- `chat.getAll`: Chat history with real-time sync
- Event-driven architecture with user isolation
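The "user isolation" point can be sketched with a per-user event namespace: each subscriber only receives events published under their own user id. This is a hypothetical illustration using a plain `EventEmitter`; the real server exposes these events through tRPC subscriptions, not a raw emitter:

```typescript
import { EventEmitter } from 'node:events';

// Hypothetical sketch of user-isolated chat events.
type ChatEvent = { chatId: string; delta: string };

class ChatEventBus {
  private emitter = new EventEmitter();

  publish(userId: string, event: ChatEvent): void {
    // Namespacing by user id keeps one user's stream invisible to others.
    this.emitter.emit(`chat:${userId}`, event);
  }

  // Returns an unsubscribe function for cleanup when the client disconnects.
  subscribe(userId: string, handler: (e: ChatEvent) => void): () => void {
    this.emitter.on(`chat:${userId}`, handler);
    return () => this.emitter.off(`chat:${userId}`, handler);
  }
}
```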

## 🚀 Advanced Features

### 🔄 Streaming & Real-time

- **Delta Updates**: Character-by-character streaming
- **Event System**: Real-time chat status and notifications
- **Auto-scroll**: Smart scrolling with user control
- **Typing Indicators**: Visual feedback during generation
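Delta updates mean each streamed event carries only the newly generated characters, and the client appends them to rebuild the full message. A minimal client-side sketch, with illustrative names rather than Aigin's actual types:

```typescript
// Hypothetical sketch of client-side delta accumulation.
type DeltaUpdate = { messageId: string; delta: string };

class MessageAccumulator {
  private messages = new Map<string, string>();

  // Append the delta and return the full text so far, ready to re-render.
  apply({ messageId, delta }: DeltaUpdate): string {
    const next = (this.messages.get(messageId) ?? '') + delta;
    this.messages.set(messageId, next);
    return next;
  }
}
```

Sending only deltas keeps each event small, which matters when a long response streams hundreds of updates.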

### 🧠 Intelligent Features

- **Smart Titles**: AI-generated conversation titles
- **Context Awareness**: MCP apps provide rich context
- **Model Selection**: Automatic choice of the optimal model
- **Error Handling**: Graceful fallbacks and retries

### 📊 Analytics & Monitoring

- **Usage Tracking**: Model usage and performance metrics
- **Resource Monitoring**: Kubernetes pod health
- **Error Reporting**: Comprehensive logging system

## 📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments


Built with ❀️ for the AI community and hate for Vercel

Website

This document was mostly generated using AI heh
