
Thor is an enterprise-grade AI model management gateway that provides a unified API for accessing and orchestrating multiple AI models. It is compatible with the OpenAI API format and includes comprehensive user management, channel management, billing, and monitoring capabilities.
- 🔧 Unified Management: Support for 20+ AI models with unified access and management
- 💰 Smart Billing: Precise token-based billing system with cache optimization
- 📊 Real-time Monitoring: Detailed usage statistics and performance monitoring
- 🔐 Access Control: Complete user management and access control system
- ⚡ High Performance: Distributed caching and load balancing support
- 🐳 Easy Deployment: One-click Docker deployment
- ✅ User Management - Complete user registration, login, and permission control system
- ✅ Channel Management - Unified access and management for multiple AI service providers
- ✅ Token Management - API key generation, distribution, and permission control
- ✅ Smart Billing - Accurate token billing with cache optimization support
- ✅ Data Analytics - Real-time usage statistics and visual reports
- ✅ Log Auditing - Complete API call logs and error tracking
- ✅ System Configuration - Flexible system parameter configuration
- ✅ Payment Integration - Alipay balance recharge functionality
- ✅ Cache Optimization - Redis distributed cache support
- ✅ Load Balancing - Intelligent channel selection and failover
- ✅ Rate Limiting - API rate limiting based on user groups
- ✅ Real-time Monitoring - System performance and call monitoring
- ✅ Multi-language Support - Chinese/English interface switching
- ✅ OpenAI GPT Series (Function Calling supported)
- ✅ Azure OpenAI (Function Calling supported)
- ✅ Kimi (Moonshot AI) (Function Calling supported)
- ✅ DeepSeek (Function Calling supported)
- ✅ Claude (Anthropic) (Cache billing optimization supported)
- ✅ Baidu Wenxin Yiyan (ErnieBot) (Function Calling supported)
- ✅ Alibaba Tongyi Qianwen (Function Calling supported)
- ✅ Tencent Hunyuan (Function Calling supported)
- ✅ Zhipu AI GLM Series (Function Calling supported)
- ✅ iFlytek Spark Model (Function Calling supported)
- ✅ Ollama (locally deployed open source models)
- ✅ SiliconFlow (open source model aggregation platform)
- ✅ Volcano Engine (ByteDance cloud services)
- ✅ Amazon Bedrock (AWS AI services)
- ✅ Google Vertex AI (GCP AI services)
- ✅ Gitee AI (Gitee AI platform)
- ✅ MiniMax AI
| Database Type | Configuration Value | Description |
|---|---|---|
| SQLite | `sqlite` | Lightweight embedded database, default option |
| PostgreSQL | `postgresql` / `pgsql` | Enterprise-grade open source database |
| SQL Server | `sqlserver` / `mssql` | Microsoft enterprise database |
| MySQL | `mysql` | Most popular open source database |
| Dameng Database | `dm` | Chinese enterprise database |
💡 Modify the `ConnectionStrings:DBType` configuration in `appsettings.json` to switch database types. Switching databases will not automatically migrate data.
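
For reference, here is a minimal `appsettings.json` sketch showing where these settings live. The nesting follows the `ConnectionStrings:DBType` path mentioned above; the PostgreSQL connection strings are illustrative placeholders, and all other keys of the real file are omitted:

```json
{
  "ConnectionStrings": {
    // Illustrative values only; match them to your own database.
    "DBType": "postgresql",
    "DefaultConnection": "Host=localhost;Port=5432;Database=thor;Username=thor;Password=thor123",
    "LoggerConnection": "Host=localhost;Port=5432;Database=thor_logger;Username=thor;Password=thor123"
  }
}
```

(.NET's JSON configuration provider accepts `//` comments, so the annotation above is valid in `appsettings.json`.)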
```mermaid
graph TB
    subgraph "User Layer"
        U[User/Client]
        U -->|API Key| T[Thor Gateway]
    end

    subgraph "Thor Core"
        T -->|Load Balancing| CM[Channel Management]
        T -->|Authentication| UM[User Management]
        T -->|Billing| BS[Billing System]
        T -->|Logging| LS[Log System]
    end

    subgraph "AI Service Providers"
        CM -->|OpenAI Format| O[OpenAI]
        CM -->|Azure Format| AZ[Azure OpenAI]
        CM -->|Claude Format| C[Anthropic]
        CM -->|Chinese APIs| CN[Baidu/Alibaba/Tencent]
        CM -->|Open Source| OS[Ollama/SiliconFlow]
    end
```
- Docker 20.10+
- At least 1GB available memory
- 500MB disk space
```bash
# Create data directory
mkdir -p ./data

# Start service
docker run -d \
  --name thor \
  -p 18080:8080 \
  -v $(pwd)/data:/data \
  -e TZ=Asia/Shanghai \
  -e DBType=sqlite \
  -e ConnectionStrings:DefaultConnection="data source=/data/token.db" \
  -e ConnectionStrings:LoggerConnection="data source=/data/logger.db" \
  -e RunMigrationsAtStartup=true \
  aidotnet/thor:latest
```
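
After the container starts, you can confirm that it is healthy and watch the startup output; with `RunMigrationsAtStartup=true`, this is also where any migration errors would surface:

```bash
# Confirm the container is running
docker ps --filter name=thor

# Follow the startup logs (Ctrl+C to stop following)
docker logs -f thor
```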
- Admin Panel: http://localhost:18080
- Default Username: `admin`
- Default Password: `admin`
- Log in to the admin panel
- Go to the "Channel Management" page
- Click "Create Channel"
- Select the AI service provider and enter its API key
- Save and test the connection (see the verification request below)
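
Once a channel is configured and an API key has been issued under token management, you can verify the gateway end to end. The sketch below assumes Thor exposes the standard OpenAI-style `/v1/models` endpoint alongside the completion endpoints shown in the API usage examples; replace `YOUR_API_KEY` with a key you created:

```bash
# List the models visible to your key
# (assumes the standard OpenAI-compatible /v1/models route)
curl http://localhost:18080/v1/models \
  -H "Authorization: Bearer YOUR_API_KEY"
```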
| Variable Name | Description | Example Value |
|---|---|---|
| `DBType` | Database type | `sqlite` / `postgresql` / `mysql` / `sqlserver` |
| `ConnectionStrings:DefaultConnection` | Main database connection string | `data source=/data/token.db` |
| `ConnectionStrings:LoggerConnection` | Log database connection string | `data source=/data/logger.db` |
| `CACHE_TYPE` | Cache type | `Memory` / `Redis` |
| `CACHE_CONNECTION_STRING` | Redis connection string | `localhost:6379` |
| `HttpClientPoolSize` | HTTP connection pool size | `100` |
| `RunMigrationsAtStartup` | Run database migrations at startup | `true` |
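
For example, to switch from the default in-memory cache to Redis, pass the two cache variables from the table above when starting the container; the Redis address shown here is an assumption for illustration:

```bash
# Quick-start command extended with Redis caching.
# "my-redis:6379" is an assumed Redis address; point it at your own instance.
docker run -d \
  --name thor \
  -p 18080:8080 \
  -v $(pwd)/data:/data \
  -e DBType=sqlite \
  -e ConnectionStrings:DefaultConnection="data source=/data/token.db" \
  -e ConnectionStrings:LoggerConnection="data source=/data/logger.db" \
  -e CACHE_TYPE=Redis \
  -e CACHE_CONNECTION_STRING=my-redis:6379 \
  -e RunMigrationsAtStartup=true \
  aidotnet/thor:latest
```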
```yaml
version: '3.8'

services:
  thor:
    image: aidotnet/thor:latest
    ports:
      - "18080:8080"
    volumes:
      - ./data:/data
    environment:
      - TZ=Asia/Shanghai
      - DBType=sqlite
      - ConnectionStrings:DefaultConnection=data source=/data/token.db
      - ConnectionStrings:LoggerConnection=data source=/data/logger.db
      - RunMigrationsAtStartup=true
```
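
Save the configuration as `docker-compose.yml` and bring the service up:

```bash
# Start Thor in the background
docker compose up -d

# Check status and follow logs
docker compose ps
docker compose logs -f thor
```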
```yaml
version: '3.8'

services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_DB: thor
      POSTGRES_USER: thor
      POSTGRES_PASSWORD: thor123
    volumes:
      - postgres_data:/var/lib/postgresql/data

  thor:
    image: aidotnet/thor:latest
    ports:
      - "18080:8080"
    depends_on:
      - postgres
    environment:
      - TZ=Asia/Shanghai
      - DBType=postgresql
      - ConnectionStrings:DefaultConnection=Host=postgres;Port=5432;Database=thor;Username=thor;Password=thor123
      - ConnectionStrings:LoggerConnection=Host=postgres;Port=5432;Database=thor_logger;Username=thor;Password=thor123
      - RunMigrationsAtStartup=true

volumes:
  postgres_data:
```
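
Note that the `postgres` container only creates the `thor` database (via `POSTGRES_DB`), while `ConnectionStrings:LoggerConnection` points at `thor_logger`. If the startup migrations do not create that database automatically, it can be created manually once the stack is running; `createdb` ships with the official postgres image:

```bash
# Create the log database inside the running postgres container
# (only needed if it is not created automatically on first start)
docker compose exec postgres createdb -U thor thor_logger
```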
- .NET 8.0 SDK
- Node.js 18+
- Git
```bash
# Clone project
git clone https://github.com/AIDotNet/Thor.git
cd Thor

# Restore dependencies
dotnet restore

# Start backend service
cd src/Thor.Service
dotnet run --urls "http://localhost:5000"

# Start frontend dev server (new terminal)
cd lobe
npm install
npm run dev
```
```bash
# Chat completions
curl -X POST http://localhost:18080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

# Text completions
curl -X POST http://localhost:18080/v1/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "text-davinci-003",
    "prompt": "Once upon a time",
    "max_tokens": 100
  }'
```
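
Because the endpoints follow the OpenAI format, existing OpenAI client libraries can usually be pointed at Thor by overriding the base URL. Here is a minimal sketch using the official Python `openai` package (v1+); the base URL and key come from the deployment above, and the model name must match one enabled on your channels:

```python
from openai import OpenAI

# Point the standard OpenAI client at the Thor gateway.
client = OpenAI(
    base_url="http://localhost:18080/v1",  # Thor endpoint from the deployment above
    api_key="YOUR_API_KEY",                # an API key issued by Thor's token management
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # must be a model enabled on one of your channels
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```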
Q: What should I do if the admin password is lost?
A: Delete the user record in the database, and the system will automatically recreate the default admin account.

Q: Which models are supported?
A: All models with an OpenAI-compatible format are supported, including GPT-3.5, GPT-4, Claude, Wenxin Yiyan, and more.

Q: How do I enable HTTPS?
A: It is recommended to configure HTTPS with an Nginx reverse proxy; Thor itself focuses on API gateway functionality.

Q: Will switching database types migrate my data?
A: No. Switching database types will not automatically migrate data; back up and migrate it manually.
- GitHub Issues: Submit Issues
- Documentation: View Detailed Documentation
- Community: Join Discussion
Issues and Pull Requests are welcome to help improve Thor!
- Fork the project
- Create a feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Create a Pull Request
This project is open source under the MIT License.