A comprehensive, self-hosted tracing and analytics platform for AI agents with real-time monitoring, cost tracking, and usage analytics.
Screenshots: Traces Dashboard | Trace Details | Costs & Usage Analytics | Integration Setup
Data Privacy & Security
- OpenAI stores all your traces, prompts, and business logic on their servers
- Compliance issues with GDPR, HIPAA, SOC 2 when data leaves your infrastructure
- Solution: Self-host everything. All data stays in your MongoDB instance.
No Data Export
- Can't export tracing data from OpenAI
- No data portability or backups
- Can't run custom analytics
- Solution: Export to CSV, integrate with BI tools, maintain complete backups.
Limited Filtering
- OpenAI offers minimal filtering capabilities
- Can't filter by custom metadata, complex date ranges, or workflow parameters
- Solution: Advanced filtering by any field, metadata, date ranges, execution time, regex search.
Open Source vs Closed
- OpenAI's platform is closed source, no customization
- Solution: 100% open source. Modify, extend, white-label for your needs.
- Advanced trace monitoring with detailed span information and workflow tracking
- Cost analytics with interactive charts and time-series analysis
- API key management with expiration and usage tracking
- Role-based access control (Admin/Read-only)
- Export data to CSV
- Advanced filtering by workflow, metadata, date ranges, and custom fields
- Dark/light theme support
- Self-hosted deployment with Docker
Works with all models supported by Vercel AI SDK:
- OpenAI (GPT-4, GPT-4 Turbo, GPT-4o, GPT-3.5 Turbo, o1, o3-mini)
- Anthropic (Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku)
- Google (Gemini 2.0 Flash, Gemini 1.5 Pro)
- And 40+ more providers
- ✅ AI SDK (@ai-sdk/*) - Full support
- ⬜ Direct OpenAI SDK - Coming soon
- ✅ TypeScript
- ⬜ Python - Coming soon
Note: Currently requires Vercel AI SDK for tracing.
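If you already use the AI SDK, per-call telemetry is enabled through its experimental_telemetry option. The sketch below only shows that SDK-side switch with illustrative identifiers; how the resulting spans are forwarded to this platform's API depends on your exporter and deployment setup.

```typescript
// Sketch only: enabling AI SDK telemetry on a call you want traced.
// `functionId` and `metadata` values are illustrative; forwarding the resulting
// spans to this platform is handled by your own exporter configuration.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function summarizeTicket(ticket: string) {
  const { text } = await generateText({
    model: openai("gpt-4o"),
    prompt: `Summarize this support ticket:\n${ticket}`,
    experimental_telemetry: {
      isEnabled: true,
      functionId: "support-summarizer",      // illustrative workflow identifier
      metadata: { environment: "staging" },  // custom metadata you can filter on later
    },
  });
  return text;
}
```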
- Frontend: React, TypeScript, Vite, TailwindCSS, shadcn/ui
- Backend: Node.js, Express, TypeScript
- Database: MongoDB (self-hosted or cloud)
- Monorepo: Turborepo with pnpm
All your tracing data stays in your infrastructure. Deploy with Docker Compose, Kubernetes, or on any VPS/cloud provider.
Before deploying, you'll need:
- MongoDB Database - You must provide your own MongoDB instance:
  - MongoDB Atlas (Free tier available)
  - Self-hosted MongoDB
  - MongoDB Cloud
  - Any MongoDB-compatible database
- Docker & Docker Compose (for containerized deployment)
# 1. Clone the repository
git clone <your-repo-url>
cd openai-tracing
# 2. Create environment file from template
cp .env.example .env
# 3. Edit .env and add your MongoDB connection string
# REQUIRED: Set your MongoDB URI
nano .env
# Example .env:
# MONGODB_URI=mongodb+srv://user:pass@cluster.mongodb.net/traces
# JWT_SECRET=your_random_secret_key_min_32_chars
# VITE_API_URL=http://localhost:3001
Important: You must set a valid MONGODB_URI in your .env file before starting the application.
# 4. Build and start services
docker-compose up -d
# 5. View logs
docker-compose logs -f
The application will be available at:
- Frontend: http://localhost:3000
- API: http://localhost:3001
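To confirm both services came up, any HTTP response from those ports is enough. A minimal check, assuming the default ports above:

```typescript
// Rough sanity check: any HTTP response (even a 404) means the service is listening.
// Assumes the default ports from docker-compose; adjust if you changed them.
async function checkServices() {
  for (const url of ["http://localhost:3000", "http://localhost:3001"]) {
    try {
      const res = await fetch(url);
      console.log(`${url} -> HTTP ${res.status}`);
    } catch (err) {
      console.error(`${url} not reachable:`, err);
    }
  }
}

checkServices();
```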
# MongoDB Atlas (recommended for production)
MONGODB_URI=mongodb+srv://<username>:<password>@cluster.mongodb.net/traces
# Local MongoDB
MONGODB_URI=mongodb://localhost:27017/traces
# MongoDB with auth
MONGODB_URI=mongodb://<username>:<password>@localhost:27017/traces
# Docker host (when API is in Docker, MongoDB on host)
MONGODB_URI=mongodb://host.docker.internal:27017/traces
- Node.js >= 18
- pnpm >= 9.1.0
- MongoDB instance (local or cloud)
# 1. Install dependencies
pnpm install
# 2. Create environment files
cp .env.example apps/api/.env
cp .env.example apps/client/.env
# 3. Update MongoDB connection string in apps/api/.env
# MONGODB_URI=mongodb://localhost:27017/traces (for local MongoDB)
# or
# MONGODB_URI=mongodb+srv://... (for MongoDB Atlas)
# 4. Start development mode (all services)
pnpm dev
# Or run services individually
cd apps/api && pnpm dev
cd apps/client && pnpm dev
Note: Make sure your MongoDB instance is accessible before starting the API.
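If you want to rule out connectivity problems first, a quick ping with the official MongoDB driver (assuming the mongodb package is installed and MONGODB_URI is set in your environment) looks like this:

```typescript
// Quick connectivity check before starting the API.
// Assumes the `mongodb` driver is installed and MONGODB_URI is exported in your shell.
import { MongoClient } from "mongodb";

async function pingMongo() {
  const uri = process.env.MONGODB_URI ?? "mongodb://localhost:27017/traces";
  const client = new MongoClient(uri, { serverSelectionTimeoutMS: 5000 });
  try {
    await client.connect();
    await client.db().command({ ping: 1 }); // round-trip to confirm the server answers
    console.log("MongoDB is reachable");
  } catch (err) {
    console.error("Could not reach MongoDB:", err);
  } finally {
    await client.close();
  }
}

pingMongo();
```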
- Visit http://localhost:3000
- You'll be redirected to /initial-setup
- Create your first admin account
- Login and start using the platform
Generate an API key from the dashboard and use it to send trace data:
const response = await fetch('http://localhost:3001/traces/event', {
method: 'POST',
headers: {
'Authorization': 'Bearer ak_<your_api_key>',
'Content-Type': 'application/json',
},
body: JSON.stringify({
data: [
// Your trace and span data
]
})
});
To generate sample data for testing:
cd apps/api
# Generate 365 days of data (default)
pnpm seed
# Custom configuration
SEED_DAYS=90 SEED_TRACES_PER_DAY=20 SEED_SPANS_PER_TRACE=8 pnpm seed
# Build all apps
pnpm build
# Run API in production
cd apps/api && pnpm start
# Serve client (use nginx or similar)
cd apps/client && pnpm preview
# Required: Your MongoDB connection string
MONGODB_URI=mongodb+srv://<user>:<pass>@cluster.mongodb.net/traces
# Required: JWT secret for authentication (min 32 characters recommended)
JWT_SECRET=your_random_secret_key_change_this
# Optional: API URL for frontend (default: http://localhost:3001)
VITE_API_URL=http://localhost:3001
# Optional: API port (default: 3001)
PORT=3001
apps/api/.env:
MONGODB_URI=mongodb://localhost:27017/traces
JWT_SECRET=your_secret_key
PORT=3001
apps/client/.env:
VITE_API_URL=http://localhost:3001
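JWT_SECRET should be a long, random value rather than a guessable string. One way to generate one with Node's built-in crypto module:

```typescript
// Print a random 64-character hex string you can paste in as JWT_SECRET.
import { randomBytes } from "node:crypto";

console.log(randomBytes(32).toString("hex"));
```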
Option 1: MongoDB Atlas (Recommended for production)
- Create free account at MongoDB Atlas
- Create a cluster
- Add database user
- Whitelist IP addresses (or allow all for testing: 0.0.0.0/0)
- Get connection string and add to .env
Option 2: Local MongoDB
# Install MongoDB locally
brew install mongodb-community # macOS
# or use Docker
docker run -d -p 27017:27017 --name mongodb mongo:8.0
# Connection string
MONGODB_URI=mongodb://localhost:27017/traces
Option 3: Other MongoDB Providers
- MongoDB Cloud
- DigitalOcean Managed MongoDB
- Self-hosted MongoDB server
# Start services
docker-compose up -d
# Stop services
docker-compose down
# View logs
docker-compose logs -f api
docker-compose logs -f client
# Rebuild after code changes
docker-compose up -d --build
# Remove all data and start fresh
docker-compose down -v
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
MIT - See LICENSE file for details



