A professional multi-node agentic system for Handit.ai documentation assistance: a specialized copilot that guides users through the complete Handit.ai setup with intelligent phase detection and tech-stack-specific instructions.
- Router Agent LLM: Classifies queries as Handit.ai-related or off-topic
- Context Questioner LLM: Intelligently asks for tech stack details when needed
- Phase Router LLM: Determines which Handit.ai phase the user needs (1, 2, or 3)
- Observability LLM: Phase 1 expert (tracing/SDK setup) with a 7-step complete guide
- Evaluation LLM: Phase 2 expert (quality evaluation setup)
- Self-Improving LLM: Phase 3 expert (optimization and A/B testing)
- Automatic Language Detection: Responds in user's detected language (Spanish/English)
- Tech Stack Detection: Identifies Python/JavaScript, LangChain/OpenAI, local/cloud
- Phase Prerequisites: Ensures Phase 1 → Phase 2 → Phase 3 progression
- Complete Setup Guides: 7-step implementation with copy-paste code examples
- Built-in Knowledge Base: Direct access to complete Handit.ai documentation
- Specialized LLM Routing: Each LLM is an expert in its specific domain
- Direct Knowledge Access: Built-in handitKnowledgeBase with all documentation
- Multi-LLM Processing: OpenAI GPT with specialized prompts for each expert
- Conversation Management: PostgreSQL-based conversation persistence
- Complete Logging: All intermediate LLM responses included in API response
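The routing flow can be pictured as a small pipeline. The sketch below is illustrative only; the function names, prompts, and return shapes are hypothetical stand-ins for the repo's actual nodes, but the order matches the architecture above (Router → Context Questioner → Phase Router → phase expert):

```javascript
// Illustrative sketch of the 6-LLM pipeline (hypothetical names, not the repo's code)
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Hypothetical helper: one LLM call with a specialized system prompt
async function classify(systemPrompt, userMessage) {
  const res = await openai.chat.completions.create({
    model: process.env.OPENAI_MODEL || "gpt-4",
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: userMessage },
    ],
  });
  return res.choices[0].message.content.trim();
}

export async function handleMessage(userMessage) {
  // 1. Router Agent: is this about Handit.ai at all?
  const topic = await classify("Answer HANDIT_AI or OFF_TOPIC.", userMessage);
  if (topic === "OFF_TOPIC") return { answer: "I can only help with Handit.ai." };

  // 2. Context Questioner: only ask when the tech stack is actually unclear
  const question = await classify(
    "If the tech stack (Python/JavaScript, framework) is unclear, ask one question; otherwise answer NONE.",
    userMessage
  );
  if (question !== "NONE") return { answer: question, requiresUserInput: true };

  // 3. Phase Router: pick the expert for the user's current phase
  const phase = await classify(
    "Answer OBSERVABILITY, EVALUATION, or SELF_IMPROVING.",
    userMessage
  );

  // 4. Expert LLM: answer with the phase-specific system prompt
  const expertPrompts = {
    OBSERVABILITY: "You are the Phase 1 (tracing/SDK setup) expert...",
    EVALUATION: "You are the Phase 2 (quality evaluation) expert...",
    SELF_IMPROVING: "You are the Phase 3 (optimization/A-B testing) expert...",
  };
  return { answer: await classify(expertPrompts[phase], userMessage), phase };
}
```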
Before starting, you need:
- Node.js 18+ (download from https://nodejs.org)
- PostgreSQL 12+ (download from https://www.postgresql.org/download/)
- OpenAI API key (create one at https://platform.openai.com)
- Pinecone account (sign up at https://www.pinecone.io)
# 1. Clone the repository
git clone https://github.com/handit-ai/docs-ai-agent.git
cd docs-ai-agent
# 2. Install dependencies
npm install

# Install PostgreSQL (macOS with Homebrew)
brew install postgresql
brew services start postgresql
# Create database and user
psql postgres

-- In PostgreSQL console:
CREATE DATABASE handit_ai;
CREATE USER handit_user WITH PASSWORD 'your_secure_password';
GRANT ALL PRIVILEGES ON DATABASE handit_ai TO handit_user;
\q

# Run PostgreSQL in Docker
docker run --name handit-postgres \
-e POSTGRES_DB=handit_ai \
-e POSTGRES_USER=handit_user \
-e POSTGRES_PASSWORD=your_secure_password \
-p 5432:5432 \
-d postgres:13
# Verify it's running
docker ps

The system now uses a built-in knowledge base (handitKnowledgeBase) with all Handit.ai documentation. Pinecone setup is optional but recommended for future extensibility.
- Go to Pinecone.io
- Click "Sign Up" and create a free account
- Verify your email address
- Log into your Pinecone dashboard
- Go to "API Keys" in the left sidebar
- Click "Create API Key"
- Name it "handit-ai-docs"
- Copy the API key (you'll need this for .env)
- In Pinecone dashboard, click "Indexes" in sidebar
- Click "Create Index"
- Fill in the details:
  - Index Name: handit-ai-docs
  - Dimensions: 1536 (for OpenAI ada-002 embeddings)
  - Metric: cosine
  - Pod Type: p1.x1 (free tier)
- Click "Create Index"
- Wait for index to be ready (shows "Ready" status)
Note: The system works without Pinecone as it uses the built-in handitKnowledgeBase.
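If you do configure Pinecone, you can confirm the index is reachable from Node. A minimal sketch, assuming the official @pinecone-database/pinecone client (v2+; the client API changed across major versions, so match the calls to the version you install):

```javascript
// Sketch: verify the Pinecone index exists and is ready
// (assumes @pinecone-database/pinecone v2+; optional, since the built-in
// knowledge base works without Pinecone)
import { Pinecone } from "@pinecone-database/pinecone";

const pc = new Pinecone({ apiKey: process.env.PINECONE_API_KEY });

const description = await pc.describeIndex("handit-ai-docs");
console.log(description.status); // should report the index as ready
```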
- Go to OpenAI Platform
- Sign up or log in
- Add payment method (required for API access)
- Go to API Keys page
- Click "Create new secret key"
- Name it "handit-ai-docs"
- Copy the key (starts with sk-)
# Copy the example environment file
cp env.example .env

Edit the .env file with your actual credentials:
# OpenAI Configuration
OPENAI_API_KEY=sk-your_actual_openai_api_key_here
OPENAI_MODEL=gpt-4
EMBEDDING_MODEL=text-embedding-ada-002
# Pinecone Configuration (Optional)
PINECONE_API_KEY=your_actual_pinecone_api_key_here
PINECONE_ENVIRONMENT=us-east-1-aws
PINECONE_INDEX_NAME=handit-ai-docs
PINECONE_NAMESPACE=HANDIT
# PostgreSQL Configuration
DB_HOST=localhost
DB_PORT=5432
DB_NAME=handit_ai
DB_USER=handit_user
DB_PASSWORD=your_secure_password
# Server Configuration
PORT=3000
NODE_ENV=development
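To confirm the variables are actually picked up, a short sanity-check script helps. A sketch assuming the dotenv and pg packages and Node 18+ in ESM mode:

```javascript
// Sketch: confirm .env is loaded and PostgreSQL accepts the credentials
import "dotenv/config"; // loads .env into process.env
import pg from "pg";

const pool = new pg.Pool({
  host: process.env.DB_HOST,
  port: Number(process.env.DB_PORT),
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
});

const { rows } = await pool.query("SELECT NOW() AS now");
console.log("Connected, server time:", rows[0].now);
await pool.end();
```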
# Create database tables
npm run db:setup

If you don't have this script, create the tables manually:
-- Connect to your database
psql -h localhost -U handit_user -d handit_ai
-- Create conversations table
CREATE TABLE IF NOT EXISTS conversations (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  session_id VARCHAR(255) NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE IF NOT EXISTS messages (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  conversation_id UUID REFERENCES conversations(id),
  role VARCHAR(50) NOT NULL,
  content TEXT NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX idx_conversations_session_id ON conversations(session_id);
CREATE INDEX idx_messages_conversation_id ON messages(conversation_id);
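These two tables back the conversation-persistence feature: one conversations row per session, one messages row per turn. A hedged sketch of how reads and writes against this schema might look (the helper names are hypothetical, not the repo's actual code):

```javascript
// Sketch: persist and replay a conversation using the schema above (pg package)
import pg from "pg";

// Uses standard PG* env vars; or pass the DB_* values from .env explicitly
const pool = new pg.Pool();

// Hypothetical helper: find or create the conversation for a session
async function getConversationId(sessionId) {
  const found = await pool.query(
    "SELECT id FROM conversations WHERE session_id = $1",
    [sessionId]
  );
  if (found.rows.length > 0) return found.rows[0].id;
  const created = await pool.query(
    "INSERT INTO conversations (session_id) VALUES ($1) RETURNING id",
    [sessionId]
  );
  return created.rows[0].id;
}

// Hypothetical helper: append one turn
async function saveMessage(conversationId, role, content) {
  await pool.query(
    "INSERT INTO messages (conversation_id, role, content) VALUES ($1, $2, $3)",
    [conversationId, role, content]
  );
}

// Hypothetical helper: replay the history in order
async function getHistory(conversationId) {
  const { rows } = await pool.query(
    "SELECT role, content FROM messages WHERE conversation_id = $1 ORDER BY created_at",
    [conversationId]
  );
  return rows;
}
```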
The system includes a comprehensive built-in knowledge base (handitKnowledgeBase) located in src/config/pinecone.js. It contains all Handit.ai documentation, including:
- Phase 1 (Observability): SDK installation, tracing setup, code examples
- Phase 2 (Evaluation): Quality evaluation, evaluators, metrics
- Phase 3 (Self-Improving): Optimization, A/B testing, Release Hub
- Setup guides: Complete step-by-step instructions for Python and JavaScript
- Examples: Copy-paste ready code implementations
// Located in src/config/pinecone.js
const handitKnowledgeBase = [
  {
    text: "Complete documentation content...",
    metadata: {
      category: "setup|evaluation|optimization",
      phase: "overview|phase_1|phase_2|phase_3",
      language: "python|javascript"
    }
  },
  // ...more entries; 14+ comprehensive documents in total
];

No additional setup is required; the knowledge base is ready to use!
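Because every document carries category, phase, and language metadata, simple retrieval is just an array filter. A sketch, assuming handitKnowledgeBase is exported from src/config/pinecone.js (the export itself is an assumption):

```javascript
// Sketch: pull the Phase 1 Python documents out of the built-in knowledge base
import { handitKnowledgeBase } from "./src/config/pinecone.js";

const phase1Python = handitKnowledgeBase.filter(
  (doc) =>
    doc.metadata.phase === "phase_1" && doc.metadata.language === "python"
);

console.log(`${phase1Python.length} matching documents`);
```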
# Start the development server
npm run dev
# Or start production server
npm start

You should see:
Server running on port 3000
Connected to PostgreSQL
Connected to Pinecone (optional)
AI Service initialized
Knowledge Base loaded (14 documents)
6-LLM Agentic System ready
curl http://localhost:3000/api/health

Expected response:
{
  "status": "healthy",
  "timestamp": "2024-07-11T02:00:00.000Z",
  "services": {
    "database": "connected",
    "pinecone": "connected",
    "openai": "connected"
  }
}

curl -X POST http://localhost:3000/api/ai/chat \
-H "Content-Type: application/json" \
-d '{
"message": "How do I setup Handit.ai tracing?",
"sessionId": "test-session-123"
}'

Expected response structure:
{
  "answer": "Te guiaré paso a paso para configurar Handit.ai exitosamente...",
  "sessionId": "test-session-123",
  "requiresUserInput": true,
  "nextAction": "wait_for_step_confirmation",
  "detectedLanguage": "spanish",
  "phase": "observability",
  "userTechStack": {
    "language": "python",
    "framework": "langchain",
    "environment": "local"
  },
  "nodeType": "observability_llm_response",
  "routingDecisions": {
    "routerAgent": "HANDIT_AI",
    "contextQuestioner": "no_questions_needed",
    "phaseRouter": "OBSERVABILITY"
  },
  "sources": [
    {
      "text": "Phase 1: AI Observability setup guide...",
      "metadata": {
        "category": "setup",
        "phase": "phase_1",
        "language": "python"
      }
    }
  ],
  "totalSources": 14
}
# Check if PostgreSQL is running
pg_isready -h localhost -p 5432
# Check connection with credentials
psql -h localhost -U handit_user -d handit_ai

For Pinecone issues:
- Verify the API key is correct (if using Pinecone)
- Check environment name matches your index
- Note: System works without Pinecone using built-in knowledge base
For OpenAI issues:
- Verify the API key is valid
- Check you have credits in your OpenAI account
- Ensure billing is set up
- Check model name in .env (default: gpt-4)
For agent response issues:
- Check the knowledge base is loaded (handitKnowledgeBase in logs)
- Verify all 6 LLMs are responding correctly
- Check language detection is working
- Ensure phase routing is functioning
For routing issues:
- Check router agent decisions in logs
- Verify phase router is selecting correct expert
- Ensure tech stack detection is working
- Check conversation history integration
Enable debug logging:
# Set debug environment
DEBUG=* npm run dev
# Or specific modules
DEBUG=ai:*,pinecone:* npm run dev

Check logs:
# View server logs
tail -f logs/server.log
# View error logs
tail -f logs/error.log

Main endpoint (POST /api/ai/chat) for AI assistance with 6-LLM specialized routing.
Request:
{
  "message": "How do I setup Handit.ai observability?",
  "sessionId": "optional-session-id"
}

Response: Specialized response from the appropriate expert LLM, with routing decisions included.
Key Features:
- Automatic routing to correct phase expert (Observability/Evaluation/Self-Improving)
- Language detection (Spanish/English) with consistent responses
- Tech stack detection (Python/JavaScript, LangChain/OpenAI)
- Complete 7-step guides with copy-paste code examples
- Prerequisite handling (Phase 1 → Phase 2 → Phase 3)
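Node 18+ ships a global fetch, so calling the endpoint from JavaScript needs no extra dependencies. A minimal client sketch:

```javascript
// Sketch: call the chat endpoint from Node 18+ (global fetch, no dependencies)
const res = await fetch("http://localhost:3000/api/ai/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    message: "How do I setup Handit.ai observability?",
    sessionId: "my-session-1",
  }),
});

const data = await res.json();
console.log(data.phase, data.routingDecisions); // e.g. "observability", { routerAgent: "HANDIT_AI", ... }
console.log(data.answer);
```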
GET /api/health: Health check endpoint with service status.
Get conversation history.
Clear conversation history.
- 6 Specialized LLMs: Router → Context Questioner → Phase Router → Expert LLMs
- Built-in Knowledge Base: 14+ comprehensive Handit.ai documents
- Smart Questioning: Only asks for tech stack when actually needed
- Complete Implementation: Full code examples with start_tracing, track_node, end_tracing
# Production environment
NODE_ENV=production
PORT=3000
# Database (use connection pooling)
DB_HOST=your-db-host
DB_PORT=5432
DB_NAME=handit_ai_prod
DB_USER=handit_user
DB_PASSWORD=secure_password
DB_SSL=true
# Pinecone (production index)
PINECONE_INDEX_NAME=handit-ai-docs-prod
# Rate limiting
RATE_LIMIT_MAX=100
RATE_LIMIT_WINDOW=60000
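The two rate-limit variables above could be wired into middleware as follows (a sketch assuming the server is an Express app using the express-rate-limit package; the actual wiring in the repo may differ):

```javascript
// Sketch: apply RATE_LIMIT_* env vars via express-rate-limit (assumes an Express app)
import express from "express";
import rateLimit from "express-rate-limit";

const app = express();

app.use(
  rateLimit({
    windowMs: Number(process.env.RATE_LIMIT_WINDOW) || 60000, // window in ms
    max: Number(process.env.RATE_LIMIT_MAX) || 100, // requests per window per IP
  })
);
```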
# Build Docker image
docker build -t handit-ai-docs .
# Run container
docker run -p 3000:3000 \
--env-file .env.production \
handit-ai-docs

# Install PM2 for production
npm install -g pm2
# Start with PM2
pm2 start src/server.js --name handit-ai-docs
# Monitor
pm2 monit
# View logs
pm2 logs handit-ai-docs

Security checklist:
- Always use HTTPS in production
- Implement proper API key rotation
- Set up database connection pooling
- Use environment-specific Pinecone indexes
- Enable request logging and monitoring
- Implement proper error handling (don't expose sensitive info)
Scaling considerations:
- Use Redis for caching (see the sketch after this list)
- Implement database read replicas
- Use CDN for static assets
- Monitor Pinecone query costs
- Implement proper logging and metrics
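For the Redis caching item above, one option is to key a cache on the incoming message and skip the 6-LLM pipeline on repeat queries. A sketch assuming the node-redis (redis) package v4+:

```javascript
// Sketch: cache chat answers in Redis to absorb repeat queries (node-redis v4+)
import { createClient } from "redis";

const redis = createClient({ url: process.env.REDIS_URL });
await redis.connect();

async function cachedAnswer(message, computeAnswer) {
  const key = `chat:${message}`;
  const hit = await redis.get(key);
  if (hit) return JSON.parse(hit);

  const answer = await computeAnswer(message); // falls through to the 6-LLM pipeline
  await redis.set(key, JSON.stringify(answer), { EX: 3600 }); // cache for 1 hour
  return answer;
}
```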
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
MIT License - see LICENSE file for details.