Simba

High-Efficiency Customer Service Assistant


What is Simba?

Simba is an open-source customer service assistant built for teams who need full control over their AI. Unlike black-box solutions, Simba is designed from the ground up around evaluation and customization, so you can measure performance, iterate fast, and tailor the assistant to your exact needs.

Why Simba?

| Problem | Simba's Solution |
|---|---|
| Can't measure AI quality | Built-in evaluation framework with retrieval and generation metrics |
| Generic responses | Fully customizable RAG pipeline with your own data |
| Hard to integrate | Drop-in npm package for instant website integration |
| Vendor lock-in | Open-source, self-hosted, swap any component |

Key Features

  • Evaluation-First Design - Track retrieval accuracy, generation quality, and latency out of the box. Know exactly how your assistant performs.
  • Fully Customizable - Swap embedding models, LLMs, vector stores, chunking strategies, and rerankers. Your pipeline, your rules.
  • npm Package for Easy Integration - Add a customer service chat widget to your website with a single npm install.
  • Modern Dashboard - Manage documents, monitor conversations, and analyze performance from a clean UI.
  • Production-Ready - Streaming responses, async processing, and scalable architecture.

Quick Start

Docker (Recommended)

The fastest way to get Simba running:

git clone https://github.com/GitHamza0206/simba.git
cd simba

Create a .env file:

OPENAI_API_KEY=your_openai_api_key

Run with Docker:

# CPU
DEVICE=cpu make build && make up

# NVIDIA GPU
DEVICE=cuda make build && make up

Visit http://localhost:3000 to access the dashboard.

Manual Installation

If you prefer installing without Docker:

pip install simba-core
simba server
simba front

Development Setup with Claude Code

If you're using Claude Code, you can set up the project with a single command:

/setup --all

This will automatically install all dependencies (Python, frontend, npm package) and start the infrastructure services. Other options:

/setup --backend    # Python dependencies only
/setup --frontend   # Next.js + simba-chat only
/setup --services   # Start Docker infrastructure only

Website Integration

Add Simba to your website with the npm package:

npm install simba-chat-widget

Then render the widget in your app:

import { SimbaChat } from 'simba-chat-widget';

function App() {
  return (
    <SimbaChat
      apiUrl="https://your-simba-instance.com"
      theme="light"
    />
  );
}

That's it. Your customers now have an AI assistant powered by your knowledge base.

Evaluation & Metrics

Simba tracks what matters:

  • Retrieval Metrics - Precision, recall, relevance scores
  • Generation Metrics - Faithfulness, answer relevancy, latency
  • Conversation Analytics - User satisfaction, resolution rates

Use these metrics to continuously improve your assistant's performance.
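To make the retrieval metrics above concrete, here is a minimal, self-contained sketch of precision@k and recall@k. This is an illustration of the metrics themselves, not Simba's actual evaluation API — function names and the toy data are hypothetical.

```python
def precision_at_k(retrieved, relevant, k):
    """Fraction of the top-k retrieved documents that are relevant."""
    top_k = retrieved[:k]
    hits = sum(1 for doc in top_k if doc in relevant)
    return hits / k

def recall_at_k(retrieved, relevant, k):
    """Fraction of all relevant documents that appear in the top-k results."""
    top_k = retrieved[:k]
    hits = sum(1 for doc in top_k if doc in relevant)
    return hits / len(relevant)

# Toy example: the retriever returned four documents, three are truly relevant.
retrieved = ["doc3", "doc1", "doc7", "doc2"]
relevant = {"doc1", "doc2", "doc5"}

print(round(precision_at_k(retrieved, relevant, 3), 3))  # 1 hit in top 3
print(round(recall_at_k(retrieved, relevant, 3), 3))     # 1 of 3 relevant found
```

Tracking these numbers across pipeline changes (different chunking, embeddings, or rerankers) is what makes iteration measurable rather than guesswork.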

Architecture

┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│   Your Website  │────▶│   Simba API     │────▶│  Vector Store   │
│  (npm package)  │     │   (FastAPI)     │     │  (Qdrant/FAISS) │
└─────────────────┘     └─────────────────┘     └─────────────────┘
                               │                        ▲
                               │                        │
                               ▼                        │
                        ┌─────────────────┐     ┌───────┴─────────┐
                        │      LLM        │     │     Celery      │
                        │ (OpenAI/Local)  │     │   (Ingestion)   │
                        └─────────────────┘     └─────────────────┘
                                                        │
                                                        ▼
                                                ┌─────────────────┐
                                                │      Redis      │
                                                │  (Task Queue)   │
                                                └─────────────────┘
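The ingestion path in the diagram (API enqueues a task, a Celery worker chunks and indexes it via Redis) can be sketched with an in-process queue and a worker thread. This is a stand-in to show the flow only — the queue here replaces Redis, the list replaces the vector store, and all names are hypothetical, not Simba's code.

```python
import queue
import threading

task_queue = queue.Queue()  # stands in for the Redis-backed task queue
index = []                  # stands in for the vector store

def worker():
    """Background worker: chunk each enqueued document and index the chunks."""
    while True:
        doc = task_queue.get()
        if doc is None:  # sentinel: shut down
            break
        chunks = [doc[i:i + 20] for i in range(0, len(doc), 20)]
        index.extend(chunks)
        task_queue.task_done()

t = threading.Thread(target=worker)
t.start()

# The API side: enqueue a document and return immediately.
task_queue.put("A long support article about refunds and shipping policies.")
task_queue.join()   # wait for ingestion to finish (for the demo only)

task_queue.put(None)
t.join()
print(len(index), "chunks indexed")
```

The real system does the same handoff across processes, which is what keeps document ingestion from blocking API responses.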

Docker Deployment

The same commands from the Quick Start apply for deployment:
# CPU
DEVICE=cpu make build && make up

# NVIDIA GPU
DEVICE=cuda make build && make up

Customization Options

| Component | Options |
|---|---|
| Vector Store | Qdrant, FAISS, Chroma |
| Embeddings | OpenAI, HuggingFace, Cohere |
| LLM | OpenAI, Anthropic, Local models |
| Reranker | Cohere, ColBERT, Cross-encoder |
| Parser | Docling, Unstructured, PyMuPDF |
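The swap-any-component idea behind the table above is easy to express with structural interfaces: the pipeline depends on an embedder and a store contract, and any implementation satisfying it can be plugged in. This is a generic sketch, not Simba's actual classes — every name below is illustrative, and the "embedder" is a toy that just counts characters so the demo runs without external services.

```python
from typing import Protocol

class Embedder(Protocol):
    def embed(self, text: str) -> list[float]: ...

class VectorStore(Protocol):
    def add(self, vec: list[float], text: str) -> None: ...
    def search(self, vec: list[float]) -> str: ...

class ToyEmbedder:
    """Toy embedder (length + word-gap counts) so the example is self-contained."""
    def embed(self, text: str) -> list[float]:
        return [float(len(text)), float(text.count(" "))]

class ListStore:
    """In-memory store with brute-force nearest-neighbour search."""
    def __init__(self):
        self.items = []
    def add(self, vec, text):
        self.items.append((vec, text))
    def search(self, vec):
        dist = lambda it: sum((a - b) ** 2 for a, b in zip(it[0], vec))
        return min(self.items, key=dist)[1]

class Pipeline:
    def __init__(self, embedder: Embedder, store: VectorStore):
        self.embedder, self.store = embedder, store
    def ingest(self, text: str) -> None:
        self.store.add(self.embedder.embed(text), text)
    def retrieve(self, query: str) -> str:
        return self.store.search(self.embedder.embed(query))

# Swap either constructor argument for another implementation of the contract.
pipe = Pipeline(ToyEmbedder(), ListStore())
pipe.ingest("Refund policy: 30 days")
pipe.ingest("Shipping takes 3-5 business days worldwide")
print(pipe.retrieve("Refund window?"))
```

Replacing `ToyEmbedder` with an OpenAI or HuggingFace embedder, or `ListStore` with Qdrant or FAISS, changes nothing in `Pipeline` — which is the property that lets you benchmark components against each other with the evaluation metrics above.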

Roadmap

  • Core evaluation framework
  • npm chat widget
  • Streaming responses
  • Multi-tenant support
  • Advanced analytics dashboard
  • Webhook integrations
  • Fine-tuning pipeline

Contributing

We welcome contributions! Fork the repo, create a branch, and submit a PR.

Support

Open an issue on GitHub if you run into problems or have questions.