Video understanding search API for AI agents.
Search what is shown in videos, not just what is said.
Docs · Quickstart · API Reference · Pricing · GitHub
Web pages are easy for AI agents to search. Video is not.
Most video search today is limited to transcripts — what was said. Cerul goes further by indexing what is shown: slides, charts, product demos, code walkthroughs, whiteboards, and other visual evidence.
> **Note**
> Cerul is in active development. The API is live at cerul.ai — get a free API key to start.
Get video search results in seconds:

```bash
curl "https://api.cerul.ai/v1/search" \
  -H "Authorization: Bearer YOUR_CERUL_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "Sam Altman views on AI video generation tools",
    "search_type": "knowledge",
    "max_results": 3,
    "include_answer": true
  }'
```

Example response:
```json
{
  "results": [
    {
      "id": "yt_hmtuvNfytjM_1223",
      "score": 0.96,
      "title": "Sam Altman on the Future of AI Creative Tools",
      "description": "OpenAI CEO discusses the implications of AI video generation",
      "video_url": "https://www.youtube.com/watch?v=hmtuvNfytjM&t=1223s",
      "speaker": "Sam Altman",
      "timestamp_start": 1223,
      "timestamp_end": 1345,
      "answer": "Altman believes AI video generation will democratize filmmaking..."
    }
  ]
}
```

| Feature | Summary | Description |
|---|---|---|
| Visual Retrieval | Beyond transcripts | Index slides, charts, demos, and on-screen content — not just speech |
| Two Search Tracks | broll + knowledge | Semantic footage search and deep knowledge retrieval on shared infra |
| Agent-Ready | Built for LLMs | Designed for tool-use and function calling — clean JSON in, clean JSON out |
| Timestamp Precision | Frame-accurate results | Every result comes with exact start/end timestamps and confidence scores |
| Installable Skills | Codex & Claude | Drop-in agent skills with direct HTTP access — no MCP needed |
| Open Core | Apache 2.0 | Application code, pipelines, and agent integrations are open source |
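Because the API takes and returns plain JSON, the search endpoint maps naturally onto an agent tool. Here is a minimal Python sketch, assuming only the request and response shapes shown in the examples above (the endpoint URL and field names are copied from those examples, not from a full API spec):

```python
import json
import urllib.request

API_URL = "https://api.cerul.ai/v1/search"  # endpoint from the curl example above

def build_search_request(api_key, query, search_type="knowledge",
                         max_results=3, include_answer=True):
    """Build URL, headers, and JSON body for a Cerul search call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "query": query,
        "search_type": search_type,   # "knowledge" or "broll"
        "max_results": max_results,
        "include_answer": include_answer,
    }
    return API_URL, headers, json.dumps(payload).encode("utf-8")

def search(api_key, query, **kwargs):
    """POST the request and return the parsed JSON response."""
    url, headers, body = build_search_request(api_key, query, **kwargs)
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def cite(result):
    """Render one result as a citation line with its time range
    (field names follow the example response above)."""
    mmss = lambda s: f"{s // 60}:{s % 60:02d}"
    span = f"{mmss(result['timestamp_start'])}-{mmss(result['timestamp_end'])}"
    return f"{result['title']} [{span}] {result['video_url']}"
```

Exposed to a function-calling LLM, `search` is the tool and `cite` turns each hit into a timestamped link the agent can quote as evidence.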
```text
frontend/   Next.js app — landing page, docs, dashboard
backend/    FastAPI service — API layer and core logic
workers/    Indexing pipelines — video ingestion and processing
docs/       Architecture, API specs, and runbooks
db/         Migrations and seed data
skills/     Agent skills for Codex / Claude-style clients
config/     YAML config defaults and templates
scripts/    Bootstrap and utility scripts
```
```bash
# Quick start — install deps, run database migrations, and start both servers
./rebuild.sh

# Apply SQL migrations manually (useful for remote or one-off database updates)
./scripts/migrate-db.sh

# Or run frontend and backend separately
pnpm --dir frontend dev
backend/.venv/bin/python -m uvicorn app.main:app --app-dir backend --reload
```

Full command reference
Frontend

```bash
pnpm --dir frontend install
pnpm --dir frontend dev
pnpm --dir frontend lint
pnpm --dir frontend test
pnpm --dir frontend build
```

Backend
```bash
python3 -m venv backend/.venv
backend/.venv/bin/python -m pip install -r backend/requirements.txt

# Optional: run migrations explicitly when you only need a schema update
./scripts/migrate-db.sh

backend/.venv/bin/python -m uvicorn app.main:app --app-dir backend --reload --host 127.0.0.1 --port 8000
backend/.venv/bin/pytest backend/tests
```

Workers
```bash
python3 -m venv workers/.venv
workers/.venv/bin/python -m pip install -r workers/requirements.txt
workers/.venv/bin/pytest workers/tests
workers/.venv/bin/python workers/worker.py --db-url "$DATABASE_URL"
workers/.venv/bin/python workers/scheduler.py --once --database-url "$DATABASE_URL"
```

Deploy the frontend on Vercel:
- Import the repository and set Root Directory to `frontend`
- Keep the included `frontend/vercel.json`
- Optionally set `NEXT_PUBLIC_SITE_URL` for custom domain metadata
For backend or worker deployments, run `./scripts/migrate-db.sh` once against the target database as a release/predeploy step, before rolling out code that depends on the new schema.
- Shared platform backbone: auth, API keys, usage tracking, rate limiting, dashboard, and docs
- End-to-end `broll` indexing and search flow on the shared retrieval stack
- `knowledge` retrieval, answer generation, and step-pipeline ingestion
- Agent-facing integrations via installable skills and direct HTTP access
- Higher-scale production validation for ingestion coverage and retrieval quality
- Stripe billing validation in test mode
- Python & TypeScript SDKs, only if direct API + skill access proves insufficient
Licensed under Apache 2.0.
