Persistent memory and context for LLM conversations.
Chat context dies between sessions. OpenChronicle fixes that — it's an orchestration core that gives any LLM durable memory, explainable routing, and auditable decision history across conversations, sessions, and tools.
- Persistent memory — full-text search (FTS5) with deterministic fallback, pinning, tagging; conversations resume where you left off
- Multi-provider routing — OpenAI, Anthropic, Groq, Gemini, Ollama with pool-based selection and automatic fallback
- Mixture-of-Experts — consensus answers from multiple models via `--moe` flag
- Streaming responses — token-by-token output with `--no-stream` opt-out
- Hash-chained event log — tamper-evident audit trail for every decision
- MCP server — 20 tools exposing memory, conversation, and context to any MCP-compatible client (Claude Code, Goose, VS Code)
- HTTP API — 20 REST endpoints mirroring MCP tools, with API key auth and rate limiting; auto-starts with `oc serve`
- Discord bot — slash commands, session mapping, multi-user isolation
- Scheduler — tick-driven job execution with atomic claim and drift prevention
- Asset management — file storage with SHA-256 dedup and generic linking
- Plugin system — extend with stateless task handlers
- Privacy gate — PII detection (6 categories, Luhn validation) before data leaves your machine
- Hexagonal architecture — enforced by tests, not convention
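The hash-chained event log works like any tamper-evident chain: each entry's hash covers both its payload and the previous entry's hash, so editing one event invalidates every link after it. A minimal sketch of the idea (illustrative only — field names and schema are assumptions, not OpenChronicle's actual event format):

```python
import hashlib
import json

GENESIS = "0" * 64

def append_event(log: list[dict], payload: dict) -> None:
    """Append an event whose hash covers the previous hash and the payload."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    log.append({"prev": prev_hash, "payload": payload,
                "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify_chain(log: list[dict]) -> bool:
    """Recompute every hash; an edited event breaks all later links."""
    prev_hash = GENESIS
    for event in log:
        body = json.dumps({"prev": prev_hash, "payload": event["payload"]},
                          sort_keys=True)
        if (event["prev"] != prev_hash
                or hashlib.sha256(body.encode()).hexdigest() != event["hash"]):
            return False
        prev_hash = event["hash"]
    return True

log: list[dict] = []
append_event(log, {"type": "route", "provider": "openai"})
append_event(log, {"type": "reply", "tokens": 42})
assert verify_chain(log)
log[0]["payload"]["provider"] = "groq"   # tamper with history
assert not verify_chain(log)             # verification now fails
```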
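The privacy gate's Luhn validation refers to the standard checksum used to distinguish real card numbers from arbitrary digit runs. A generic sketch of that check (the gate's actual detector internals are not shown here):

```python
def luhn_valid(number: str) -> bool:
    """Luhn checksum: double every second digit from the right, sum all digits."""
    digits = [int(c) for c in number if c.isdigit()]
    if len(digits) < 13:          # payment card numbers are 13-19 digits
        return False
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:            # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

assert luhn_valid("4539 1488 0343 6467")      # a valid Luhn test number
assert not luhn_valid("4539 1488 0343 6468")  # checksum off by one
```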
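SHA-256 dedup in the asset store means storage is content-addressed: identical bytes hash to the same key and are stored once. A toy sketch of the principle (class and method names are hypothetical, not OpenChronicle's API):

```python
import hashlib

class AssetStore:
    """Content-addressed store: identical bytes map to a single blob."""

    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, data: bytes) -> str:
        """Store data under its SHA-256 digest; duplicates are no-ops."""
        digest = hashlib.sha256(data).hexdigest()
        self._blobs.setdefault(digest, data)
        return digest

    def get(self, digest: str) -> bytes:
        return self._blobs[digest]

store = AssetStore()
first = store.put(b"report contents")
second = store.put(b"report contents")   # duplicate upload
assert first == second                   # same bytes, same key, stored once
```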
```bash
pip install -e ".[openai]"
oc init
export OPENAI_API_KEY=your_key_here
oc chat
```

That's it. You're in a persistent conversation with memory, streaming, and full audit trail.
```bash
docker pull ghcr.io/openchronicle/core:latest
docker compose run --rm openchronicle chat
```

Persistent volumes: `/data` (SQLite DB), `/config`, `/plugins`, `/output`.
| Interface | Entry point | Use case |
|---|---|---|
| CLI | `oc chat`, `oc convo ask` | Interactive and scripted use |
| STDIO RPC | `oc serve` / `oc rpc` | Programmatic integration |
| HTTP API | Auto-starts with `oc serve` | REST clients, webhooks, web UIs |
| MCP Server | `oc mcp serve` | Agent interop (Goose, Claude Code) |
| Discord | `oc discord start` | Chat bot with slash commands |
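The STDIO RPC interface speaks JSON-RPC over stdin/stdout (see the RPC Protocol document for the actual spec). A hedged sketch of framing a JSON-RPC 2.0 request as newline-delimited JSON — the method name `convo.ask` is a placeholder, not necessarily a real OpenChronicle method:

```python
import itertools
import json

_ids = itertools.count(1)

def make_request(method: str, params: dict) -> str:
    """Frame a JSON-RPC 2.0 request as one newline-terminated line."""
    return json.dumps({"jsonrpc": "2.0", "id": next(_ids),
                       "method": method, "params": params}) + "\n"

# 'convo.ask' is hypothetical; consult the RPC Protocol spec for real methods.
line = make_request("convo.ask", {"prompt": "hello"})
msg = json.loads(line)
assert msg["jsonrpc"] == "2.0" and msg["method"] == "convo.ask"
```

In practice a client would write such lines to the server process's stdin and read responses (matching on `id`) from its stdout.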
| Provider | Extra | Streaming | Tool Use |
|---|---|---|---|
| OpenAI | `.[openai]` | Yes | Yes |
| Anthropic | `.[anthropic]` | Yes | Yes |
| Groq | `.[groq]` | Yes | Yes |
| Gemini | `.[gemini]` | Yes | Yes |
| Ollama | `.[ollama]` | Yes | Yes |
| Stub | (built-in) | Yes | Yes |
| Document | Description |
|---|---|
| Architecture | Hexagonal layers, event model, directory tree |
| CLI Commands | Full `oc` command reference |
| Environment Variables | All ~60 configuration knobs |
| MCP Server Spec | Tool list, transports, integration guide |
| Plugin Guide | Build and register task handlers |
| Design Decisions | Rationale for core subsystems |
| RPC Protocol | JSON-RPC stdio protocol spec |
| Backlog | Roadmap and feature backlog |
See CONTRIBUTING.md.
See SECURITY.md.
AGPL-3.0 — free to use, modify, and share. Network service use requires publishing source under the same license.