Your AI on-call assistant for Docker operations
Open-source, self-hosted AI agent that helps you manage Docker infrastructure through natural language commands via Telegram.
Never miss an incident. Your on-call buddy is always watching. 🚨
Why trust an AI with your infrastructure?
OnCallMate is built security-first for on-premise deployment:
- ✅ 100% Self-Hosted - Runs entirely in your network, no cloud dependency
- ✅ Docker Socket Proxy - Never exposes `/var/run/docker.sock` directly (read-only by default)
- ✅ Zero Telemetry - Your container data never leaves your infrastructure
- ✅ Admin Allowlist - Telegram ID-based access control (no public access)
- ✅ Full Audit Trail - Every AI decision logged to SQLite with timestamp/user/action
- ✅ AI Provider Choice - You control what data is sent to which AI (OpenAI/Claude/local)
- ✅ Open Source - MIT license, audit the code yourself
Default security posture:
- Docker operations: Read-only (list, inspect, logs, stats)
- Write operations: Disabled (require explicit opt-in)
- Network: Isolated (containers on private bridge network)
- 🐳 Docker Operations - List, inspect, logs, stats, system info
- 🤖 AI-Powered - Natural language understanding (Claude, GPT, Gemini)
- 💬 Telegram Interface - Control from anywhere, anytime
- 🔒 Security First - Docker socket proxy, audit logs, admin-only access
- 📋 Audit Trail - All operations logged to SQLite
- 🔌 Extensible - Plugin architecture for providers & channels
- ⏱️ Proactive Scheduler - Periodic container checks + anomaly alerts
- 🧠 Self-Learning Baseline - Learns normal running-container profile over time
- 🧩 Configurable Prompts - Prompts are stored in `prompts/*.md` (not hard-coded)
- 🆓 Free AI Options - Multiple AI providers including free tiers (OpenRouter!)
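The self-learning baseline above can be sketched in a few lines. This is an illustrative TypeScript sketch, not the actual source: the type and function names here are hypothetical. The idea is that the scheduler records which containers are running on each scan, and alerts only when a container that has been stable across many scans goes missing.

```typescript
// Hypothetical shape of a baseline entry (actual schema lives in the agent's DB).
interface BaselineEntry {
  name: string;      // container name
  seenCount: number; // how many scans observed it running
}

// A container is "expected" once it has been seen in enough scans (minSeen).
// Returns the names of expected containers that are no longer running.
function findMissing(
  baseline: BaselineEntry[],
  running: string[],
  minSeen = 3,
): string[] {
  const runningSet = new Set(running);
  return baseline
    .filter((e) => e.seenCount >= minSeen && !runningSet.has(e.name))
    .map((e) => e.name);
}

const baseline: BaselineEntry[] = [
  { name: "nginx", seenCount: 10 },
  { name: "redis", seenCount: 10 },
  { name: "one-off-job", seenCount: 1 }, // never stabilized, so never alerted on
];

console.log(findMissing(baseline, ["nginx"])); // redis is expected but gone
```

The `minSeen` threshold is what keeps one-off jobs from triggering false alarms while the baseline is still being learned.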
- Docker & Docker Compose
- Telegram Bot Token (create one)
- AI Provider API Key (choose one):
- OpenRouter (get free key) - FREE models available! ⭐
- Anthropic Claude (get key)
- OpenAI (get key)
- Clone the repo
git clone https://github.com/ismailperim/oncallmate.git
cd oncallmate
- Configure environment
cp .env.quickstart .env
nano .env
Fill in your credentials:
TELEGRAM_BOT_TOKEN=your_bot_token_here
TELEGRAM_ADMIN_IDS=123456789
# Use OpenRouter for FREE tier
AI_PROVIDER=openrouter
OPENROUTER_API_KEY=sk-or-your-key-here
OPENROUTER_MODEL=google/gemini-flash-1.5
# Proactive mode
MAIN_CONTACT_ID=123456789
LEARN_MODE_ENABLED=true
LEARN_INTERVAL_MINUTES=15
💡 Tip: OpenRouter offers FREE models! See PROVIDERS.md for all options.
- Start the agent
docker-compose up -d
- Test it - Open Telegram and message your bot:
/start
/ps
🎉 Done! You're now managing Docker via AI chat.
/start - Welcome message
/help - Show all commands
/ps - List containers
/logs <container> - Show container logs
/stats <container> - Container statistics
/inspect <container> - Detailed container info
/images - List Docker images
/info - Docker system info
/health - Agent health check
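Under the hood, slash commands like these are dispatched to handlers. A minimal sketch of that routing in TypeScript (illustrative only: the real bot wires handlers through node-telegram-bot-api, and these handler bodies are placeholders):

```typescript
// A command handler takes the arguments after the command and returns a reply.
type Handler = (args: string[]) => string;

// Hypothetical handler table; real handlers would call the Docker provider.
const handlers: Record<string, Handler> = {
  ps: () => "listing containers…",
  logs: (args) => `logs for ${args[0] ?? "(missing container)"}`,
};

// Parse "/logs nginx" into a command name and arguments, then dispatch.
function route(message: string): string {
  const [cmd, ...args] = message.trim().slice(1).split(/\s+/); // strip leading "/"
  const handler = handlers[cmd];
  return handler ? handler(args) : `Unknown command: /${cmd}`;
}

console.log(route("/logs nginx")); // logs for nginx
```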
Just ask naturally:
- "Show me running containers"
- "What's using the most CPU?"
- "Get the last 50 lines of logs from nginx"
- "List all images"
- "What's the system memory usage?"
OnCallMate
├── Core
│   ├── Agent Engine (AI intent parsing)
│   ├── Database (audit + memory)
│   └── Logger
├── Providers (pluggable infrastructure adapters)
│   ├── Docker ✅
│   ├── Docker Swarm 🚧
│   ├── Kubernetes 🚧
│   └── AWS ECS 🚧
├── Channels (communication interfaces)
│   ├── Telegram ✅
│   ├── Slack 🚧
│   ├── Discord 🚧
│   └── Webhooks 🚧
└── AI Providers (multiple backends)
    ├── OpenRouter ✅ (FREE!)
    ├── Anthropic Claude ✅
    └── OpenAI GPT ✅
See PROVIDERS.md for AI provider comparison.
⚠️ Never expose `/var/run/docker.sock` directly in production!
OnCallMate includes docker-socket-proxy by default:
- ✅ Read-only by default
- ✅ Allowlist-based API filtering
- ✅ Isolated network
To enable write operations (start/stop), edit docker-compose.yml:
socket-proxy:
  environment:
    - POST=1   # Enable container start/stop
    - DELETE=0 # Keep delete disabled
Only Telegram users in TELEGRAM_ADMIN_IDS can control the agent.
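The allowlist check itself is simple. A sketch in TypeScript (the helper name is hypothetical; the env variable name matches the quickstart above):

```typescript
// Check whether a Telegram user ID appears in the comma-separated
// TELEGRAM_ADMIN_IDS value. Non-numeric entries are ignored.
function isAdmin(userId: number, adminIdsEnv: string): boolean {
  const allowed = adminIdsEnv
    .split(",")
    .map((s) => Number(s.trim()))
    .filter((n) => Number.isFinite(n));
  return allowed.includes(userId);
}

console.log(isAdmin(123456789, "123456789, 987654321")); // true
```

Messages from any ID outside the allowlist are dropped before they ever reach the AI.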
All operations are logged to /data/oncallmate.db:
- Who requested what
- When and from where
- Result and approval status
Query audit logs:
docker exec -it oncallmate sqlite3 /data/oncallmate.db \
"SELECT * FROM audit_logs ORDER BY timestamp DESC LIMIT 10;"
npm install
npm run dev
npm run build
npm start
- Create `src/providers/your-provider.ts`
- Implement the `Provider` interface
- Register in `src/index.ts`
Example: Docker Provider
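As a rough sketch of what implementing and registering a provider looks like (the actual `Provider` interface lives in the source; the method names below are illustrative assumptions):

```typescript
// Hypothetical provider contract: method names are illustrative, not the real API.
interface Provider {
  name: string;
  listContainers(): Promise<string[]>;
  getLogs(container: string, lines: number): Promise<string>;
}

// A toy in-memory provider showing the implementation shape.
class FakeProvider implements Provider {
  name = "fake";
  async listContainers(): Promise<string[]> {
    return ["nginx", "redis"];
  }
  async getLogs(container: string, lines: number): Promise<string> {
    return `last ${lines} lines of ${container}`;
  }
}

// Registration, analogous to what src/index.ts would do.
const providers = new Map<string, Provider>();
providers.set("fake", new FakeProvider());
```

The real Docker provider would back these methods with Dockerode calls instead of canned data.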
- Create `src/channels/your-channel.ts`
- Implement the `Channel` interface
- Register in `src/index.ts`
Example: Telegram Channel
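Channels follow the same pattern. Here is an illustrative sketch (again, the `Channel` method names are assumptions; check the source for the real interface) using an in-memory channel, which is also handy for local testing:

```typescript
// Hypothetical channel contract: send outbound messages, subscribe to inbound.
interface Channel {
  name: string;
  send(recipient: string, text: string): void;
  onMessage(handler: (from: string, text: string) => void): void;
}

// Minimal in-memory channel; the Telegram channel would wrap
// node-telegram-bot-api instead.
class MemoryChannel implements Channel {
  name = "memory";
  sent: string[] = [];
  private handler?: (from: string, text: string) => void;

  send(recipient: string, text: string): void {
    this.sent.push(`${recipient}: ${text}`);
  }
  onMessage(handler: (from: string, text: string) => void): void {
    this.handler = handler;
  }
  // Test hook (not part of the interface): simulate an incoming message.
  receive(from: string, text: string): void {
    this.handler?.(from, text);
  }
}

const channel = new MemoryChannel();
channel.onMessage((from, text) => channel.send(from, `echo: ${text}`));
channel.receive("123456789", "/ps");
console.log(channel.sent);
```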
- v0.2 - Docker Swarm support + approval workflow
- v0.3 - Proactive learning mode (anomaly detection, scheduled checks)
- v0.4 - Kubernetes provider
- v0.5 - Slack & Discord channels
- v0.6 - RBAC + multi-host support
- v1.0 - Production-ready release
Roadmap details are maintained internally during private incubation.
OnCallMate is 100% free and open-source (MIT License).
AI Provider Costs (per 1000 queries):
| Provider | Model | Cost | Quality |
|---|---|---|---|
| OpenRouter | gemini-flash-1.5 | $0 (FREE/low-cost depending on route) | ⭐⭐⭐ |
| OpenAI | gpt-4o-mini | ~$0.20 | ⭐⭐⭐⭐ |
| Anthropic | claude-sonnet-4 | ~$4.50 | ⭐⭐⭐⭐⭐ |
See PROVIDERS.md for detailed pricing.
- PROVIDERS.md - AI provider comparison & setup
- CHANGELOG.md - Release history
- CONTRIBUTING.md - Contribution guidelines
- SECURITY.md - Security policy
Contributions are welcome! Please read CONTRIBUTING.md before opening a PR.
MIT License - see LICENSE
Built with:
- Anthropic Claude / OpenRouter / OpenAI - AI engines
- Dockerode - Docker API client
- node-telegram-bot-api - Telegram bot
- Winston - Logging
Made with ❤️ by @ismailperim
OnCallMate - Your AI on-call assistant. Because incidents don't wait for business hours. 🚨