A scientifically rigorous tracking system for how often Claude Code validates my life choices.
This code powers the https://cc.cengizhan.com/ website.
Originally forked from yoavf/absolutelyright, which powers absolutelyright.lol:

- Rust/Axum backend with SQLite
- Simple frontend with roughViz charts
- Basic "absolutely right" and "right" pattern tracking
### Backend Migration (Rust → Python)
- Converted from Rust/Axum to Python/FastAPI
- Switched to async SQLAlchemy with SQLite
- Maintained API compatibility for seamless migration
- Added support for dynamic pattern tracking
### Enhanced Pattern Tracking
- Extended beyond "absolutely" and "right" to include:
  - "Perfect!" - Tracking perfect responses
  - "Excellent!" - Tracking excellent responses
- Configurable patterns via `scripts/patterns_config.json`
- Backend now accepts any pattern as a field in the API
### Infrastructure & Deployment
- Added Docker support with multi-stage builds
- Railway deployment configuration
- Persistent volume setup for SQLite database
- Environment-based port configuration
### Automation & Monitoring
- Real-time watcher script that monitors Claude Code conversations
- Backfill script to import historical data
- macOS LaunchAgent integration for automatic background monitoring
- Installation script for easy setup
- Upload logging to track sync operations
### Data Tracking Improvements
- Total messages per day tracking
- Per-message pattern matching (not per-occurrence)
- Duplicate message ID detection
- Project-based breakdown of counts
- Frontend (`frontend/`) - Minimal HTML + JS, with charts drawn using roughViz
- Backend (`src/`) - Python/FastAPI server with async SQLite storage
- Scripts (`scripts/`) - Python tools to collect and upload counts from Claude Code sessions
- Docker - Containerized deployment, ready for Railway or any Docker host
Currently Tracking:
- Times Claude Code said you're "absolutely right"
- Times Claude Code said you're just "right"
- Times Claude Code said "Perfect!"
- Times Claude Code said "Excellent!"
- Total messages per day
- Python 3.11+
- Claude Code installed locally
- (Optional) Railway account for deployment
```shell
# Clone the repository
git clone https://github.com/hancengiz/absolutelyright-claude-code.git
cd absolutelyright-claude-code

# Create a virtual environment (recommended)
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```

Edit `scripts/patterns_config.json` to customize which patterns to track:
```json
{
  "server_url": "http://localhost:3003",
  "patterns": {
    "absolutely": "You(?:'re| are) absolutely right",
    "right": "You(?:'re| are) right",
    "perfect": "Perfect!",
    "excellent": "Excellent!",
    "custom_pattern": "Your custom regex here"
  }
}
```

```shell
# Start the server
python src/main.py

# Or use uvicorn directly with auto-reload
uvicorn src.main:app --host 0.0.0.0 --port 3003 --reload
```

Visit http://localhost:3003. The database (`counts.db`) will be created automatically on first run.
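To illustrate how the configured regexes are applied, here is a self-contained sketch of per-message matching (each pattern counts at most once per message, as the data-tracking notes above describe; the sample messages are invented):

```python
import re

# Patterns as they appear in patterns_config.json (a subset).
patterns = {
    "absolutely": "You(?:'re| are) absolutely right",
    "right": "You(?:'re| are) right",
    "perfect": "Perfect!",
}

messages = [
    "You're absolutely right, let me fix that.",
    "Perfect! Perfect! The tests pass.",  # matches count once per message
    "Here is the diff.",
]

counts = {name: 0 for name in patterns}
for text in messages:
    for name, regex in patterns.items():
        if re.search(regex, text):
            counts[name] += 1  # per-message, not per-occurrence

print(counts)  # {'absolutely': 1, 'right': 0, 'perfect': 1}
```

Note that the `right` pattern does not match "You're absolutely right" here, because that regex requires the two words to be adjacent.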
```shell
# This will scan your ~/.claude/projects directory
# and import all historical "absolutely right" counts
cd scripts
python3 backfill.py --upload http://localhost:3003
```

This script will:
- Scan all your Claude Code project `.jsonl` files
- Extract pattern matches from assistant messages
- Prompt for confirmation before uploading
- Upload daily aggregated counts to your local server
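The steps above can be sketched roughly as follows. This is an illustrative stand-in, not the real `backfill.py`: the `role`/`timestamp`/`text` field names are assumptions, not the actual Claude Code transcript schema.

```python
import json
import re
import tempfile
from collections import defaultdict
from pathlib import Path

PATTERNS = {"absolutely": re.compile(r"You(?:'re| are) absolutely right")}

def backfill_counts(projects_dir: Path) -> dict:
    """Aggregate per-day pattern counts from all .jsonl files under a directory.
    Field names ('role', 'timestamp', 'text') are illustrative only."""
    daily: dict = {}
    for path in projects_dir.rglob("*.jsonl"):
        for line in path.read_text().splitlines():
            msg = json.loads(line)
            if msg.get("role") != "assistant":
                continue
            day = msg["timestamp"][:10]  # "2025-10-22T09:00:00Z" -> "2025-10-22"
            counts = daily.setdefault(day, defaultdict(int))
            counts["total_messages"] += 1
            for name, rx in PATTERNS.items():
                if rx.search(msg.get("text", "")):
                    counts[name] += 1  # at most once per message
    return daily

# Demo against a throwaway directory.
with tempfile.TemporaryDirectory() as d:
    session = Path(d) / "proj" / "session.jsonl"
    session.parent.mkdir()
    session.write_text(
        '{"role": "assistant", "timestamp": "2025-10-22T09:00:00Z", '
        '"text": "You\'re absolutely right."}\n'
        '{"role": "user", "timestamp": "2025-10-22T09:01:00Z", "text": "thanks"}\n'
    )
    result = backfill_counts(Path(d))

print(dict(result["2025-10-22"]))  # {'total_messages': 1, 'absolutely': 1}
```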
For continuous monitoring of new Claude Code conversations:
- Generate a secret key:

```shell
openssl rand -base64 32
# Save this secret for use in the next steps
```

- Create the LaunchAgent plist file:
```shell
# Create the LaunchAgents directory if it doesn't exist
mkdir -p ~/Library/LaunchAgents

# Create the plist file
cat > ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist << 'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.cengizhan.absolutelyright.watcher</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/python3</string>
        <string>/Users/YOUR_USERNAME/code/absolutelyright-claude-code/scripts/watcher.py</string>
        <string>--secret</string>
        <string>YOUR_SECRET_HERE</string>
    </array>
    <key>WorkingDirectory</key>
    <string>/Users/YOUR_USERNAME/code/absolutelyright-claude-code</string>
    <key>StandardOutPath</key>
    <string>/Users/YOUR_USERNAME/code/absolutelyright-claude-code/logs/watcher.log</string>
    <key>StandardErrorPath</key>
    <string>/Users/YOUR_USERNAME/code/absolutelyright-claude-code/logs/watcher.error.log</string>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
    <key>ProcessType</key>
    <string>Background</string>
</dict>
</plist>
EOF
```

- Edit the plist file to add your details:
```shell
# Replace YOUR_USERNAME with your actual username
sed -i '' "s/YOUR_USERNAME/$(whoami)/g" ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist

# Replace YOUR_SECRET_HERE with your generated secret (manually edit the file)
nano ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist
```

- Load and start the LaunchAgent:
```shell
launchctl load ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist
```

For testing or one-time use:
```shell
# Run in background with nohup
nohup python3 scripts/watcher.py --secret "YOUR_SECRET" >> logs/watcher.log 2>&1 &
```

Note: this will not survive reboots. Use the LaunchAgent method for persistent monitoring.
```shell
# Check if the LaunchAgent is loaded
launchctl list | grep absolutelyright

# Check if the process is running
ps aux | grep watcher.py | grep -v grep

# View logs (logs rotate daily, keeping 7 days)
tail -f logs/watcher.log
tail -f logs/watcher.error.log

# View upload history
tail -f logs/uploads.log
```

```shell
# Stop the watcher
launchctl unload ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist

# Start the watcher
launchctl load ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist

# Restart the watcher
launchctl unload ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist && \
launchctl load ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist
```

What the watcher does:
- Monitors Claude Code conversations every 2 seconds
- Detects pattern matches in real-time
- Uploads counts to your server with the secret key
- Runs automatically on system startup
- Restarts automatically if it crashes
- Logs rotate daily at midnight (keeps 7 days)
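The 2-second polling described above can be sketched with a simplified, hypothetical tail function (the real `watcher.py` additionally matches patterns and uploads counts):

```python
import tempfile
from pathlib import Path

def poll_new_lines(path: Path, offset: int) -> tuple[list[str], int]:
    """Return lines appended to `path` since byte `offset`, plus the new offset.
    A simplified stand-in for how a watcher can tail growing .jsonl transcripts;
    the real watcher re-checks roughly every 2 seconds."""
    data = path.read_bytes()
    new_lines = data[offset:].decode("utf-8", errors="replace").splitlines()
    return new_lines, len(data)

# Demonstrate with a temporary transcript file.
with tempfile.TemporaryDirectory() as d:
    transcript = Path(d) / "session.jsonl"
    transcript.write_text('{"msg": "first"}\n')
    lines, offset = poll_new_lines(transcript, 0)

    with transcript.open("a") as f:
        f.write('{"msg": "second"}\n')
    more, offset = poll_new_lines(transcript, offset)

print(lines)  # ['{"msg": "first"}']
print(more)   # ['{"msg": "second"}']
```

Tracking a byte offset per file keeps each poll cheap and makes restarts easy: only content added since the last poll is re-processed.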
1. Create a Railway project:
   - Connect your GitHub repository
   - Railway will auto-detect the Dockerfile

2. Configure environment variables:

   ```shell
   PORT=3003                          # Railway will override this with its own PORT
   ABSOLUTELYRIGHT_SECRET=YOUR_GENERATED_SECRET
   DATABASE_PATH=/app/data/counts.db  # Optional, defaults to counts.db
   ```

3. Add a volume:
   - Mount point: `/app/data`
   - This ensures your SQLite database persists across deployments

4. Deploy:
   - Railway will automatically build and deploy
   - Health check endpoint: `/api/today`

5. Update your local config by editing `scripts/patterns_config.json`:

   ```json
   {
     "server_url": "https://your-app.railway.app",
     ...
   }
   ```

6. Reinstall the watcher with the new URL:

   ```shell
   ./scripts/install-watcher.sh "YOUR_SECRET"
   ```
```text
absolutelyright/
├── frontend/                 # Static web frontend
│   ├── index.html            # Main page
│   ├── frontend.js           # Data fetching and chart rendering
│   └── style.css             # Styling
├── src/                      # Python backend
│   ├── main.py               # FastAPI application and routes
│   ├── models.py             # SQLAlchemy models
│   └── database.py           # Database configuration
├── scripts/                  # Data collection tools
│   ├── backfill.py           # Import historical data
│   ├── watcher.py            # Real-time monitoring
│   ├── claude_counter.py     # Core counting logic
│   ├── patterns_config.json  # Pattern definitions
│   └── install-watcher.sh    # macOS service installer
├── Dockerfile                # Container configuration
├── railway.json              # Railway deployment config
├── requirements.txt          # Python dependencies
└── counts.db                 # SQLite database (auto-created)
```
### GET /api/today

Returns today's pattern counts (aggregated across all workstations).

Response:

```json
{
  "absolutely": 3,
  "right": 2,
  "perfect": 15,
  "excellent": 8,
  "total_messages": 450
}
```

Cache: 1 minute
### GET /api/history

Returns all historical data, ordered by date (aggregated across all workstations).

Response:

```json
[
  {
    "day": "2025-10-22",
    "absolutely": 4,
    "right": 4,
    "perfect": 47,
    "excellent": 12,
    "total_messages": 1057
  },
  ...
]
```

Cache: 5 minutes
### GET /api/by-workstation

Returns data grouped by workstation for debugging and inspection.

Response:

```json
[
  {
    "workstation_id": "cengizs-MacBook-Pro",
    "history": [
      {
        "day": "2025-10-22",
        "absolutely": 4,
        "right": 4,
        "perfect": 47,
        "excellent": 12,
        "total_messages": 1057
      },
      ...
    ]
  },
  ...
]
```

Cache: 1 minute
Use cases:
- Debug which workstation contributed data
- Verify all machines are uploading correctly
- Inspect data distribution across workstations
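Conceptually, the aggregated endpoints can be reconstructed from this per-workstation data; a small illustrative sketch (the sample values are invented):

```python
from collections import defaultdict

# Sample payload in the shape returned by /api/by-workstation (values invented).
by_workstation = [
    {"workstation_id": "laptop",
     "history": [{"day": "2025-10-22", "absolutely": 4, "total_messages": 1057}]},
    {"workstation_id": "desktop",
     "history": [{"day": "2025-10-22", "absolutely": 1, "total_messages": 300}]},
]

# Sum every numeric field per day across workstations -- conceptually
# what the aggregated /api/history response contains.
per_day: dict = {}
for ws in by_workstation:
    for entry in ws["history"]:
        day_totals = per_day.setdefault(entry["day"], defaultdict(int))
        for key, value in entry.items():
            if key != "day":
                day_totals[key] += value

print(dict(per_day["2025-10-22"]))  # {'absolutely': 5, 'total_messages': 1357}
```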
### POST /api/set

Upload counts for a specific day and workstation.

Request:

```json
{
  "day": "2025-10-25",
  "workstation_id": "cengizs-MacBook-Pro",
  "absolutely": 3,
  "right": 2,
  "perfect": 75,
  "excellent": 20,
  "total_messages": 2024,
  "secret": "YOUR_SECRET"
}
```

Response:

```json
"ok"
```

Notes:

- Requires the `ABSOLUTELYRIGHT_SECRET` environment variable to be set
- Requires the `workstation_id` field (automatically added by the scripts)
- Supports dynamic pattern fields (any numeric field becomes a pattern)
- Each workstation's data is stored separately and aggregated on read
- Legacy fields `count` and `right_count` are supported for backward compatibility
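A minimal client for this endpoint could look like the following standard-library sketch (the project's actual upload code may differ; `example-host` is an illustrative workstation ID):

```python
import json
import urllib.request

def build_set_request(server_url: str, payload: dict) -> urllib.request.Request:
    """Build a POST /api/set request; sending is left to the caller."""
    return urllib.request.Request(
        server_url.rstrip("/") + "/api/set",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_set_request("http://localhost:3003", {
    "day": "2025-10-25",
    "workstation_id": "example-host",  # illustrative; the scripts detect the real hostname
    "absolutely": 3,
    "total_messages": 2024,
    "secret": "YOUR_SECRET",
})
# urllib.request.urlopen(req)  # uncomment to actually send the upload
print(req.full_url, req.get_method())  # http://localhost:3003/api/set POST
```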
```shell
# Format Python code
black src/

# Lint Python code
ruff check src/
# or
pylint src/
```

### Database Schema

```sql
CREATE TABLE day_counts (
    day VARCHAR NOT NULL,             -- Date in YYYY-MM-DD format
    workstation_id VARCHAR NOT NULL,  -- Machine identifier (e.g., "cengizs-MacBook-Pro")
    patterns TEXT NOT NULL,           -- JSON object with pattern counts
    total_messages INTEGER,           -- Total assistant messages that day
    PRIMARY KEY (day, workstation_id) -- Composite key for multi-workstation support
);
```

Example patterns JSON:
```json
{
  "absolutely": 4,
  "right": 4,
  "perfect": 47,
  "excellent": 12
}
```

Multi-workstation support:
- Each workstation's data is stored separately
- API endpoints (`/api/today`, `/api/history`) automatically aggregate across all workstations
- The workstation ID is detected automatically (macOS LocalHostName or hostname)
- No data conflicts when running from multiple machines
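To show how the composite primary key supports multiple workstations, here is a small stdlib `sqlite3` sketch using the `day_counts` schema above (the sample rows are invented; the read-side aggregation mirrors what the aggregated endpoints do conceptually):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE day_counts (
    day VARCHAR NOT NULL,
    workstation_id VARCHAR NOT NULL,
    patterns TEXT NOT NULL,
    total_messages INTEGER,
    PRIMARY KEY (day, workstation_id)
)""")

# Two workstations reporting the same day: the composite key keeps
# their rows separate, so uploads from different machines never collide.
rows = [
    ("2025-10-22", "laptop",  json.dumps({"absolutely": 4, "right": 4}), 1057),
    ("2025-10-22", "desktop", json.dumps({"absolutely": 1, "right": 2}), 300),
]
conn.executemany("INSERT OR REPLACE INTO day_counts VALUES (?,?,?,?)", rows)

# Aggregation happens at read time, across all workstations.
totals = {}
for day, _, patterns, _ in conn.execute("SELECT * FROM day_counts"):
    for name, n in json.loads(patterns).items():
        totals[name] = totals.get(name, 0) + n

print(totals)  # {'absolutely': 5, 'right': 6}
```

`INSERT OR REPLACE` keyed on `(day, workstation_id)` also means a workstation re-uploading the same day simply overwrites its own row rather than double-counting.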
- `PORT` - Server port (default: 3003)
- `ABSOLUTELYRIGHT_SECRET` - API secret for uploads (optional, but recommended)
- `DATABASE_PATH` - Custom database file path (default: `counts.db`)
- `CLAUDE_PROJECTS` - Claude projects directory (default: `~/.claude/projects`)
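A sketch of reading these variables with their documented defaults (illustrative; not the project's actual startup code):

```python
import os

# Read server settings, falling back to the documented defaults.
port = int(os.environ.get("PORT", "3003"))
secret = os.environ.get("ABSOLUTELYRIGHT_SECRET")  # optional; uploads are unauthenticated if unset
db_path = os.environ.get("DATABASE_PATH", "counts.db")
projects = os.environ.get("CLAUDE_PROJECTS", os.path.expanduser("~/.claude/projects"))

print(port, db_path)
```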
The database is created automatically. If you see errors:
```shell
# Check that the database file exists
ls -la counts.db

# Check permissions
chmod 644 counts.db
```

If the watcher service is not running:

```shell
# Check if the service is running
launchctl list | grep absolutelyright

# Check logs for errors
tail -f logs/watcher.error.log

# Restart the service
launchctl unload ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist
launchctl load ~/Library/LaunchAgents/com.cengizhan.absolutelyright.watcher.plist
```

If the Railway deployment fails:

- Ensure the `ABSOLUTELYRIGHT_SECRET` environment variable is set
- Check that the volume is mounted at `/app/data`
- Verify that the health check endpoint `/api/today` is accessible
- Check the deployment logs in the Railway dashboard
- Verify that Claude Code projects exist at `~/.claude/projects`
- Check that the `.jsonl` files contain data
- Run the backfill manually to see what's found:

```shell
python3 scripts/backfill.py  # Don't pass --upload if you just want to see the counts
```
The production server at cc.cengizhan.com has an automated backup system:
- Schedule: Runs every 6 hours (00:00, 06:00, 12:00, 18:00 UTC)
- Location: Backups are committed to the `backups/` directory in the repository
- Format: A JSON file (`database.json`) containing all workstation data
- Retention: Git history preserves all backup versions
How it works:
- A cron job runs `scripts/backup_db.py` every 6 hours
- The script fetches data from the `/api/by-workstation` endpoint
- Saves it to `backups/database.json` with a timestamp
- Auto-commits and pushes to GitHub if the data changed
To create a fresh backup from the production server:
```shell
# Pull the latest backup from production and save it locally
python3 scripts/backup_db.py

# This will:
# 1. Fetch data from https://cc.cengizhan.com/api/by-workstation
# 2. Save it to backups/database.json
# 3. Show statistics (X records from Y workstations)
# 4. Commit and push to git (if the data changed)
```

To pull the latest production data to your local environment for testing:
```shell
# 1. Download the latest backup from production
python3 scripts/backup_db.py

# 2. Restore it to your local database
python3 scripts/restore_backup.py backups/database.json

# 3. Run the local server
python3 src/main.py

# Now your local environment has all production data
# Visit http://localhost:3003 to see it
```

To restore your local database from the latest backup (useful for testing with production data):
```shell
# Restore from the backup file to the local database
python3 scripts/restore_backup.py backups/database.json

# This will:
# 1. Clear your local counts.db database
# 2. Import all records from the backup
# 3. Show statistics (X records imported across Y workstations)
```

Example workflow for local testing:
```shell
# 1. Pull the latest backup from production
python3 scripts/backup_db.py

# 2. Restore it to your local database
python3 scripts/restore_backup.py backups/database.json

# 3. Run the local server to test
python3 src/main.py

# 4. Visit http://localhost:3003 to see production data locally
```

To restore the production database from a backup:
```shell
# 1. First, create a fresh backup of the current production data (safety first!)
python3 scripts/backup_db.py

# 2. Restore production from a specific backup file
python3 scripts/restore_backup.py backups/database.json --upload https://cc.cengizhan.com --secret YOUR_SECRET

# This will:
# 1. Read the backup file
# 2. Upload each record to production via POST /api/set
# 3. Require ABSOLUTELYRIGHT_SECRET for authentication
```

Recovery scenarios:
- Database corruption: restore from the latest `backups/database.json`
- Accidental data loss: find the backup in git history from before the incident
- Migration to a new server: export a backup, deploy the new server, restore the backup
The backup file (`backups/database.json`) is structured as an array of workstation objects:

```json
[
  {
    "workstation_id": "cengizs-MacBook-Pro",
    "history": [
      {
        "day": "2025-10-22",
        "absolutely": 4,
        "right": 4,
        "perfect": 47,
        "excellent": 12,
        "total_messages": 1057
      },
      ...
    ]
  },
  ...
]
```

Benefits of this format:
- Human-readable (JSON)
- Version controlled (Git)
- Easy to inspect and verify
- Can be edited manually if needed
- Supports multiple workstations
- Before major changes: always create a fresh backup

```shell
python3 scripts/backup_db.py
git commit -m "Backup before database migration"
```

- Test restores regularly: verify that backups work

```shell
# Test a restore to the local database
python3 scripts/restore_backup.py backups/database.json
```

- Keep git history: don't delete old backups from git
  - Git preserves all backup versions
  - You can roll back to any point in time
  - Use `git log backups/database.json` to see the history

- Monitor backup automation: check that production backups are running

```shell
# Check recent backup commits
git log --oneline backups/database.json | head -5
# Should show commits every 6 hours
```
This is a personal fork, but feel free to:
- Fork this repo for your own tracking
- Submit issues for bugs
- Create PRs for improvements
MIT License - see the original yoavf/absolutelyright repository.
