Every AI session starts from zero. Not anymore.
Give your AI coding agent persistent memory — local, automatic, private.
macOS / Linux:

```bash
curl -fsSL statelessagent.com/install.sh | bash
```

Windows (PowerShell):

```powershell
irm statelessagent.com/install.ps1 | iex
```

If blocked by execution policy, run this first: `Set-ExecutionPolicy RemoteSigned -Scope CurrentUser`.
### Manual install (or have your AI do it)
If you'd rather not pipe to bash, or you're having an AI assistant install for you:
macOS (Apple Silicon):

```bash
mkdir -p ~/.local/bin
curl -fsSL https://github.com/sgx-labs/statelessagent/releases/latest/download/same-darwin-arm64 -o ~/.local/bin/same
chmod +x ~/.local/bin/same
export PATH="$HOME/.local/bin:$PATH"  # add to ~/.zshrc to persist
same init --yes
```

macOS (Intel):

```bash
mkdir -p ~/.local/bin
curl -fsSL https://github.com/sgx-labs/statelessagent/releases/latest/download/same-darwin-amd64 -o ~/.local/bin/same
chmod +x ~/.local/bin/same
export PATH="$HOME/.local/bin:$PATH"
same init --yes
```

Linux (x86_64):

```bash
mkdir -p ~/.local/bin
curl -fsSL https://github.com/sgx-labs/statelessagent/releases/latest/download/same-linux-amd64 -o ~/.local/bin/same
chmod +x ~/.local/bin/same
export PATH="$HOME/.local/bin:$PATH"
same init --yes
```

Build from source (any platform):

```bash
git clone --depth 1 https://github.com/sgx-labs/statelessagent.git
cd statelessagent && make install
same init --yes
```

Requires Ollama for local embeddings (or configure OpenAI).
- Surfaces the right context, not all context — semantic search finds the notes that matter. No manual copy-pasting, no token waste.
- Learns what helps — notes the agent actually uses get boosted for future sessions (feedback loop).
- Extracts decisions and generates handoffs — decisions get logged, session summaries get created, so the next session picks up where you left off.
- Runs entirely on your machine — Ollama embeddings + SQLite by default. No cloud, no API keys, no accounts. Optional OpenAI embeddings if you prefer.
```
Your Notes  →  Ollama   →  SQLite   →  Agent Remembers
   (.md)       (embed)     (search)     (hooks / MCP)
```
```bash
cd ~/my-notes
same init
```

One command checks Ollama, finds your notes, indexes them, generates config, and sets up integrations.
| Tool | Integration |
|---|---|
| Claude Code | Hooks + MCP |
| Cursor | MCP |
| Windsurf | MCP |
| Obsidian | Vault detection |
| Logseq | Vault detection |
| Any MCP client | 6 tools |
SAME works with any directory of `.md` files. No Obsidian required.
Use `same init --mcp-only` to skip Claude Code hooks and just register the MCP server.
Go · SQLite + sqlite-vec · Ollama / OpenAI
## CLI Reference
| Command | Description |
|---|---|
| `same init` | Interactive setup wizard |
| `same status` | At-a-glance system status |
| `same log` | Recent SAME activity |
| `same search <query>` | Search from the command line |
| `same related <path>` | Find similar notes |
| `same reindex [--force]` | Re-index markdown files |
| `same doctor` | System health check with fix suggestions |
| `same repair` | Back up database and force-rebuild index |
| `same feedback <path> up\|down` | Boost or penalize a note's retrieval confidence |
| `same display full\|compact\|quiet` | Control output verbosity |
| `same profile use precise\|balanced\|broad` | Adjust precision vs coverage |
| `same config show` | Show effective configuration |
| `same config edit` | Open config in `$EDITOR` |
| `same setup hooks` | Install/update Claude Code hooks |
| `same setup mcp` | Register MCP server |
| `same stats` | Index statistics |
| `same watch` | Auto-reindex on file changes |
| `same budget` | Context utilization report |
| `same vault list\|add\|remove` | Manage multiple vaults |
| `same version [--check]` | Version and update check |
## Configuration
SAME uses `.same/config.toml`, generated by `same init`:

```toml
[vault]
path = "/home/user/notes"
# skip_dirs = [".venv", "build"]
# noise_paths = ["experiments/", "raw_outputs/"]
handoff_dir = "sessions"
decision_log = "decisions.md"

[ollama]
url = "http://localhost:11434"

[embedding]
provider = "ollama"        # "ollama" (default) or "openai"
model = "nomic-embed-text" # or "text-embedding-3-small" for openai
# api_key = ""             # required for openai, or set SAME_EMBED_API_KEY

[memory]
max_token_budget = 800
max_results = 2
distance_threshold = 16.2
composite_threshold = 0.65

[hooks]
context_surfacing = true
decision_extractor = true
handoff_generator = true
feedback_loop = true
staleness_check = true
```

Configuration priority (highest wins):

1. CLI flags (`--vault`)
2. Environment variables (`VAULT_PATH`, `OLLAMA_URL`, `SAME_*`)
3. Config file (`.same/config.toml`)
4. Built-in defaults
| Variable | Default | Description |
|---|---|---|
| `VAULT_PATH` | auto-detect | Path to your markdown notes |
| `OLLAMA_URL` | `http://localhost:11434` | Ollama API (must be localhost) |
| `SAME_DATA_DIR` | `<vault>/.same/data` | Database location |
| `SAME_HANDOFF_DIR` | `sessions` | Handoff notes directory |
| `SAME_DECISION_LOG` | `decisions.md` | Decision log path |
| `SAME_EMBED_PROVIDER` | `ollama` | Embedding provider (`ollama` or `openai`) |
| `SAME_EMBED_MODEL` | `nomic-embed-text` | Embedding model name |
| `SAME_EMBED_API_KEY` | (none) | API key (required for `openai` provider) |
| `SAME_SKIP_DIRS` | (none) | Extra dirs to skip (comma-separated) |
| `SAME_NOISE_PATHS` | (none) | Paths filtered from context surfacing (comma-separated) |
## MCP Server
SAME exposes 6 tools via MCP:
| Tool | Description |
|---|---|
| `search_notes` | Semantic search |
| `search_notes_filtered` | Search with domain/workstream/tag filters |
| `get_note` | Read full note by path |
| `find_similar_notes` | Find related notes |
| `reindex` | Re-index the vault |
| `index_stats` | Index statistics |
## Display Modes
Control how much SAME shows when surfacing context:
| Mode | Command | Description |
|---|---|---|
| full | `same display full` | Box with note titles, match terms, token counts (default) |
| compact | `same display compact` | One-line summary: "surfaced 2 of 847 memories" |
| quiet | `same display quiet` | Silent — context injected with no visual output |
Display mode is saved to `.same/config.toml` and takes effect on the next prompt.
## Push Protection (Guard)
Prevent accidental git pushes when running multiple AI agents on the same machine.
```bash
# Enable push protection
same guard settings set push-protect on

# Before pushing, explicitly allow it
same push-allow

# Check guard status
same guard status
```

When enabled, a pre-push git hook blocks pushes unless a one-time ticket has been created via `same push-allow`. Tickets expire after 30 seconds by default (configurable via `same guard settings set push-timeout N`).
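The ticket mechanism can be pictured as a small gate function. This is only an illustrative sketch of the general idea — `allow_push` and the ticket-file location are invented for the example, and ticket expiry is omitted; SAME's real hook implementation may differ:

```shell
# Sketch of a one-time-ticket gate (illustrative; not SAME's actual hook).
# A pre-push hook looks for a ticket file created by `same push-allow`:
# if present, it consumes the ticket and allows the push; otherwise it blocks.
allow_push() {
  ticket="$1"
  if [ -f "$ticket" ]; then
    rm -f "$ticket"   # tickets are single-use
    return 0          # push allowed
  fi
  echo "push blocked: run 'same push-allow' first" >&2
  return 1            # push blocked
}
```

Because the ticket is deleted on first use, a second push on the same machine is blocked again until another `same push-allow` — which is what stops a second concurrent agent from pushing by accident.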
## Troubleshooting
"No vault found" SAME can't find your notes directory. Fix:
- Run
same initfrom inside your notes folder - Or set
VAULT_PATH=/path/to/notesin your environment - Or use
same vault add myproject /path/to/notes
"Ollama not responding" The embedding provider is unreachable. Fix:
- Check if Ollama is running (look for the llama icon)
- Test with:
curl http://localhost:11434/api/tags - If using a non-default port, set
OLLAMA_URL=http://localhost:<port>
Hooks not firing Context isn't being surfaced during Claude Code sessions. Fix:
- Run
same setup hooksto reinstall hooks - Verify with
same status(hooks should show as "active") - Check
.claude/settings.jsonexists in your project
Context not surfacing Hooks fire but no notes appear. Fix:
- Run
same doctorto check all components - Run
same reindexif your notes have changed - Try
same search "your query"to test search directly - Check if display mode is set to "quiet":
same config show
"Cannot open SAME database" The SQLite database is missing or corrupted. Fix:
- Run
same repairto back up and rebuild automatically - Or run
same initto set up from scratch - Or run
same reindex --forceto rebuild the index
**Do I need Obsidian?** No. Any directory of `.md` files works.

**Do I need Ollama?** By default, yes — SAME uses Ollama for local embeddings. But you can switch to OpenAI embeddings by setting `SAME_EMBED_PROVIDER=openai` and `SAME_EMBED_API_KEY`. Note: switching providers requires reindexing, since embedding dimensions differ.
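For example, a provider switch might look like this — the variable names come from the configuration table above, and the key value is a placeholder:

```shell
# Switch the embedding provider via environment variables (values illustrative).
export SAME_EMBED_PROVIDER=openai
export SAME_EMBED_MODEL=text-embedding-3-small
export SAME_EMBED_API_KEY="your-key-here"   # placeholder, not a real key
# Then force a full re-embed, since the stored vectors have different dimensions:
#   same reindex --force
```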
**Does it slow down my prompts?** Adds 50–200 ms. Embedding is the bottleneck — search and scoring take under 5 ms.

**Is my data sent anywhere?** SAME is fully local. Context surfaced to your AI tool is sent to that tool's API as part of your conversation, same as pasting it manually.

**How much disk space?** 5–15 MB for a few hundred notes.

**Can I use multiple vaults?** Yes: `same vault add work ~/work-notes && same vault default work`.
- All data stays local — no external API calls except Ollama on localhost
- Ollama URL validated to localhost-only
- `_PRIVATE/` directories excluded from indexing and context surfacing
- Snippets scanned for prompt injection patterns before injection
- Path traversal blocked in the MCP `get_note` tool
- Push protection — `same guard settings set push-protect on` requires an explicit `same push-allow` before `git push` (prevents accidental pushes when running multiple agents)
```bash
git clone https://github.com/sgx-labs/statelessagent.git
cd statelessagent && make install
```

Requires Go 1.23+ and CGO.
Buy me a coffee · GitHub Sponsors
BSL 1.1 — free for personal, educational, hobby, research, and evaluation use. Change date: 2030-02-02 (converts to Apache 2.0). See LICENSE.