Zero overhead. Zero compromise. 100% Rust. 100% Agnostic.
⚡️ Runs on $10 hardware with <5MB RAM: that's 99% less memory than OpenClaw and 98% cheaper than a Mac mini!
Built by students and members of the Harvard, MIT, and Sundai.Club communities.
Fast, small, and fully autonomous AI assistant infrastructure
Deploy anywhere. Swap anything.
~3.4MB binary · <10ms startup · 1,017 tests · 23+ providers · 8 traits · Pluggable everything
- 🏎️ Ultra-Lightweight: <5MB memory footprint – 99% smaller than OpenClaw core.
- 💰 Minimal Cost: efficient enough to run on $10 hardware – 98% cheaper than a Mac mini.
- ⚡ Lightning Fast: 400× faster startup – boots in <10ms (under 1s even on 0.6GHz cores).
- 🌍 True Portability: single self-contained binary across ARM, x86, and RISC-V.
- Lean by default: small Rust binary, fast startup, low memory footprint.
- Secure by design: pairing, strict sandboxing, explicit allowlists, workspace scoping.
- Fully swappable: core systems are traits (providers, channels, tools, memory, tunnels).
- No lock-in: OpenAI-compatible provider support + pluggable custom endpoints.
Local machine quick benchmark (macOS arm64, Feb 2026) normalized for 0.8GHz edge hardware.
| | OpenClaw | NanoBot | PicoClaw | ZeroClaw 🦀 |
|---|---|---|---|---|
| Language | TypeScript | Python | Go | Rust |
| RAM | > 1GB | > 100MB | < 10MB | < 5MB |
| Startup (0.8GHz core) | > 500s | > 30s | < 1s | < 10ms |
| Binary Size | ~28MB (dist) | N/A (Scripts) | ~8MB | 3.4 MB |
| Cost | Mac Mini $599 | Linux SBC ~$50 | Linux Board $10 | Any hardware $10 |
Notes: ZeroClaw results measured with /usr/bin/time -l on release builds. OpenClaw requires a Node.js runtime (~390MB overhead). PicoClaw and ZeroClaw are static binaries.
Reproduce ZeroClaw numbers locally:
cargo build --release
ls -lh target/release/zeroclaw
/usr/bin/time -l target/release/zeroclaw --help
/usr/bin/time -l target/release/zeroclaw status
Windows
- Visual Studio Build Tools (provides the MSVC linker and Windows SDK):
  winget install Microsoft.VisualStudio.2022.BuildTools
  During installation (or via the Visual Studio Installer), select the "Desktop development with C++" workload.
- Rust toolchain:
  winget install Rustlang.Rustup
  After installation, open a new terminal and run rustup default stable to ensure the stable toolchain is active.
- Verify both are working:
  rustc --version
  cargo --version
- Docker Desktop – required only if using the Docker sandboxed runtime (runtime.kind = "docker"). Install via winget install Docker.DockerDesktop.
Linux / macOS
- Build essentials:
  - Linux (Debian/Ubuntu): sudo apt install build-essential pkg-config
  - Linux (Fedora/RHEL): sudo dnf groupinstall "Development Tools" && sudo dnf install pkg-config
  - macOS: install Xcode Command Line Tools: xcode-select --install
- Rust toolchain:
  curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  See rustup.rs for details.
- Verify both are working:
  rustc --version
  cargo --version
Or skip the steps above and install everything (system deps, Rust, ZeroClaw) in a single command:
curl -LsSf https://raw.githubusercontent.com/zeroclaw-labs/zeroclaw/main/scripts/install.sh | bash
- Docker – required only if using the Docker sandboxed runtime (runtime.kind = "docker"). Install via your package manager or docker.com.
Note: The default cargo build --release uses codegen-units=1 for compatibility with low-memory devices (e.g., Raspberry Pi 3 with 1GB RAM). For faster builds on powerful machines, use cargo build --profile release-fast.
git clone https://github.com/zeroclaw-labs/zeroclaw.git
cd zeroclaw
cargo build --release --locked
cargo install --path . --force --locked
# Ensure ~/.cargo/bin is in your PATH
export PATH="$HOME/.cargo/bin:$PATH"
# Quick setup (no prompts)
zeroclaw onboard --api-key sk-... --provider openrouter
# Or interactive wizard
zeroclaw onboard --interactive
# Or quickly repair channels/allowlists only
zeroclaw onboard --channels-only
# Chat
zeroclaw agent -m "Hello, ZeroClaw!"
# Interactive mode
zeroclaw agent
# Start the gateway (webhook server)
zeroclaw gateway # default: 127.0.0.1:8080
zeroclaw gateway --port 0 # random port (security hardened)
# Start full autonomous runtime
zeroclaw daemon
# Check status
zeroclaw status
zeroclaw auth status
# Run system diagnostics
zeroclaw doctor
# Check channel health
zeroclaw channel doctor
# Bind a Telegram identity into allowlist
zeroclaw channel bind-telegram 123456789
# Get integration setup details
zeroclaw integrations info Telegram
# Note: Channels (Telegram, Discord, Slack) require daemon to be running
# zeroclaw daemon
# Manage background service
zeroclaw service install
zeroclaw service status
# Migrate memory from OpenClaw (safe preview first)
zeroclaw migrate openclaw --dry-run
zeroclaw migrate openclaw
Dev fallback (no global install): prefix commands with cargo run --release -- (example: cargo run --release -- status).
ZeroClaw now supports subscription-native auth profiles (multi-account, encrypted at rest).
- Store file: ~/.zeroclaw/auth-profiles.json
- Encryption key: ~/.zeroclaw/.secret_key
- Profile id format: <provider>:<profile_name> (example: openai-codex:work)
OpenAI Codex OAuth (ChatGPT subscription):
# Recommended on servers/headless
zeroclaw auth login --provider openai-codex --device-code
# Browser/callback flow with paste fallback
zeroclaw auth login --provider openai-codex --profile default
zeroclaw auth paste-redirect --provider openai-codex --profile default
# Check / refresh / switch profile
zeroclaw auth status
zeroclaw auth refresh --provider openai-codex --profile default
zeroclaw auth use --provider openai-codex --profile work
Claude Code / Anthropic setup-token:
# Paste subscription/setup token (Authorization header mode)
zeroclaw auth paste-token --provider anthropic --profile default --auth-kind authorization
# Alias command
zeroclaw auth setup-token --provider anthropic --profile default
Run the agent with subscription auth:
zeroclaw agent --provider openai-codex -m "hello"
zeroclaw agent --provider openai-codex --auth-profile openai-codex:work -m "hello"
# Anthropic supports both API key and auth token env vars:
# ANTHROPIC_AUTH_TOKEN, ANTHROPIC_OAUTH_TOKEN, ANTHROPIC_API_KEY
zeroclaw agent --provider anthropic -m "hello"
Every subsystem is a trait – swap implementations with a config change, zero code changes.
| Subsystem | Trait | Ships with | Extend |
|---|---|---|---|
| AI Models | Provider | 23+ providers (OpenRouter, Anthropic, OpenAI, Ollama, Venice, Groq, Mistral, xAI, DeepSeek, Together, Fireworks, Perplexity, Cohere, Bedrock, Astrai, etc.) | custom:https://your-api.com → any OpenAI-compatible API |
| Channels | Channel | CLI, Telegram, Discord, Slack, Mattermost, iMessage, Matrix, WhatsApp, Webhook | Any messaging API |
| Memory | Memory | SQLite with hybrid search (FTS5 + vector cosine similarity), Lucid bridge (CLI sync + SQLite fallback), Markdown | Any persistence backend |
| Tools | Tool | shell, file_read, file_write, memory_store, memory_recall, memory_forget, browser_open (Brave + allowlist), browser (agent-browser / rust-native), composio (optional) | Any capability |
| Observability | Observer | Noop, Log, Multi | Prometheus, OTel |
| Runtime | RuntimeAdapter | Native, Docker (sandboxed) | WASM (planned; unsupported kinds fail fast) |
| Security | SecurityPolicy | Gateway pairing, sandbox, allowlists, rate limits, filesystem scoping, encrypted secrets | – |
| Identity | IdentityConfig | OpenClaw (markdown), AIEOS v1.1 (JSON) | Any identity format |
| Tunnel | Tunnel | None, Cloudflare, Tailscale, ngrok, Custom | Any tunnel binary |
| Heartbeat | Engine | HEARTBEAT.md periodic tasks | – |
| Skills | Loader | TOML manifests + SKILL.md instructions | Community skill packs |
| Integrations | Registry | 50+ integrations across 9 categories | Plugin system |
- ✅ Supported today: runtime.kind = "native" or runtime.kind = "docker"
- 🚧 Planned, not implemented yet: WASM / edge runtimes
When an unsupported runtime.kind is configured, ZeroClaw now exits with a clear error instead of silently falling back to native.
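The fail-fast behavior amounts to a simple validation step. As a hypothetical sketch (the function name and error text are ours, not ZeroClaw's actual code):

```python
SUPPORTED_RUNTIME_KINDS = {"native", "docker"}

def validate_runtime_kind(kind: str) -> str:
    """Reject unsupported runtime kinds up front instead of silently
    falling back to native (illustrative analogue of ZeroClaw's check)."""
    if kind not in SUPPORTED_RUNTIME_KINDS:
        raise ValueError(
            f"unsupported runtime.kind = {kind!r}; "
            f"supported kinds: {sorted(SUPPORTED_RUNTIME_KINDS)}"
        )
    return kind
```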
All custom, zero external dependencies – no Pinecone, no Elasticsearch, no LangChain:
| Layer | Implementation |
|---|---|
| Vector DB | Embeddings stored as BLOB in SQLite, cosine similarity search |
| Keyword Search | FTS5 virtual tables with BM25 scoring |
| Hybrid Merge | Custom weighted merge function (vector.rs) |
| Embeddings | EmbeddingProvider trait β OpenAI, custom URL, or noop |
| Chunking | Line-based markdown chunker with heading preservation |
| Caching | SQLite embedding_cache table with LRU eviction |
| Safe Reindex | Rebuild FTS5 + re-embed missing vectors atomically |
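The weighted hybrid merge can be sketched roughly like this – a hypothetical Python analogue of the merge step described above (the real logic lives in vector.rs), assuming both result sets carry scores already normalized to [0, 1]:

```python
def hybrid_merge(vector_hits, keyword_hits, vector_weight=0.7, keyword_weight=0.3):
    """Merge vector-similarity and BM25 keyword hits into one ranked list.

    Each input is a list of (doc_id, score) pairs; documents appearing in
    both sets accumulate both weighted contributions.
    """
    merged = {}
    for doc_id, score in vector_hits:
        merged[doc_id] = merged.get(doc_id, 0.0) + vector_weight * score
    for doc_id, score in keyword_hits:
        merged[doc_id] = merged.get(doc_id, 0.0) + keyword_weight * score
    # Highest combined score first
    return sorted(merged.items(), key=lambda kv: kv[1], reverse=True)
```

The 0.7/0.3 defaults mirror the vector_weight/keyword_weight settings shown in the [memory] config below.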
The agent automatically recalls, saves, and manages memory via tools.
[memory]
backend = "sqlite" # "sqlite", "lucid", "markdown", "none"
auto_save = true
embedding_provider = "openai"
vector_weight = 0.7
keyword_weight = 0.3
# backend = "none" uses an explicit no-op memory backend (no persistence)
# Optional for backend = "sqlite": max seconds to wait when opening the DB (e.g. file locked). Omit or leave unset for no timeout.
# sqlite_open_timeout_secs = 30
# Optional for backend = "lucid"
# ZEROCLAW_LUCID_CMD=/usr/local/bin/lucid # default: lucid
# ZEROCLAW_LUCID_BUDGET=200 # default: 200
# ZEROCLAW_LUCID_LOCAL_HIT_THRESHOLD=3 # local hit count to skip external recall
# ZEROCLAW_LUCID_RECALL_TIMEOUT_MS=120 # low-latency budget for lucid context recall
# ZEROCLAW_LUCID_STORE_TIMEOUT_MS=800 # async sync timeout for lucid store
# ZEROCLAW_LUCID_FAILURE_COOLDOWN_MS=15000 # cooldown after lucid failure to avoid repeated slow attempts
ZeroClaw enforces security at every layer – not just the sandbox. It passes all items from the community security checklist.
| # | Item | Status | How |
|---|---|---|---|
| 1 | Gateway not publicly exposed | ✅ | Binds 127.0.0.1 by default. Refuses 0.0.0.0 without tunnel or explicit allow_public_bind = true. |
| 2 | Pairing required | ✅ | 6-digit one-time code on startup. Exchange via POST /pair for bearer token. All /webhook requests require Authorization: Bearer <token>. |
| 3 | Filesystem scoped (no /) | ✅ | workspace_only = true by default. 14 system dirs + 4 sensitive dotfiles blocked. Null byte injection blocked. Symlink escape detection via canonicalization + resolved-path workspace checks in file read/write tools. |
| 4 | Access via tunnel only | ✅ | Gateway refuses public bind without active tunnel. Supports Tailscale, Cloudflare, ngrok, or any custom tunnel. |
Run your own nmap:
nmap -p 1-65535 <your-host>
→ ZeroClaw binds to localhost only, so nothing is exposed unless you explicitly configure a tunnel.
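The pairing handshake from checklist item 2 can be sketched as two requests. The endpoint paths, header names, and message body below come straight from the table; the helper functions and request-dict shape are illustrative only (the /pair response format is not specified here, so token extraction is left out):

```python
import json

def build_pair_request(pairing_code: str) -> dict:
    # Exchange the 6-digit one-time code printed at gateway startup
    # for a bearer token via POST /pair.
    return {
        "method": "POST",
        "path": "/pair",
        "headers": {"X-Pairing-Code": pairing_code},
    }

def build_webhook_request(bearer_token: str, message: str) -> dict:
    # Every /webhook call must carry the bearer token obtained from /pair.
    return {
        "method": "POST",
        "path": "/webhook",
        "headers": {
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"message": message}),
    }
```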
Inbound sender policy is now consistent:
- Empty allowlist = deny all inbound messages
- "*" = allow all (explicit opt-in)
- Otherwise = exact-match allowlist
This keeps accidental exposure low by default.
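The three rules translate to just a few lines. A hypothetical sketch of the policy (not the actual implementation):

```python
def is_sender_allowed(sender: str, allowlist: list[str]) -> bool:
    """Inbound sender policy: empty list denies all, "*" allows all,
    otherwise exact match."""
    if not allowlist:
        return False            # deny-by-default
    if "*" in allowlist:
        return True             # explicit opt-in to everyone
    return sender in allowlist  # exact-match allowlist
```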
Recommended low-friction setup (secure + fast):
- Telegram: allowlist your own @username (without @) and/or your numeric Telegram user ID.
- Discord: allowlist your own Discord user ID.
- Slack: allowlist your own Slack member ID (usually starts with U).
- Mattermost: uses standard API v4. Allowlists use Mattermost user IDs.
- Use "*" only for temporary open testing.
Telegram operator-approval flow:
- Keep [channels_config.telegram].allowed_users = [] for deny-by-default startup.
- Unauthorized users receive a hint with a copyable operator command: zeroclaw channel bind-telegram <IDENTITY>.
- Operator runs that command locally, then the user retries sending a message.
If you need a one-shot manual approval, run:
zeroclaw channel bind-telegram 123456789
If you're not sure which identity to use:
- Start channels and send one message to your bot.
- Read the warning log to see the exact sender identity.
- Add that value to the allowlist and rerun channels-only setup.
If you hit authorization warnings in logs (for example: ignoring message from unauthorized user),
rerun channel setup only:
zeroclaw onboard --channels-only
Telegram routing now replies to the source chat ID from incoming updates (instead of usernames), which avoids Bad Request: chat not found failures.
For non-text replies, ZeroClaw can send Telegram attachments when the assistant includes markers:
[IMAGE:<path-or-url>] [DOCUMENT:<path-or-url>] [VIDEO:<path-or-url>] [AUDIO:<path-or-url>] [VOICE:<path-or-url>]
Paths can be local files (for example /tmp/screenshot.png) or HTTPS URLs.
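An illustrative sketch of how the markers above could be parsed out of an assistant reply (hypothetical code, not ZeroClaw's actual extraction logic):

```python
import re

MARKER_RE = re.compile(r"\[(IMAGE|DOCUMENT|VIDEO|AUDIO|VOICE):([^\]]+)\]")

def extract_attachments(reply: str):
    """Return (clean_text, [(kind, path_or_url), ...]) for markers like
    [IMAGE:/tmp/screenshot.png] embedded in an assistant reply."""
    attachments = [(m.group(1).lower(), m.group(2)) for m in MARKER_RE.finditer(reply)]
    clean = MARKER_RE.sub("", reply).strip()
    return clean, attachments
```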
WhatsApp uses Meta's Cloud API with webhooks (push-based, not polling):
- Create a Meta Business App:
  - Go to developers.facebook.com
  - Create a new app → select "Business" type
  - Add the "WhatsApp" product
- Get your credentials:
  - Access Token: from WhatsApp → API Setup → Generate token (or create a System User for permanent tokens)
  - Phone Number ID: from WhatsApp → API Setup → Phone number ID
  - Verify Token: you define this (any random string) – Meta will send it back during webhook verification
- Configure ZeroClaw:
  [channels_config.whatsapp]
  access_token = "EAABx..."
  phone_number_id = "123456789012345"
  verify_token = "my-secret-verify-token"
  allowed_numbers = ["+1234567890"] # E.164 format, or ["*"] for all
- Start the gateway with a tunnel:
  zeroclaw gateway --port 8080
  WhatsApp requires HTTPS, so use a tunnel (ngrok, Cloudflare, Tailscale Funnel).
- Configure the Meta webhook:
  - In Meta Developer Console → WhatsApp → Configuration → Webhook
  - Callback URL: https://your-tunnel-url/whatsapp
  - Verify Token: same as your verify_token in config
  - Subscribe to the messages field
- Test: send a message to your WhatsApp Business number – ZeroClaw will respond via the LLM.
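The webhook verification in the setup above follows Meta's standard handshake: Meta sends a GET with hub.mode, hub.verify_token, and hub.challenge, and the server must echo the challenge back only when the token matches. A minimal sketch (hypothetical helper, not ZeroClaw's handler):

```python
def verify_whatsapp_webhook(params: dict, expected_token: str):
    """Return (status, body) for Meta's GET verification request.

    Echoes hub.challenge on a token match; otherwise rejects with 403.
    """
    if (params.get("hub.mode") == "subscribe"
            and params.get("hub.verify_token") == expected_token):
        return 200, params.get("hub.challenge", "")
    return 403, ""
```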
Config: ~/.zeroclaw/config.toml (created by onboard)
api_key = "sk-..."
default_provider = "openrouter"
default_model = "anthropic/claude-sonnet-4-20250514"
default_temperature = 0.7
# Custom OpenAI-compatible endpoint
# default_provider = "custom:https://your-api.com"
# Custom Anthropic-compatible endpoint
# default_provider = "anthropic-custom:https://your-api.com"
[memory]
backend = "sqlite" # "sqlite", "lucid", "markdown", "none"
auto_save = true
embedding_provider = "openai" # "openai", "noop"
vector_weight = 0.7
keyword_weight = 0.3
# backend = "none" disables persistent memory via no-op backend
[gateway]
require_pairing = true # require pairing code on first connect
allow_public_bind = false # refuse 0.0.0.0 without tunnel
[autonomy]
level = "supervised" # "readonly", "supervised", "full" (default: supervised)
workspace_only = true # default: true β scoped to workspace
allowed_commands = ["git", "npm", "cargo", "ls", "cat", "grep"]
forbidden_paths = ["/etc", "/root", "/proc", "/sys", "~/.ssh", "~/.gnupg", "~/.aws"]
[runtime]
kind = "native" # "native" or "docker"
[runtime.docker]
image = "alpine:3.20" # container image for shell execution
network = "none" # docker network mode ("none", "bridge", etc.)
memory_limit_mb = 512 # optional memory limit in MB
cpu_limit = 1.0 # optional CPU limit
read_only_rootfs = true # mount root filesystem as read-only
mount_workspace = true # mount workspace into /workspace
allowed_workspace_roots = [] # optional allowlist for workspace mount validation
[heartbeat]
enabled = false
interval_minutes = 30
[tunnel]
provider = "none" # "none", "cloudflare", "tailscale", "ngrok", "custom"
[secrets]
encrypt = true # API keys encrypted with local key file
[browser]
enabled = false # opt-in browser_open + browser tools
allowed_domains = ["docs.rs"] # required when browser is enabled
backend = "agent_browser" # "agent_browser" (default), "rust_native", "computer_use", "auto"
native_headless = true # applies when backend uses rust-native
native_webdriver_url = "http://127.0.0.1:9515" # WebDriver endpoint (chromedriver/selenium)
# native_chrome_path = "/usr/bin/chromium" # optional explicit browser binary for driver
[browser.computer_use]
endpoint = "http://127.0.0.1:8787/v1/actions" # computer-use sidecar HTTP endpoint
timeout_ms = 15000 # per-action timeout
allow_remote_endpoint = false # secure default: only private/localhost endpoint
window_allowlist = [] # optional window title/process allowlist hints
# api_key = "..." # optional bearer token for sidecar
# max_coordinate_x = 3840 # optional coordinate guardrail
# max_coordinate_y = 2160 # optional coordinate guardrail
# Rust-native backend build flag:
# cargo build --release --features browser-native
# Ensure a WebDriver server is running, e.g. chromedriver --port=9515
# Computer-use sidecar contract (MVP)
# POST browser.computer_use.endpoint
# Request: {
# "action": "mouse_click",
# "params": {"x": 640, "y": 360, "button": "left"},
# "policy": {"allowed_domains": [...], "window_allowlist": [...], "max_coordinate_x": 3840, "max_coordinate_y": 2160},
# "metadata": {"session_name": "...", "source": "zeroclaw.browser", "version": "..."}
# }
# Response: {"success": true, "data": {...}} or {"success": false, "error": "..."}
[composio]
enabled = false # opt-in: 1000+ OAuth apps via composio.dev
# api_key = "cmp_..." # optional: stored encrypted when [secrets].encrypt = true
entity_id = "default" # default user_id for Composio tool calls
[identity]
format = "openclaw" # "openclaw" (default, markdown files) or "aieos" (JSON)
# aieos_path = "identity.json" # path to AIEOS JSON file (relative to workspace or absolute)
# aieos_inline = '{"identity":{"names":{"first":"Nova"}}}' # inline AIEOS JSON
ZeroClaw uses one provider key (ollama) for both local and remote Ollama deployments:
- Local Ollama: keep api_url unset, run ollama serve, and use models like llama3.2.
- Remote Ollama endpoint (including Ollama Cloud): set api_url to the remote endpoint and set api_key (or OLLAMA_API_KEY) when required.
- Optional :cloud suffix: model IDs like qwen3:cloud are normalized to qwen3 before the request.
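The :cloud suffix handling is a simple string normalization; a hypothetical sketch:

```python
def normalize_ollama_model(model: str) -> str:
    """Strip the optional ':cloud' suffix (e.g. 'qwen3:cloud' -> 'qwen3')
    before sending the request; other model IDs pass through unchanged."""
    suffix = ":cloud"
    return model[: -len(suffix)] if model.endswith(suffix) else model
```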
Example remote configuration:
default_provider = "ollama"
default_model = "qwen3:cloud"
api_url = "https://ollama.com"
api_key = "ollama_api_key_here"
For detailed configuration of custom OpenAI-compatible and Anthropic-compatible endpoints, see docs/custom-providers.md.
For LLM providers with inconsistent native tool calling (e.g., GLM-5/Zhipu), ZeroClaw ships a Python companion package with LangGraph-based tool calling for guaranteed consistency:
pip install zeroclaw-tools
from zeroclaw_tools import create_agent, shell, file_read
from langchain_core.messages import HumanMessage
# Works with any OpenAI-compatible provider
agent = create_agent(
tools=[shell, file_read],
model="glm-5",
api_key="your-key",
base_url="https://api.z.ai/api/coding/paas/v4"
)
result = await agent.ainvoke({
"messages": [HumanMessage(content="List files in /tmp")]
})
print(result["messages"][-1].content)
Why use it:
- Consistent tool calling across all providers (even those with poor native support)
- Automatic tool loop – keeps calling tools until the task is complete
- Easy extensibility – add custom tools with the @tool decorator
- Discord bot integration included (Telegram planned)
See python/README.md for full documentation.
ZeroClaw supports identity-agnostic AI personas through two formats:
Traditional markdown files in your workspace:
- IDENTITY.md – who the agent is
- SOUL.md – core personality and values
- USER.md – who the agent is helping
- AGENTS.md – behavior guidelines
AIEOS is a standardization framework for portable AI identity. ZeroClaw supports AIEOS v1.1 JSON payloads, allowing you to:
- Import identities from the AIEOS ecosystem
- Export identities to other AIEOS-compatible systems
- Maintain behavioral integrity across different AI models
[identity]
format = "aieos"
aieos_path = "identity.json" # relative to workspace or absolute path
Or inline JSON:
[identity]
format = "aieos"
aieos_inline = '''
{
"identity": {
"names": { "first": "Nova", "nickname": "N" }
},
"psychology": {
"neural_matrix": { "creativity": 0.9, "logic": 0.8 },
"traits": { "mbti": "ENTP" },
"moral_compass": { "alignment": "Chaotic Good" }
},
"linguistics": {
"text_style": { "formality_level": 0.2, "slang_usage": true }
},
"motivations": {
"core_drive": "Push boundaries and explore possibilities"
}
}
'''
| Section | Description |
|---|---|
| identity | Names, bio, origin, residence |
| psychology | Neural matrix (cognitive weights), MBTI, OCEAN, moral compass |
| linguistics | Text style, formality, catchphrases, forbidden words |
| motivations | Core drive, short/long-term goals, fears |
| capabilities | Skills and tools the agent can access |
| physicality | Visual descriptors for image generation |
| history | Origin story, education, occupation |
| interests | Hobbies, favorites, lifestyle |
See aieos.org for the full schema and live examples.
| Endpoint | Method | Auth | Description |
|---|---|---|---|
| /health | GET | None | Health check (always public, no secrets leaked) |
| /pair | POST | X-Pairing-Code header | Exchange one-time code for bearer token |
| /webhook | POST | Authorization: Bearer <token> | Send message: {"message": "your prompt"} |
| /whatsapp | GET | Query params | Meta webhook verification (hub.mode, hub.verify_token, hub.challenge) |
| /whatsapp | POST | None (Meta signature) | WhatsApp incoming message webhook |
| Command | Description |
|---|---|
| onboard | Quick setup (default) |
| onboard --interactive | Full interactive 7-step wizard |
| onboard --channels-only | Reconfigure channels/allowlists only (fast repair flow) |
| agent -m "..." | Single message mode |
| agent | Interactive chat mode |
| gateway | Start webhook server (default: 127.0.0.1:8080) |
| gateway --port 0 | Random port mode |
| daemon | Start long-running autonomous runtime |
| service install/start/stop/status/uninstall | Manage user-level background service |
| doctor | Diagnose daemon/scheduler/channel freshness |
| status | Show full system status |
| channel doctor | Run health checks for configured channels |
| channel bind-telegram <IDENTITY> | Add one Telegram username/user ID to allowlist |
| integrations info <name> | Show setup/status details for one integration |
cargo build # Dev build
cargo build --release # Release build (codegen-units=1, works on all devices including Raspberry Pi)
cargo build --profile release-fast # Faster build (codegen-units=8, requires 16GB+ RAM)
cargo test # 1,017 tests
cargo clippy # Lint (0 warnings)
cargo fmt # Format
# Run the SQLite vs Markdown benchmark
cargo test --test memory_comparison -- --nocapture
A git hook runs cargo fmt --check, cargo clippy -- -D warnings, and cargo test before every push. Enable it once:
git config core.hooksPath .githooks
If you see an openssl-sys build error, sync dependencies and rebuild with the repository lockfile:
git pull
cargo build --release --locked
cargo install --path . --force --locked
ZeroClaw is configured to use rustls for HTTP/TLS dependencies; --locked keeps the transitive graph deterministic on fresh environments.
To skip the hook when you need a quick push during development:
git push --no-verify
For high-throughput collaboration and consistent reviews:
- Contribution guide: CONTRIBUTING.md
- PR workflow policy: docs/pr-workflow.md
- Reviewer playbook (triage + deep review): docs/reviewer-playbook.md
- CI ownership and triage map: docs/ci-map.md
- Security disclosure policy: SECURITY.md
If ZeroClaw helps your work and you want to support ongoing development, you can donate here:
A heartfelt thank you to the communities and institutions that inspire and fuel this open-source work:
- Harvard University β for fostering intellectual curiosity and pushing the boundaries of what's possible.
- MIT β for championing open knowledge, open source, and the belief that technology should be accessible to everyone.
- Sundai Club β for the community, the energy, and the relentless drive to build things that matter.
- The World & Beyond 🌍✨ – to every contributor, dreamer, and builder out there making open source a force for good. This is for you.
We're building in the open because the best ideas come from everywhere. If you're reading this, you're part of it. Welcome. 🦀❤️
MIT β see LICENSE and NOTICE for contributor attribution
See CONTRIBUTING.md. Implement a trait, submit a PR:
- CI workflow guide: docs/ci-map.md
- New Provider → src/providers/
- New Channel → src/channels/
- New Observer → src/observability/
- New Tool → src/tools/
- New Memory → src/memory/
- New Tunnel → src/tunnel/
- New Skill → ~/.zeroclaw/workspace/skills/<name>/
ZeroClaw – Zero overhead. Zero compromise. Deploy anywhere. Swap anything. 🦀

