A fun sandbox to set up and test LangGraph graphs. The sandbox integrates a graph discovery service that surfaces all registered graphs in the graph/ directory. The graph manager compiles discovered graphs and builds an in-memory checkpointer for persisting conversations during runtime. A Textual UI handles graph selection and multi-turn conversations with the selected graph, including thread management.
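For orientation, here is a minimal sketch of what a registered graph in graph/ could look like: a single chat node compiled with LangGraph's MemorySaver as the in-memory checkpointer. The file name, node name, and model are illustrative assumptions, not the sandbox's actual code (the graph manager handles compilation and checkpointing for discovered graphs).

```python
# graph/echo_chat.py -- hypothetical example; file name and structure are assumptions
from typing import Annotated
from typing_extensions import TypedDict

from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    # Conversation history; add_messages appends new messages on each update.
    messages: Annotated[list, add_messages]


llm = ChatOpenAI(model="gpt-4o-mini")


def chat(state: State) -> dict:
    # Call the model with the running conversation and return its reply.
    return {"messages": [llm.invoke(state["messages"])]}


builder = StateGraph(State)
builder.add_node("chat", chat)
builder.add_edge(START, "chat")
builder.add_edge("chat", END)

# The sandbox's graph manager compiles discovered graphs itself; doing it
# manually with an in-memory checkpointer looks like this.
graph = builder.compile(checkpointer=MemorySaver())
```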
Tools can be added via the tools/ directory as well as via the mcp_config file. MCP servers are set up during startup and made available through the mcp_registry. Langfuse is integrated by the graph_manager as a callback for observability. It is also integrated for prompt management; existing graphs currently require a prompt key.
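Custom tools in tools/ are ordinary LangChain tools; the registration mechanism is the sandbox's own, but a minimal sketch of such a tool (hypothetical name and behavior) might be:

```python
# tools/shout.py -- hypothetical example tool
from langchain_core.tools import tool


@tool
def shout(text: str) -> str:
    """Return the input text in upper case."""
    return text.upper()
```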
```bash
uv sync
source .venv/bin/activate
```

Copy the example environment file and configure:
```bash
cp env.example .env
```

Required variables in `.env`:
- `OPENAI_API_KEY` – Your OpenAI API key
- `MCP_WORKING_DIR=./data` – Directory for the MCP filesystem server (defaults to the project root)
Optional (for enhanced capabilities):
- `PERPLEXITY_API_KEY` – Your Perplexity API key for web search capabilities via a custom tool
- `LANGFUSE_PUBLIC_KEY` – Your Langfuse project's public API key
- `LANGFUSE_SECRET_KEY` – Your Langfuse project's secret API key
- `LANGFUSE_HOST=http://localhost:3000` – Langfuse instance URL
- `TWILIO_ACCOUNT_SID` – Twilio account SID
- `TWILIO_AUTH_TOKEN` – Twilio auth token
- `TWILIO_FROM_NUMBER` – Twilio sender number
- `TWILIO_TO_NUMBER` – Your number
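If you want to sanity-check the environment before launching the UI, a small sketch using python-dotenv could look like the following; this is an assumption for illustration, the sandbox does its own loading at startup.

```python
import os

from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()  # read variables from .env in the current directory

# OPENAI_API_KEY is the only hard requirement listed above.
if not os.getenv("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; add it to .env")

# MCP_WORKING_DIR falls back to the project root when unset.
print("MCP working dir:", os.getenv("MCP_WORKING_DIR", "."))
```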
Copy the example mcp_config file and configure:
```bash
cp mcp_config.example.json mcp_config.json
```

The project includes:
- Filesystem Server: Pre-configured and auto-pulls its Docker image when needed. Provides 11 tools for file operations (read, write, edit, search, etc.) that agents can use through natural language.
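How the mcp_registry wires these servers in is internal to the sandbox. As a rough illustration, connecting to an MCP filesystem server from Python via the langchain-mcp-adapters package (an assumption; the sandbox may use a different client, and it runs the Docker image rather than the npx variant shown here) looks roughly like this:

```python
# Rough sketch only: the server name, command, and arguments below are
# assumptions, not the contents of mcp_config.json.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient


async def main() -> None:
    client = MultiServerMCPClient(
        {
            "filesystem": {
                "transport": "stdio",
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-filesystem", "./data"],
            }
        }
    )
    # LangChain tool objects wrapping the MCP server's file-operation tools.
    tools = await client.get_tools()
    print([t.name for t in tools])


asyncio.run(main())
```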
For observability and tracing:
```bash
git clone git@github.com:langfuse/langfuse.git
cd langfuse
docker-compose up
```

Connect to http://localhost:3000 and create a project for API keys.
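Once the keys are in `.env`, the graph_manager attaches the Langfuse callback for you. For reference, constructing the handler by hand (shown with the Langfuse v2 SDK import path, which may differ in other SDK versions) looks roughly like this:

```python
import os

from langfuse.callback import CallbackHandler  # Langfuse SDK v2 import path

handler = CallbackHandler(
    public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
    secret_key=os.environ["LANGFUSE_SECRET_KEY"],
    host=os.getenv("LANGFUSE_HOST", "http://localhost:3000"),
)

# Passed to a compiled graph, it would be used like:
#   graph.invoke(inputs, config={"callbacks": [handler]})
```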
```bash
python main.py
```