A humble attempt at LangGraph in Rust: a high-performance agent orchestration framework.
| Feature | LangGraph (Python) | oxidizedgraph |
|---|---|---|
| Parallelism | Limited by GIL | True multi-core |
| Memory per session | ~50MB | ~5MB |
| Startup time | ~200ms | ~10ms |
| Type safety | Runtime | Compile-time |
| Binary size | Needs Python | ~15MB standalone |
Add the crate to your `Cargo.toml` (the example below also uses `anyhow` for error handling):

```toml
[dependencies]
oxidizedgraph = "0.1"
tokio = { version = "1", features = ["full"] }
async-trait = "0.1"
anyhow = "1"
```

A minimal end-to-end example:

```rust
use oxidizedgraph::prelude::*;
use async_trait::async_trait;

// Define a simple node
struct ProcessNode;

#[async_trait]
impl NodeExecutor for ProcessNode {
    fn id(&self) -> &str { "process" }

    async fn execute(&self, state: SharedState) -> Result<NodeOutput, NodeError> {
        let mut guard = state.write().unwrap();
        guard.set_context("processed", true);
        Ok(NodeOutput::cont())
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Build the graph
    let graph = GraphBuilder::new()
        .add_node(ProcessNode)
        .set_entry_point("process")
        .add_edge_to_end("process")
        .compile()?;

    // Execute
    let runner = GraphRunner::with_defaults(graph);
    let result = runner.invoke(AgentState::new()).await?;

    println!("Processed: {:?}", result.get_context::<bool>("processed"));
    Ok(())
}
```

State flows through the graph between nodes. The built-in `AgentState` provides common fields:

```rust
pub struct AgentState {
    pub messages: Vec<Message>,            // Conversation history
    pub tool_calls: Vec<ToolCall>,         // Pending tool calls
    pub context: HashMap<String, Value>,   // Arbitrary key-value storage
    pub iteration: usize,                  // Current iteration count
    pub is_complete: bool,                 // Completion flag
}
```
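Because `context` is an arbitrary key-value map, nodes can stash intermediate results and read them back later. A minimal sketch, assuming `set_context`/`get_context` behave as in the quick-start example above and that `get_context` returns an `Option` of the requested type (the `"retries"` key and `bump_retries` helper are made up for illustration):

```rust
use oxidizedgraph::prelude::*;

// Increment a retry counter stored in context, and flag the run as
// complete once three attempts have been made.
fn bump_retries(state: &mut AgentState) {
    let retries = state.get_context::<u64>("retries").unwrap_or(0) + 1;
    state.set_context("retries", retries);
    state.is_complete = retries >= 3;
}
```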
Nodes implement `NodeExecutor` and transform state:

```rust
#[async_trait]
impl NodeExecutor for MyNode {
    fn id(&self) -> &str { "my_node" }

    async fn execute(&self, state: SharedState) -> Result<NodeOutput, NodeError> {
        // Access state
        let mut guard = state.write().unwrap();
        // Do work...

        // Return the next action
        Ok(NodeOutput::cont())                               // Continue to the next node via edges
        // or Ok(NodeOutput::finish())                       // End execution
        // or Ok(NodeOutput::continue_to("specific_node"))   // Route to a specific node
    }
}
```
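For instance, a node can choose its successor at runtime by returning `continue_to`. A sketch of such a routing node, reusing only the trait surface shown above (the `RouterNode` name and the `"agent"` target are hypothetical, and the write guard is assumed to dereference to `AgentState` as in the quick-start example):

```rust
use async_trait::async_trait;
use oxidizedgraph::prelude::*;

// Hypothetical node that loops back to an "agent" node until the state
// is marked complete, then ends the run.
struct RouterNode;

#[async_trait]
impl NodeExecutor for RouterNode {
    fn id(&self) -> &str { "router" }

    async fn execute(&self, state: SharedState) -> Result<NodeOutput, NodeError> {
        let done = state.write().unwrap().is_complete;
        if done {
            Ok(NodeOutput::finish())
        } else {
            Ok(NodeOutput::continue_to("agent"))
        }
    }
}
```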
Edges connect nodes. They can be direct or conditional:

```rust
GraphBuilder::new()
    // Direct edge
    .add_edge("node_a", "node_b")
    // Edge to END
    .add_edge_to_end("node_b")
    // Conditional edge
    .add_conditional_edge("agent", |state| {
        if state.is_complete {
            transitions::END.to_string()
        } else {
            "continue".to_string()
        }
    })
```

The following nodes are built in:

- `EchoNode` - Stores a message in context
- `DelayNode` - Adds a configurable delay
- `StaticTransitionNode` - Always routes to a fixed target
- `ContextRouterNode` - Routes based on context values
- `ConditionalNode` - Routes based on a predicate
- `FunctionNode` - Create nodes from closures
- `LLMNode` - Call LLM providers
- `ToolNode` - Execute pending tool calls
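Conditional edges are what make agent loops possible. Below is a sketch of a ReAct-style loop built from the builder methods shown above; `agent_node` and `tools_node` are placeholders for your own `NodeExecutor` implementations with ids `"agent"` and `"tools"`, and the closure is assumed to return the id of the next node (or `transitions::END`):

```rust
// "agent" runs first; if it left tool calls pending, route to "tools",
// which loops back to "agent". Otherwise the run ends.
let graph = GraphBuilder::new()
    .add_node(agent_node)
    .add_node(tools_node)
    .set_entry_point("agent")
    .add_conditional_edge("agent", |state| {
        if state.tool_calls.is_empty() {
            transitions::END.to_string()
        } else {
            "tools".to_string()
        }
    })
    .add_edge("tools", "agent")
    .compile()?;
```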
Execute your graph with configurable options:
```rust
let runner = GraphRunner::new(
    graph,
    RunnerConfig::default()
        .max_iterations(100)
        .verbose(true)
        .tag("my-workflow"),
);

let result = runner.invoke(initial_state).await?;
```
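The value returned by `invoke` is the final state (as in the quick-start example, which reads context off `result`), so the `AgentState` fields shown earlier should be available on it:

```rust
// Continuing the snippet above: inspect the finished run.
println!("complete: {}", result.is_complete);
println!("iterations used: {}", result.iteration);
println!("messages in history: {}", result.messages.len());
```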
Run the included examples:

```bash
# Simple linear workflow
cargo run --example simple_workflow

# ReAct agent pattern
cargo run --example react_agent
```

Roadmap:

- Core graph primitives
- State management (AgentState)
- NodeExecutor trait
- Conditional edges
- GraphRunner execution
- Built-in nodes (LLM, Tool, Conditional, Function)
- Checkpointing (SQLite, Postgres)
- LLM integrations (Anthropic, OpenAI)
- Streaming execution
- WASM compilation
- Python bindings (PyO3)
Apache-2.0 License - see LICENSE file.
Contributions welcome! Please open an issue or PR.