A comprehensive, production-ready Rust SDK for the SerpAPI service that provides ergonomic APIs, type safety, and async-first design.
Developed during the Realtime Search AI Hackathon (Hybrid), powered by SerpAPI and organized by AI Tinkerers Paris.
- Type-safe: Strongly typed request builders and response structures
- Async/await: Built on tokio with efficient async I/O
- Ergonomic: Fluent builder APIs for constructing queries
- Resilient: Automatic retry logic with exponential backoff
- Streaming: Support for paginated result streaming
- Production-ready: Comprehensive error handling and logging
- Specialized: Built-in support for images, news, videos, shopping, and local search
Add this to your Cargo.toml:
```toml
[dependencies]
serp-sdk = "0.1"
tokio = { version = "1.0", features = ["full"] }
```

Resources:

- Crates.io - Official package registry
- Docs.rs - Auto-generated API documentation
- GitHub Repository - Source code and examples
To generate local documentation:
```bash
cargo doc --all-features --no-deps --open
```

Quick start:

```rust
use serp_sdk::{SerpClient, SearchQuery};
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
// Initialize client (API key from env var SERP_API_KEY or builder)
let client = SerpClient::builder()
.api_key("your-serp-api-key")
.build()?;
// Build and execute search
let results = client.search(
SearchQuery::new("Rust programming language")
.language("en")
.country("us")
.limit(10)?
).await?;
// Process results
if let Some(organic) = results.organic_results {
for result in organic {
println!("{}: {}", result.title, result.link);
}
}
Ok(())
}
```

Stream paginated results for large queries:

```rust
use futures::StreamExt;
use serp_sdk::{SerpClient, SearchQuery, StreamConfig};
let mut stream = client.search_stream(
SearchQuery::new("rust tutorials"),
StreamConfig::new()
.page_size(20)?
.max_pages(5)
.delay(std::time::Duration::from_millis(500))
);
while let Some(page) = stream.next().await {
match page {
Ok(results) => println!("Got page with {} results",
results.organic_results.as_ref().map_or(0, |r| r.len())),
Err(e) => eprintln!("Error: {}", e),
}
}
```

Image search:

```rust
let images = client.search(
SearchQuery::new("rust logo").images()
).await?;
```

News search:

```rust
let news = client.search(
SearchQuery::new("rust programming").news()
).await?;
```

Video search:

```rust
let videos = client.search(
SearchQuery::new("rust tutorial").videos()
).await?;
```

Shopping search:

```rust
let products = client.search(
SearchQuery::new("rust book").shopping()
).await?;
```

Local search:

```rust
let local = client.search(
SearchQuery::new("rust meetup")
.location("San Francisco, CA")
).await?;
```

Advanced search options:

```rust
let results = client.search(
SearchQuery::new("site:github.com rust web framework")
.language("en")
.country("us")
.device("desktop")
.safe_search("off")
.domain("google.com")
.limit(50)?
.offset(10)
).await?;
```

The SDK provides comprehensive error handling with the SerpError enum:

```rust
use serp_sdk::SerpError;

match client.search(query).await {
Ok(results) => {
// Process results
}
Err(SerpError::RateLimited { retry_after }) => {
println!("Rate limited, retry after {} seconds", retry_after);
}
Err(SerpError::ApiError { code, message }) => {
println!("API error {}: {}", code, message);
}
Err(SerpError::MissingApiKey) => {
println!("Please set SERP_API_KEY environment variable");
}
Err(e) => {
println!("Other error: {}", e);
}
}
```

Configure retry behavior with exponential backoff:

```rust
use serp_sdk::RetryPolicy;
use std::time::Duration;
let client = SerpClient::builder()
.api_key("your-key")
.retry_policy(
RetryPolicy::new(5) // Max 5 retries
.with_base_delay(Duration::from_millis(200))
.with_max_delay(Duration::from_secs(30))
.with_backoff_multiplier(2.0)
)
    .build()?;
```

Environment variables:

- SERP_API_KEY: Your SerpAPI key (if not provided via the builder)
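A minimal sketch of supplying the key from the environment explicitly (the builder call mirrors the Quick Start above; it does not rely on any automatic fallback inside the SDK):

```rust
use serp_sdk::SerpClient;

// Sketch only: read SERP_API_KEY ourselves and hand it to the builder,
// matching the &str-based api_key(...) call used in the examples above.
let api_key = std::env::var("SERP_API_KEY")?;
let client = SerpClient::builder()
    .api_key(api_key.as_str())
    .build()?;
```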
Custom client configuration:

```rust
let client = SerpClient::builder()
.api_key("your-key")
.base_url("https://serpapi.com") // Custom base URL
.timeout(Duration::from_secs(30)) // Request timeout
.user_agent("my-app/1.0") // Custom User-Agent
.default_header("X-Custom", "value")? // Add custom headers
    .build()?;
```

The SDK provides strongly-typed response structures (see the sketch after the list below):
- SearchResults: Complete search response
- OrganicResult: Individual organic search result
- AnswerBox: Featured snippet/answer box
- KnowledgeGraph: Knowledge panel information
- NewsResult: News article result
- VideoResult: Video search result
- ShoppingResult: Shopping/product result
- LocalPlace: Local business result
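A rough sketch of consuming these types (organic_results, title, and link appear in the Quick Start; the answer_box field name and its Debug formatting are assumptions based on the type names listed above):

```rust
// Sketch: assumes `client` and `SearchQuery` from the Quick Start above.
let results = client.search(SearchQuery::new("rust language")).await?;

// `answer_box` is an assumed Option field named after the AnswerBox type.
if let Some(answer_box) = &results.answer_box {
    println!("Answer box: {:?}", answer_box);
}

// organic_results, title, and link are shown in the Quick Start.
if let Some(organic) = &results.organic_results {
    for result in organic {
        println!("{} ({})", result.title, result.link);
    }
}
```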
The repository includes comprehensive examples:
- basic_search.rs: Basic search functionality
- streaming.rs: Streaming and pagination
- specialized_search.rs: Different search types
Run examples with:
```bash
# Set your API key
export SERP_API_KEY="your-serp-api-key"
# Run basic search example
cargo run --example basic_search
# Run streaming example
cargo run --example streaming
# Run specialized search example
cargo run --example specialized_search
```

Run the test suite:

```bash
cargo test
```

Run with logging:

```bash
RUST_LOG=debug cargo test
```

Feature flags:

- streaming: Enable streaming support (enabled by default)
- mcp: Enable MCP (Model Context Protocol) integration
Contributions are welcome! Please feel free to submit a Pull Request.
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
The SDK is designed for high performance with minimal overhead:
- Query Building: ~54ns for simple queries, ~113ns for complex queries (see the benchmark sketch after this list)
- HTTP Client: Built on reqwest with connection pooling and keepalive
- Memory Efficient: Streaming support prevents large result sets from consuming excessive memory
- Zero-Cost Abstractions: Leverages Rust's type system for compile-time guarantees
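The query-building figures above can be checked with a micro-benchmark; the following is a sketch using the criterion crate (assumed here as a dev-dependency, not part of the SDK), exercising the SearchQuery builder shown earlier:

```rust
// benches/query_building.rs (requires criterion as a dev-dependency and a
// [[bench]] entry with harness = false in Cargo.toml).
use criterion::{criterion_group, criterion_main, Criterion};
use serp_sdk::SearchQuery;
use std::hint::black_box;

fn bench_query_building(c: &mut Criterion) {
    // Simple query: a single constructor call.
    c.bench_function("simple_query", |b| {
        b.iter(|| black_box(SearchQuery::new("rust programming language")))
    });

    // Complex query: the builder chain used in the Quick Start.
    c.bench_function("complex_query", |b| {
        b.iter(|| {
            black_box(
                SearchQuery::new("rust programming language")
                    .language("en")
                    .country("us")
                    .limit(10)
                    .expect("limit within the allowed range"),
            )
        })
    });
}

criterion_group!(benches, bench_query_building);
criterion_main!(benches);
```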
This project is supported by our generous sponsors:
- AI Tinkerers Paris - For organizing the hackathon and fostering AI innovation
- SerpAPI - For providing the excellent search API service and sponsoring the hackathon
- The Rust community for exceptional async and HTTP libraries
- All contributors who help improve this project
This SDK was developed by:
- Hamze Ghalebi - Developer
- Reetika Gautam - Developer
- Leon Carlo - Developer
See ROADMAP.md for detailed implementation plans.
This SDK is evolving into a comprehensive AI-powered search infrastructure through three strategic phases:
- Rig Integration (Q1 2026): Transform the SDK into an intelligent search layer for LLM applications, enabling RAG pipelines, semantic search, and AI agent tools.
- PostgreSQL Integration (Q2 2026): Add persistent caching, search analytics, and query optimization with database-backed storage for enterprise-scale deployments.
- MCP Server (Q3 2026): Expose search capabilities to AI assistants through the Model Context Protocol, enabling multi-assistant collaboration.