Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
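For example, a minimal sketch of the unified call pattern LiteLLM exposes: the same OpenAI-style `completion()` call works across providers, with only the model string changing. The model names and environment variables below are illustrative; any supported provider's credentials work the same way.

```python
# Minimal sketch, assuming litellm is installed and provider keys are set
# via environment variables (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY).
from litellm import completion

messages = [{"role": "user", "content": "Summarize LiteLLM in one sentence."}]

# Same OpenAI-format call, different providers: only the model string changes.
openai_resp = completion(model="gpt-4o-mini", messages=messages)
claude_resp = completion(model="anthropic/claude-3-haiku-20240307", messages=messages)

# Responses come back in the OpenAI response shape regardless of provider.
print(openai_resp.choices[0].message.content)
print(claude_resp.choices[0].message.content)
```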
A model-driven approach to building AI agents in just a few lines of code.
The most accurate document search and store for building AI apps
Evaluate your LLM's response with Prometheus and GPT4 💯
A set of tools that gives agents powerful capabilities.
Claude Code settings, commands and agents for vibe coding
Agent samples built using the Strands Agents SDK.
A website where you can compare every AI Model ✨
An example agent demonstrating streaming, tool use, and interactivity from your terminal. This agent builder can help you build your own agents and tools.
This MCP server provides documentation about Strands Agents to your GenAI tools, so you can use your favorite AI coding assistant to vibe-code Strands Agents.
Command-line personal assistant using your favorite proprietary or local models, with access to 30+ tools
Documentation for the Strands Agents SDK. A model-driven approach to building AI agents in just a few lines of code.
Customize and extend Claude Code with ccproxy: route to OpenAI, Gemini, Qwen, OpenRouter, and Ollama. Gain full control of your Claude Max/Pro subscription with your own router.
Sales AI agent that talks with your customers, recommends products, books consultations, and processes Stripe payments