An Agent Development Kit (ADK) written in Go for the seamless creation of A2A-compatible agents
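To give a sense of what an A2A-style agent service looks like, here is a minimal, illustrative Go sketch: it serves a small "agent card" for discovery and a single HTTP endpoint that echoes incoming messages. The struct fields, the /.well-known/agent.json path, and the request/response shapes are simplified assumptions for illustration only, not the ADK's actual API or the full A2A specification.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// agentCard is a simplified stand-in for an A2A agent card (assumed shape).
type agentCard struct {
	Name        string `json:"name"`
	Description string `json:"description"`
	URL         string `json:"url"`
}

// message is a simplified request/response payload for illustration only.
type message struct {
	Role string `json:"role"`
	Text string `json:"text"`
}

func main() {
	card := agentCard{
		Name:        "echo-agent",
		Description: "Illustrative agent that echoes messages",
		URL:         "http://localhost:8080",
	}

	// Serve the agent card so other agents can discover this one.
	http.HandleFunc("/.well-known/agent.json", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(card)
	})

	// Handle incoming messages; a real agent would run its own logic here.
	http.HandleFunc("/messages", func(w http.ResponseWriter, r *http.Request) {
		var in message
		if err := json.NewDecoder(r.Body).Decode(&in); err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		out := message{Role: "agent", Text: "echo: " + in.Text}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(out)
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```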
An A2A agent server enabling Google Calendar scheduling, retrieval, and automation
A command-line tool to scaffold and manage enterprise-ready AI Agents powered by the A2A (Agent-to-Agent) protocol
An SDK written in Rust for the Inference Gateway
The UI for the inference-gateway, providing a user-friendly interface to interact with the gateway, visualize inference results, and manage models
An Agent Development Kit (ADK) written in Rust for the seamless creation of A2A-compatible agents
An SDK written in Typescript for the Inference Gateway
A powerful command-line interface for managing and interacting with the Inference Gateway. This CLI provides tools for configuration, monitoring, and management of inference services
This project provides a Kubernetes Operator for managing the lifecycle of the inference-gateway and its related components. It simplifies deployment, configuration, and scaling of the gateway within Kubernetes clusters, enabling seamless integration of inference workflows.
Extensive documentation of the inference-gateway
An Agent Development Kit (ADK) written in TypeScript for the seamless creation of A2A-compatible agents
An SDK written in Python for the Inference Gateway
An open-source, high-performance gateway unifying multiple LLM providers, from local solutions like Ollama to major cloud providers such as OpenAI, Groq, Cohere, Anthropic, Cloudflare, and DeepSeek.
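As a rough illustration of how a client might talk to the gateway, here is a minimal Go sketch that posts a chat-completion request and prints the reply. It assumes the gateway is running locally and exposes an OpenAI-compatible chat-completions endpoint; the URL, port, model identifier, and response shape are assumptions for illustration, so consult the inference-gateway documentation for the actual configuration.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Hypothetical request body in the common chat-completions format.
	body, _ := json.Marshal(map[string]any{
		"model": "ollama/llama3", // assumed provider/model identifier
		"messages": []map[string]string{
			{"role": "user", "content": "Hello from the gateway client"},
		},
	})

	// Assumed local gateway address and endpoint path.
	resp, err := http.Post("http://localhost:8080/v1/chat/completions",
		"application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Decode just enough of the response to print the first message.
	var out struct {
		Choices []struct {
			Message struct {
				Content string `json:"content"`
			} `json:"message"`
		} `json:"choices"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		log.Fatal(err)
	}
	if len(out.Choices) > 0 {
		fmt.Println(out.Choices[0].Message.Content)
	}
}
```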