A lightweight Go library for interacting with various LLM providers with a simple, unified API.
- OpenAI
- Anthropic
- Google (Gemini)
- OpenRouter (via OpenAI-compatible API)
- xAI (Grok)
- Voyage AI (embeddings & reranking)
```shell
go get github.com/mkozhukh/echo
```

The `NewCommonClient` function creates a client that auto-configures providers from API keys:
```go
package main

import (
	"context"
	"fmt"

	"github.com/mkozhukh/echo"
)

func main() {
	ctx := context.Background()

	// Create client with explicit API keys
	client, err := echo.NewCommonClient(map[string]string{
		"openai": "your-openai-key",
	}, echo.WithModel("openai/gpt-5"))
	if err != nil {
		panic(err)
	}

	// Simple call using QuickMessage helper
	resp, err := client.Complete(ctx, echo.QuickMessage("Hello, how are you?"))
	if err != nil {
		panic(err)
	}

	fmt.Println(resp.Text)
}
```

Pass `nil` as keys to auto-detect API keys from environment variables:
```go
client, err := echo.NewCommonClient(nil, echo.WithModel("openai/gpt-5"))
```

If you only need a single provider, use the dedicated constructors:
```go
// OpenAI
client := echo.NewOpenAIClient("your-api-key", "gpt-5")

// Anthropic
client := echo.NewAnthropicClient("your-api-key", "claude-sonnet-4-5")

// Google
client := echo.NewGoogleClient("your-api-key", "gemini-2.5-pro")

// xAI (Grok)
client := echo.NewXAIClient("your-api-key", "grok-4-0709")

// Voyage AI (embeddings & reranking)
client := echo.NewVoyageClient("your-api-key", "voyage-4-large")
```

These constructors accept the same `CallOption` options as `NewCommonClient`:
```go
client := echo.NewOpenAIClient("your-api-key", "gpt-5",
	echo.WithSystemMessage("You are a helpful assistant."),
	echo.WithTemperature(0.7),
)
```

Use convenient aliases instead of full model names:
```go
// Quality tiers available for each provider:
// - best: Highest quality model
// - balanced: Good balance of quality and speed
// - light: Fast and economical

client, _ := echo.NewCommonClient(nil, echo.WithModel("openai/best"))        // Uses gpt-5.2
client, _ := echo.NewCommonClient(nil, echo.WithModel("anthropic/balanced")) // Uses claude-opus-4-5
client, _ := echo.NewCommonClient(nil, echo.WithModel("google/light"))       // Uses gemini-2.5-flash
client, _ := echo.NewCommonClient(nil, echo.WithModel("xai/best"))           // Uses grok-4-0709
```

The library supports flexible environment variable configuration:
```go
// Set default model and API key
os.Setenv("ECHO_MODEL", "anthropic/balanced")
os.Setenv("ECHO_KEY", "your-api-key")

// Create client without parameters - uses env vars
client, _ := echo.NewCommonClient(nil)

// Or use provider-specific API keys
os.Setenv("OPENAI_API_KEY", "your-openai-key")
os.Setenv("ANTHROPIC_API_KEY", "your-anthropic-key")
os.Setenv("GOOGLE_API_KEY", "your-google-key")
os.Setenv("XAI_API_KEY", "your-xai-key")

// The API key is automatically selected based on the provider
client, _ := echo.NewCommonClient(nil, echo.WithModel("openai/gpt-5"))
```

The library supports three ways to create message chains for conversations.
For basic single-message prompts:

```go
resp, _ := client.Complete(ctx, echo.QuickMessage("Tell me a joke"))
```

For readable multi-turn conversations, use a text template:
```go
messages := echo.TemplateMessage(`
@system:
You are a helpful math tutor.
@user:
What is 2+2?
@agent:
2+2 equals 4.
@user:
Can you explain why?
`)

resp, err := client.Complete(ctx, messages)
```

Template format:
- `@role:` markers separate messages (`system`, `user`, `agent`)
- Content follows until the next marker or the end of the template
- Content can be on the same line: `@user: Hello there!`
- Multiline content is supported
- Whitespace is automatically trimmed
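The template rules above can be sketched as a small stand-alone parser. This is an illustration written against the stated rules only, not the library's actual implementation; the `Message` struct and `parseTemplate` helper are hypothetical names.

```go
package main

import (
	"fmt"
	"strings"
)

// Message mirrors the shape of a chat message, for illustration only.
type Message struct {
	Role    string
	Content string
}

// parseTemplate splits a template on @role: markers and trims whitespace,
// following the rules described above. Unknown markers are treated as content.
func parseTemplate(tpl string) []Message {
	var msgs []Message
	var cur *Message
	var buf []string

	flush := func() {
		if cur != nil {
			cur.Content = strings.TrimSpace(strings.Join(buf, "\n"))
			msgs = append(msgs, *cur)
		}
		buf = nil
	}

	for _, line := range strings.Split(tpl, "\n") {
		trimmed := strings.TrimSpace(line)
		matched := false
		for _, role := range []string{"system", "user", "agent"} {
			marker := "@" + role + ":"
			if strings.HasPrefix(trimmed, marker) {
				flush()
				cur = &Message{Role: role}
				// Content may start on the same line as the marker
				if rest := strings.TrimSpace(strings.TrimPrefix(trimmed, marker)); rest != "" {
					buf = append(buf, rest)
				}
				matched = true
				break
			}
		}
		if !matched && cur != nil {
			buf = append(buf, line)
		}
	}
	flush()
	return msgs
}

func main() {
	for _, m := range parseTemplate("@system:\nYou are a tutor.\n@user: What is 2+2?") {
		fmt.Printf("[%s] %s\n", m.Role, m.Content)
	}
}
```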
For programmatic message building:
```go
messages := []echo.Message{
	{Role: echo.System, Content: "You are a helpful assistant."},
	{Role: echo.User, Content: "Hello"},
	{Role: echo.Agent, Content: "Hi! How can I help you today?"},
	{Role: echo.User, Content: "What's the weather like?"},
}

resp, err := client.Complete(ctx, messages)
```

Available roles:

- `echo.System` - System instructions (must be first if present; only one allowed)
- `echo.User` - User messages
- `echo.Agent` - Assistant/model messages (maps to "assistant" for OpenAI/Anthropic and "model" for Gemini)
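The `echo.Agent` mapping noted above can be sketched as a lookup. This is illustrative only; the `wireRole` helper is a hypothetical name, and the actual wire-format conversion happens inside the library.

```go
package main

import "fmt"

// wireRole converts an echo-style role to a provider's wire format:
// "agent" becomes "assistant" for OpenAI/Anthropic and "model" for Gemini.
func wireRole(role, provider string) string {
	if role == "agent" {
		if provider == "google" {
			return "model"
		}
		return "assistant"
	}
	return role
}

func main() {
	fmt.Println(wireRole("agent", "openai")) // assistant
	fmt.Println(wireRole("agent", "google")) // model
	fmt.Println(wireRole("user", "anthropic")) // user
}
```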
Options can be set as defaults at client creation time and overridden per call:

```go
// Set defaults at client creation time
client, _ := echo.NewCommonClient(nil,
	echo.WithModel("google/best"),
	echo.WithSystemMessage("You are a creative assistant."),
	echo.WithTemperature(0.8),
)

// Use client defaults
resp, _ := client.Complete(ctx, echo.QuickMessage("Tell me a joke"))

// Override defaults for specific calls
resp, _ = client.Complete(ctx, echo.QuickMessage("Write a formal email"),
	echo.WithTemperature(0.2), // More deterministic
)
```

The library supports switching providers on a per-call basis using `WithModel`:
```go
// Create a client with a default provider
client, _ := echo.NewCommonClient(nil, echo.WithModel("openai/gpt-4"))

// Use different providers for different calls
resp1, _ := client.Complete(ctx, echo.QuickMessage("Analyze this text"),
	echo.WithModel("anthropic/claude-3.5-sonnet"), // Use Anthropic for analysis
)

resp2, _ := client.Complete(ctx, echo.QuickMessage("Generate an image description"),
	echo.WithModel("google/gemini-2.5-pro"), // Use Google for creative tasks
)

resp3, _ := client.Complete(ctx, echo.QuickMessage("Quick calculation"),
	echo.WithModel("openai/gpt-5-mini"), // Use a lighter model for simple tasks
)
```

Options can be combined on a single call:

```go
resp, err := client.Complete(ctx, echo.QuickMessage("Write a story"),
	echo.WithTemperature(0.7),
	echo.WithMaxTokens(100),
	echo.WithSystemMessage("You are a creative writer."),
)
```

Available call options:

- `WithModel(string)` - Override the model for this call
- `WithTemperature(float32)` - Control randomness (0.0 - 1.0)
- `WithMaxTokens(int)` - Limit response length
- `WithSystemMessage(string)` - Set or override the system prompt (overrides any system message in the message chain)
- `WithBaseURL(string)` - Override the API base URL (useful for custom endpoints)
- `WithEndPoint(string)` - Specify endpoint routing (primarily for OpenRouter provider selection)
- `WithStoreData(bool)` - Control server-side storage (xAI only; defaults to false for privacy)
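Model strings passed to `WithModel` follow a `provider/model` convention, optionally with an `@providers` routing suffix for OpenRouter. As a sketch of how such a string splits apart (`splitModel` is a hypothetical helper for illustration; the library does its own parsing internally):

```go
package main

import (
	"fmt"
	"strings"
)

// splitModel splits a "provider/model" string as used by WithModel.
// An optional "@provider1,provider2" routing suffix (used by OpenRouter)
// is separated out as well.
func splitModel(s string) (provider, model, routing string) {
	provider, model, _ = strings.Cut(s, "/")
	model, routing, _ = strings.Cut(model, "@")
	return provider, model, routing
}

func main() {
	p, m, r := splitModel("openrouter/claude-3.5-sonnet@aws")
	fmt.Println(p, m, r) // openrouter claude-3.5-sonnet aws

	p, m, r = splitModel("openai/gpt-5")
	fmt.Println(p, m, r) // openai gpt-5
}
```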
For real-time streaming of responses, use the StreamComplete method:
```go
streamResp, err := client.StreamComplete(ctx, echo.QuickMessage("Write a short story"))
if err != nil {
	panic(err)
}
```

- `Complete`: Returns the complete response after generation finishes
- `StreamComplete`: Returns chunks as they're generated, for real-time display
Both methods support the same options.
The "mock" provider can be used in tests; it returns a combined string of all incoming messages:
```go
client, _ := echo.NewCommonClient(nil, echo.WithModel("mock/any"))

mockResp, err := client.Complete(ctx, echo.QuickMessage("test"))
if err != nil {
	panic(err)
}

// outputs: `[user]: test`
```

OpenRouter provides access to multiple LLM providers through a single API:
```go
// Basic usage with any OpenRouter model
client, _ := echo.NewCommonClient(map[string]string{
	"openrouter": "your-openrouter-key",
}, echo.WithModel("openrouter/claude-3.5-sonnet"))
```

You can specify which underlying provider infrastructure to use:
```go
// Specify provider routing with @ syntax in the model name
client, _ := echo.NewCommonClient(nil, echo.WithModel("openrouter/claude-3.5-sonnet@aws"))

// Multiple providers for fallback (comma-separated)
client, _ := echo.NewCommonClient(nil, echo.WithModel("openrouter/gpt-4@azure,openai"))
```

xAI provides access to Grok models:
```go
// Basic usage
client, _ := echo.NewCommonClient(map[string]string{
	"xai": "your-xai-key",
}, echo.WithModel("xai/grok-4-0709"))

// Server-side storage is disabled by default for privacy
// To explicitly enable storage:
resp, _ := client.Complete(ctx, echo.QuickMessage("Hello"),
	echo.WithStoreData(true),
)
```

License: MIT