Summary
Add support for an `azure_anthropic` backend that enables using Anthropic models (Claude) hosted on Azure AI Foundry.
Motivation
Azure AI Foundry / Azure AI Model Catalog now offers Anthropic models as a managed service. Organizations using Azure often need to route API calls through their Azure endpoints for compliance, billing, and network policy reasons. The existing `anthropic` backend only targets `api.anthropic.com`, and the existing `azure_openai` backend only works with OpenAI models on Azure.
Proposed Implementation
The Anthropic Python SDK already supports a `base_url` parameter, so the implementation is straightforward: a thin wrapper around the existing `AnthropicClient` that reads Azure-specific config (a sketch follows the list below).

New file: `rlm/rlm/clients/azure_anthropic.py`

- Mirrors the structure of `azure_openai.py` and `anthropic.py`
- Reads env vars: `AZURE_ANTHROPIC_API_KEY`, `AZURE_ANTHROPIC_ENDPOINT`, `AZURE_ANTHROPIC_API_VERSION`
- Passes `base_url=azure_endpoint` to `anthropic.Anthropic()`
- Full usage tracking, sync/async support, system prompt extraction
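For concreteness, here is a minimal sketch of the new client, assuming only the stock `anthropic` SDK. The class name, `complete()` method, and `last_usage` attribute are placeholders for whatever interface `azure_openai.py` and `anthropic.py` actually share; the `anthropic.Anthropic(base_url=...)` call itself is the real SDK API:

```python
# Sketch of rlm/rlm/clients/azure_anthropic.py (names are illustrative).
import os

import anthropic


class AzureAnthropicClient:
    def __init__(self, azure_endpoint: str | None = None, api_key: str | None = None):
        # Fall back to the Azure-specific env vars described above.
        api_key = api_key or os.environ["AZURE_ANTHROPIC_API_KEY"]
        azure_endpoint = azure_endpoint or os.environ["AZURE_ANTHROPIC_ENDPOINT"]
        # The stock SDK accepts a base_url override, so pointing it at the
        # Azure AI Foundry endpoint is the core of the change.
        self.client = anthropic.Anthropic(api_key=api_key, base_url=azure_endpoint)
        self.last_usage: tuple[int, int] | None = None

    def complete(self, prompt: str, model: str, system: str | None = None,
                 max_tokens: int = 1024) -> str:
        response = self.client.messages.create(
            model=model,
            max_tokens=max_tokens,
            system=system or anthropic.NOT_GIVEN,
            messages=[{"role": "user", "content": prompt}],
        )
        # Usage tracking: the SDK reports token counts on every response.
        self.last_usage = (response.usage.input_tokens, response.usage.output_tokens)
        return response.content[0].text
```

An `anthropic.AsyncAnthropic` twin of the same shape would cover the async path.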
Changes to existing files (see the sketch after this list):

- `rlm/rlm/clients/__init__.py`: add an `azure_anthropic` branch to `get_client()`
- `rlm/rlm/core/types.py`: add `"azure_anthropic"` to the `ClientBackend` Literal
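The dispatch wiring, again as a hedged sketch: I haven't seen the real `get_client()` signature or the full set of existing `ClientBackend` values, so both are paraphrased here.

```python
# rlm/rlm/core/types.py: add the new name to the backend Literal.
# (Only the backends mentioned in this issue are shown; the real alias
# presumably lists more.)
from typing import Literal

ClientBackend = Literal["anthropic", "azure_openai", "azure_anthropic"]


# rlm/rlm/clients/__init__.py: one extra branch in the factory.
def get_client(backend: ClientBackend, **backend_kwargs):
    if backend == "azure_anthropic":
        from rlm.clients.azure_anthropic import AzureAnthropicClient

        return AzureAnthropicClient(**backend_kwargs)
    ...  # existing branches unchanged
```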
Usage:
```bash
export AZURE_ANTHROPIC_API_KEY="your-key"
export AZURE_ANTHROPIC_ENDPOINT="https://your-resource.services.ai.azure.com/v1"

rlm ask . -q "Summarize this repo" --backend azure_anthropic --model claude-opus-4-6
```

Or via config:
```yaml
backend: azure_anthropic
model: claude-opus-4-6
backend_kwargs:
  azure_endpoint: https://your-resource.services.ai.azure.com/v1
```

Reference Implementation
I have a working implementation at https://github.com/rawwerks/rlm-cli — happy to open a PR if this approach looks good.