@GuitaristForEver

Summary

Adds native base_url and custom_headers inputs to eliminate the need for env: block scripting when integrating with LLM Gateways and API management solutions.

Problem

Users in discussions #272, #575, and #704 are struggling to integrate with:

  • LLM Gateways (Anthropic's recommended enterprise pattern)
  • API proxies (Portkey, LiteLLM)
  • API management (Azure APIM, AWS API Gateway)

The current workaround relies on undiscoverable env: blocks, and users don't know the JSON format expected for the headers value.
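For comparison, the existing workaround looks roughly like the sketch below. The environment variable names (ANTHROPIC_BASE_URL, ANTHROPIC_CUSTOM_HEADERS) and the JSON value shape are assumptions drawn from the linked discussions, not something this PR defines:

- uses: anthropics/claude-code-action@v1
  with:
    anthropic_api_key: ${{ secrets.API_KEY }}
  env:
    # Assumed env var names; the whole point of this PR is that these are hard to discover
    ANTHROPIC_BASE_URL: "https://your-gateway.example.com"
    ANTHROPIC_CUSTOM_HEADERS: '{"x-gateway-auth": "${{ secrets.TOKEN }}"}'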

Solution

- uses: anthropics/claude-code-action@v1
  with:
    anthropic_api_key: ${{ secrets.API_KEY }}
    base_url: "https://your-gateway.example.com"
    custom_headers: '{"x-gateway-auth": "${{ secrets.TOKEN }}"}'

Changes

  • action.yml - Add base_url and custom_headers inputs
  • base-action/action.yml - Mirror inputs for standalone usage
  • docs/configuration.md - Add LLM Gateway section with examples
  • docs/cloud-providers.md - Reference new configuration
  • base-action/README.md - Update inputs table
  • base-action/test/validate-env.test.ts - Add test coverage (31 tests pass)

Test Plan

  • YAML syntax verified
  • All new tests pass (31 tests in validate-env.test.ts)
  • CI checks will run on PR (typecheck, format, full test suite)
  • Manual test with Portkey integration
  • Manual test with custom base URL
  • Verify backwards compatibility with env vars

Closes #840


Note to reviewers: This unblocks the users in #704 and #575 and addresses the pattern requested in #272.

Commits

Adds native support for LLM Gateway integration and custom HTTP headers
without requiring env block configuration. Inputs take precedence but
fall back to environment variables for backwards compatibility.

Closes anthropics#840
Related: anthropics#272, anthropics#575, anthropics#704

Mirrors the root action's inputs for standalone base-action usage.
Falls back to env vars for backwards compatibility.

Documents new base_url and custom_headers inputs with examples for
Portkey, LiteLLM, Azure APIM, and other gateway integrations.
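As one illustration of the documented pattern, a Portkey-style configuration might look like the sketch below; the base URL and x-portkey-* header names are assumptions based on Portkey's public gateway conventions, not values taken from this PR:

- uses: anthropics/claude-code-action@v1
  with:
    anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
    # Placeholder gateway endpoint and headers; check the gateway's own docs for exact values
    base_url: "https://api.portkey.ai/v1"
    custom_headers: '{"x-portkey-api-key": "${{ secrets.PORTKEY_API_KEY }}", "x-portkey-virtual-key": "${{ secrets.PORTKEY_VIRTUAL_KEY }}"}'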

When use_bedrock is enabled and base_url is provided, the action now
correctly sets ANTHROPIC_BEDROCK_BASE_URL from inputs.base_url instead
of only setting ANTHROPIC_BASE_URL.

This enables using Bedrock mode with API Management gateways that:
- Route Bedrock-format requests (/bedrock/model/{model}/invoke)
- Use custom HTTP headers for authentication (not AWS credentials)
- Require custom base URLs (e.g., Azure APIM endpoints)

Changes:
- Set ANTHROPIC_BEDROCK_BASE_URL from inputs.base_url when use_bedrock=true
- Skip AWS credential validation when using custom Bedrock endpoints
- Add tests for custom Bedrock endpoint scenarios

Fixes the issue where Bedrock mode ignored inputs.base_url and failed
to send custom headers to API Management gateways.
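A minimal sketch of the configuration this fix targets, assuming an Azure APIM front end; the hostname, route, and subscription-key header are placeholders rather than values from this PR:

- uses: anthropics/claude-code-action@v1
  with:
    use_bedrock: "true"
    # Placeholder APIM endpoint that routes Bedrock-format requests
    base_url: "https://your-apim.azure-api.net/bedrock"
    # Gateway authentication via custom headers instead of AWS credentials
    custom_headers: '{"Ocp-Apim-Subscription-Key": "${{ secrets.APIM_SUBSCRIPTION_KEY }}"}'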

Enables using Bedrock API format with custom HTTP headers by creating
a local translation proxy that:
1. Receives Anthropic API format from Claude SDK (HTTP requests)
2. Translates to Bedrock API format
3. Forwards to APIM/gateway with custom headers
4. Translates responses back to Anthropic format

This allows:
- use_bedrock: true (Bedrock API format)
- base_url: custom endpoint (APIM gateway)
- custom_headers: authentication and routing headers

Without requiring AWS SDK or AWS credentials.

How it works:
- When use_bedrock + base_url + custom_headers are all present
- Action starts local HTTP proxy on port 8765
- SDK sends to localhost:8765 in Anthropic format
- Proxy translates and forwards to APIM in Bedrock format
- APIM routes to AWS Bedrock backend

This keeps APIM configuration unchanged while supporting custom headers.
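Put together, a workflow that exercises the proxy path would look roughly like this sketch; the gateway hostname and header name are placeholders, and the comments trace the flow described above:

- uses: anthropics/claude-code-action@v1
  with:
    # With all three inputs present, the action starts the local translation proxy on port 8765;
    # the SDK sends Anthropic-format requests to localhost:8765, and the proxy forwards
    # Bedrock-format requests (with the custom headers attached) to the gateway below.
    use_bedrock: "true"
    base_url: "https://your-apim.azure-api.net/bedrock"
    custom_headers: '{"x-gateway-auth": "${{ secrets.GATEWAY_TOKEN }}"}'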

Tests: All 143 tests passing, including 34 validation tests.

Added comprehensive documentation for the Bedrock HTTP proxy feature:

- docs/configuration.md: Detailed explanation with benefits and workflow
- base-action/README.md: Example usage with API Management gateways

Documents how the proxy enables:
- Bedrock API format with custom HTTP headers
- API Management gateway integration (Azure APIM, AWS API Gateway)
- Authentication via custom headers instead of AWS credentials
- Automatic translation between Anthropic and Bedrock API formats

The Anthropic API expects model names like 'claude-sonnet-4-20250514' without
the 'anthropic.' prefix. The prefix is only used in Bedrock model IDs when
making requests, not in response model names. This was causing 'No assistant
message found' errors because the SDK couldn't recognize the model name.