
[Bug]: Add Bedrock Support for /v1/messages API #9159

Open
huangyafei opened this issue Mar 12, 2025 · 4 comments
@huangyafei
What happened?

Previously I had been using the /v1/messages endpoint to access Claude models on Bedrock.

Yesterday I upgraded LiteLLM to 1.63.6 and found that the /v1/messages endpoint no longer works. Example request body:

```json
{
    "model": "claude-3-5-haiku-20241022",
    "max_tokens": 1024,
    "messages": [
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "hi"
                }
            ]
        }
    ]
}
```
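For context, this is roughly how that request reaches the proxy (a sketch; the base URL and key below are placeholders, not values from my deployment):

```python
import json
import urllib.request

# Placeholder proxy URL and key -- substitute your own LiteLLM deployment values.
PROXY_BASE = "http://localhost:4000"
API_KEY = "sk-placeholder"

payload = {
    "model": "claude-3-5-haiku-20241022",
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": "hi"}]},
    ],
}

# Build the POST request against the proxy's /v1/messages route.
req = urllib.request.Request(
    f"{PROXY_BASE}/v1/messages",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; on 1.63.6 with a Bedrock-backed
# model, this call returns the 500 error shown below.
```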

Relevant log output

```json
{
    "error": {
        "message": "anthropic_messages() missing 1 required positional argument: 'api_key'",
        "type": "None",
        "param": "None",
        "code": "500"
    }
}
```

Are you an ML Ops Team?

Yes

What LiteLLM version are you on ?

1.63.6

Twitter / LinkedIn details

No response

@huangyafei huangyafei added the bug Something isn't working label Mar 12, 2025
@huangyafei huangyafei changed the title [Bug]: [Bug]: v1/messages endpoint no longer available Mar 12, 2025
@ishaan-jaff ishaan-jaff self-assigned this Mar 12, 2025
@ishaan-jaff
Contributor

I'll be the DRI on this; we recently migrated this endpoint to support thinking.

@ishaan-jaff
Contributor

  • @huangyafei are you calling the Bedrock API or the Anthropic API?
  • Can I see how you defined it in your config.yaml?

@ishaan-jaff ishaan-jaff changed the title [Bug]: v1/messages endpoint no longer available [Bug]: Add Bedrock Support for /v1/messages API Mar 12, 2025
@huangyafei
Author

  • @huangyafei are you calling the Bedrock API or the Anthropic API?
  • Can I see how you defined it in your config.yaml?

Here's my config.yaml file:

```yaml
model_list:
  - model_name: gpt-4o-mini
    litellm_params:
      model: gpt-4o-mini
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gpt-4o
    litellm_params:
      model: gpt-4o
      api_key: os.environ/OPENAI_API_KEY

  - model_name: claude-3-5-haiku-20241022
    litellm_params:
      model: bedrock/anthropic.claude-3-5-haiku-20241022-v1:0
      aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID
      aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
      region_name: os.environ/AWS_REGION
  - model_name: claude-3-5-sonnet-20241022
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0
      aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID
      aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
      region_name: os.environ/AWS_REGION
  - model_name: claude-3-7-sonnet-20250219
    litellm_params:
      model: bedrock/anthropic.claude-3-7-sonnet-20250219-v1:0
      aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID
      aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
      region_name: os.environ/AWS_REGION

  - model_name: gpt-4o-fallback
    litellm_params:
      model: gpt-4o
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  default_fallbacks: ["gpt-4o-fallback"]
  success_callback: ["langfuse"]

router_settings:
  routing_strategy: simple-shuffle
  num_retries: 3
  request_timeout: 10

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
  alerting: ["slack"]
  store_model_in_db: true
  store_prompts_in_spend_logs: true
```

I use the /v1/messages endpoint provided by LiteLLM to access Claude models on Bedrock.

It was working fine before, but after upgrading LiteLLM yesterday, I ran into this problem.

@huangyafei
Author

Additional information:

I tried the following:

  1. Using the /v1/chat/completions endpoint, the bedrock/claude models can be accessed normally.
  2. Replacing the bedrock/claude model in config.yaml with the official Anthropic one, the Claude model can be accessed via the /v1/messages endpoint.

So the problem occurs only when accessing a bedrock/claude model via the /v1/messages endpoint.
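For reference, the workaround in point 2 replaces the Bedrock entry with a direct Anthropic one, roughly like this (a sketch; the exact Anthropic model id and the ANTHROPIC_API_KEY variable name are illustrative, not copied from my deployment):

```yaml
model_list:
  - model_name: claude-3-5-haiku-20241022
    litellm_params:
      # Direct Anthropic API instead of Bedrock -- this path works with /v1/messages
      model: anthropic/claude-3-5-haiku-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
```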
