
fix(ai-proxy): support Anthropic token field names in openai-base driver #13039

Open
iakuf wants to merge 3 commits into apache:master from iakuf:fix/openai-base-support-anthropic-token-fields

Conversation


@iakuf iakuf commented Feb 26, 2026

Summary

When using the openai-compatible provider with Anthropic-format endpoints
(e.g. DeepSeek's /anthropic/v1/messages), the upstream returns
input_tokens/output_tokens instead of prompt_tokens/completion_tokens.

This causes $llm_prompt_tokens and $llm_completion_tokens to always be 0.

Changes

Add fallback support for Anthropic field names in both streaming and
non-streaming paths of openai-base.lua:

  • prompt_tokens = usage.prompt_tokens or usage.input_tokens or 0
  • completion_tokens = usage.completion_tokens or usage.output_tokens or 0

This is fully backward compatible — OpenAI-format responses are unaffected.
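The fallback logic is the same in both code paths. As a rough illustration outside Lua, here is a minimal Python sketch of the same field-name fallback (the `get_tokens` helper is hypothetical and not part of the patch; only the field names come from the two API formats):

```python
def get_tokens(usage: dict) -> tuple[int, int]:
    """Extract token counts from a usage object, accepting both
    OpenAI field names (prompt_tokens/completion_tokens) and
    Anthropic field names (input_tokens/output_tokens)."""
    prompt = usage.get("prompt_tokens") or usage.get("input_tokens") or 0
    completion = usage.get("completion_tokens") or usage.get("output_tokens") or 0
    return prompt, completion

# OpenAI-format response: unaffected by the fallback
print(get_tokens({"prompt_tokens": 12, "completion_tokens": 34}))  # (12, 34)
# Anthropic-format response (e.g. from /anthropic/v1/messages)
print(get_tokens({"input_tokens": 12, "output_tokens": 34}))       # (12, 34)
# No usage fields at all: defaults to 0
print(get_tokens({}))                                              # (0, 0)
```

Note that Lua's `or` and Python's `or` differ slightly (Lua treats `0` as truthy, Python as falsy), but since the fallback value is also `0`, the result is the same in both languages.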

@dosubot dosubot bot added the size:S label (This PR changes 10-29 lines, ignoring generated files.) Feb 26, 2026

iakuf commented Feb 26, 2026

This PR fixes token usage statistics when using the openai-compatible provider
with Anthropic-compatible endpoints (e.g. DeepSeek's /anthropic/v1/messages).
Anthropic format uses input_tokens/output_tokens instead of
prompt_tokens/completion_tokens, causing $llm_prompt_tokens to always be 0.


Labels

discuss, size:S (This PR changes 10-29 lines, ignoring generated files.)


2 participants