fix(ai-proxy): support Anthropic token field names in openai-base driver #13039
Open
iakuf wants to merge 3 commits into apache:master from
Conversation
When using the openai-compatible provider with Anthropic-format endpoints (e.g. DeepSeek's /anthropic/v1/messages), the response returns input_tokens/output_tokens instead of prompt_tokens/completion_tokens. This patch adds fallback support for both field names in both the streaming and non-streaming paths, so token usage statistics work correctly regardless of which format the upstream LLM returns. Fixes token stats being 0 when proxying to Anthropic-compatible endpoints.
Author
This PR fixes token usage statistics when using the openai-compatible provider.
Summary

When using the openai-compatible provider with Anthropic-format endpoints (e.g. DeepSeek's /anthropic/v1/messages), the upstream returns input_tokens/output_tokens instead of prompt_tokens/completion_tokens. This causes $llm_prompt_tokens and $llm_completion_tokens to always be 0.

Changes

Add fallback support for Anthropic field names in both the streaming and non-streaming paths of openai-base.lua:

    prompt_tokens = usage.prompt_tokens or usage.input_tokens or 0
    completion_tokens = usage.completion_tokens or usage.output_tokens or 0

This is fully backward compatible; OpenAI-format responses are unaffected.
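As a minimal sketch of the fallback logic (the helper function name and standalone structure here are illustrative, not the actual openai-base.lua code, which applies these expressions inline), the idea is that Lua's `or` operator returns its first non-nil operand, so each OpenAI field name is tried first and the Anthropic name second:

```lua
-- Hypothetical helper illustrating the field-name fallback.
-- Accepts a usage table in either OpenAI or Anthropic format
-- (or nil, e.g. when the upstream omits usage entirely).
local function extract_usage(usage)
  usage = usage or {}
  return {
    -- Prefer OpenAI names, fall back to Anthropic names, default to 0
    prompt_tokens = usage.prompt_tokens or usage.input_tokens or 0,
    completion_tokens = usage.completion_tokens or usage.output_tokens or 0,
  }
end

-- OpenAI-format response
local u1 = extract_usage({ prompt_tokens = 10, completion_tokens = 5 })
-- Anthropic-format response (e.g. DeepSeek /anthropic/v1/messages)
local u2 = extract_usage({ input_tokens = 10, output_tokens = 5 })
-- Missing usage block: both counts default to 0
local u3 = extract_usage(nil)
```

Because an OpenAI-format response never carries input_tokens/output_tokens, the first operand of each `or` chain wins there, which is why existing behavior is unchanged.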