
fix: skip prompt_cache_key for Gemini provider#796

Open
friday-james wants to merge 2 commits into sipeed:main from friday-james:fix/prompt-cache-key-gemini-400

Conversation


@friday-james friday-james commented Feb 26, 2026

Summary

  • The prompt_cache_key field is accepted by most OpenAI-compatible providers, but Gemini rejects unknown fields in the request body and returns a 400 error: Unknown name "prompt_cache_key": Cannot find field.
  • Added a supportPromptCache flag to the HTTP provider, enabled for all providers except Gemini

Changes

  • pkg/providers/openai_compat/provider.go: Added supportPromptCache field and setter; gated prompt_cache_key on the flag
  • pkg/providers/http_provider.go: Exposed SetSupportPromptCache on the wrapper
  • pkg/providers/factory_provider.go: Enabled the flag for all HTTP providers except gemini
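The gating described above could be sketched as follows. The supportPromptCache field and the SetSupportPromptCache setter are named in the PR; the surrounding struct, the request type, and buildRequest are simplified assumptions, not the repository's actual code:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// chatRequest is a simplified stand-in for the real request struct;
// only the prompt_cache_key JSON field name comes from the PR.
type chatRequest struct {
	Model          string `json:"model"`
	PromptCacheKey string `json:"prompt_cache_key,omitempty"`
}

// openAICompatProvider sketches the provider: supportPromptCache and
// SetSupportPromptCache are from the PR, the rest is illustrative.
type openAICompatProvider struct {
	supportPromptCache bool
}

func (p *openAICompatProvider) SetSupportPromptCache(v bool) {
	p.supportPromptCache = v
}

// buildRequest sets prompt_cache_key only when the flag is on, so the
// field is never serialized for providers (like Gemini) that reject it.
func (p *openAICompatProvider) buildRequest(model, cacheKey string) chatRequest {
	req := chatRequest{Model: model}
	if p.supportPromptCache {
		req.PromptCacheKey = cacheKey
	}
	return req
}

func main() {
	gemini := &openAICompatProvider{} // flag left false for Gemini
	openai := &openAICompatProvider{}
	openai.SetSupportPromptCache(true)

	g, _ := json.Marshal(gemini.buildRequest("gemini-2.0-flash", "sess-1"))
	o, _ := json.Marshal(openai.buildRequest("gpt-4o", "sess-1"))
	fmt.Println(string(g)) // {"model":"gemini-2.0-flash"}
	fmt.Println(string(o)) // {"model":"gpt-4o","prompt_cache_key":"sess-1"}
}
```

With omitempty, an unset PromptCacheKey is dropped from the JSON entirely, which is exactly what Gemini's strict field validation requires.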

Test plan

  • picoclaw agent -m "test" --model gemini-2.0-flash — should no longer get 400 error
  • picoclaw agent -m "test" --model <other-model> — should still send prompt_cache_key

🤖 Generated with Claude Code

Ubuntu and others added 2 commits February 26, 2026 03:00
The prompt_cache_key field is an OpenAI-specific feature for prefix-based
prompt caching. When sent to other providers like Gemini that use the
OpenAI-compatible HTTP provider, it causes a 400 error because Gemini's
API rejects unknown fields.

This adds a supportPromptCache flag to the provider struct, enabled only
for the OpenAI protocol, so the field is no longer sent to providers
that don't support it.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
The prompt_cache_key field is accepted by most OpenAI-compatible
providers, but Gemini rejects unknown fields in the request body,
causing a 400 error.

This adds a supportPromptCache flag to the HTTP provider, enabled for
all providers except Gemini.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
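The factory-side wiring the commits describe might look roughly like this. Only SetSupportPromptCache and the "all providers except Gemini" rule come from the PR; the constructor and struct here are illustrative stand-ins for factory_provider.go:

```go
package main

import "fmt"

// httpProvider is a stand-in for the real HTTP provider wrapper;
// only SetSupportPromptCache is named in the PR.
type httpProvider struct {
	name               string
	supportPromptCache bool
}

func (p *httpProvider) SetSupportPromptCache(v bool) {
	p.supportPromptCache = v
}

// newHTTPProvider mimics the factory rule: enable the flag for every
// provider except Gemini, which rejects unknown request fields.
func newHTTPProvider(name string) *httpProvider {
	p := &httpProvider{name: name}
	p.SetSupportPromptCache(name != "gemini")
	return p
}

func main() {
	fmt.Println(newHTTPProvider("openai").supportPromptCache) // true
	fmt.Println(newHTTPProvider("gemini").supportPromptCache) // false
}
```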
friday-james changed the title from "fix: only send prompt_cache_key for OpenAI provider" to "fix: skip prompt_cache_key for Gemini provider" on Feb 26, 2026