This repository was archived by the owner on Dec 16, 2025. It is now read-only.
Using 'models/gemini-1.5-pro-002' as the model for prompt caching fails when creating the cache. To reproduce, run the example code at https://ai.google.dev/gemini-api/docs/caching?lang=python but replace the model name with pro-002.
The context caching doc claims: "Note: Context caching is only available for stable models with fixed versions (for example, gemini-1.5-pro-001). You must include the version postfix (for example, the -001 in gemini-1.5-pro-001)."
Since pro-002 carries a version postfix just like pro-001, I'd expect it to be a stable version and therefore to work with context caching.
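To illustrate the reasoning: by the doc's stated rule, any model name ending in a fixed version postfix should be eligible for caching. A minimal sketch of that check (the `has_fixed_version` helper and its pattern are hypothetical, not part of the SDK):

```python
import re

# Hypothetical helper mirroring the doc's rule: context caching requires a
# stable model name with an explicit version postfix such as -001 or -002.
FIXED_VERSION = re.compile(r"^models/[a-z0-9.-]+-\d{3}$")

def has_fixed_version(model: str) -> bool:
    """Return True if the model name ends in a three-digit version postfix."""
    return bool(FIXED_VERSION.match(model))

print(has_fixed_version("models/gemini-1.5-pro-001"))  # True
print(has_fixed_version("models/gemini-1.5-pro-002"))  # True, so caching should accept it
print(has_fixed_version("models/gemini-1.5-pro"))      # False, alias without postfix
```

By this reading, pro-002 satisfies the documented requirement, so the cache-creation failure looks like either a bug or a gap in the docs.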