
Conversation

@otterDeveloper

Context

I obtained the data from models.dev and the Fireworks model library.

  • Added new models minimax-m2p1, qwen3-235b-a22b, deepseek-v3-0324, deepseek-v3p2, and glm-4p7 with detailed configuration and display names.
  • Updated max context: according to Fireworks customer support, the token budget a model supports is inputTokens + outputTokens = contextWindow.
  • Fireworks supports input caching, so supportsPromptCache and cacheReadsPrice were added for most models.
  • Marked kimi-k2-instruct and deepseek-v3 as deprecated; Fireworks no longer offers serverless inference for those models.
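For readers unfamiliar with the provider definition files, a model entry along these lines illustrates the fields touched by this PR. This is a hypothetical sketch: the field names follow the pattern of typical provider definition files, and every number below is an illustrative placeholder, not Fireworks' actual limits or pricing.

```typescript
// Hypothetical sketch of Fireworks model entries. Field names are assumed;
// all numeric values are placeholders, NOT real Fireworks limits or prices.
interface FireworksModelInfo {
	maxTokens: number // max output tokens for a single response
	contextWindow: number // inputTokens + outputTokens must fit in this budget
	supportsPromptCache: boolean
	inputPrice: number // USD per million input tokens (placeholder)
	outputPrice: number // USD per million output tokens (placeholder)
	cacheReadsPrice?: number // USD per million cached input tokens read
	deprecated?: boolean // serverless inference no longer offered
	displayName?: string
}

const fireworksModels: Record<string, FireworksModelInfo> = {
	"accounts/fireworks/models/minimax-m2p1": {
		displayName: "MiniMax M2.1",
		maxTokens: 8192, // placeholder
		contextWindow: 131072, // placeholder
		supportsPromptCache: true,
		inputPrice: 0.0, // placeholder
		outputPrice: 0.0, // placeholder
		cacheReadsPrice: 0.0, // placeholder
	},
	"accounts/fireworks/models/kimi-k2-instruct": {
		displayName: "Kimi K2 Instruct",
		maxTokens: 8192, // placeholder
		contextWindow: 131072, // placeholder
		supportsPromptCache: false,
		inputPrice: 0.0, // placeholder
		outputPrice: 0.0, // placeholder
		deprecated: true, // Fireworks dropped serverless inference for this model
	},
}

// The rule "inputTokens + outputTokens = contextWindow" means the input a
// request may carry is the context window minus the reserved output budget.
function maxInputTokens(info: FireworksModelInfo): number {
	return info.contextWindow - info.maxTokens
}
```

The `deprecated` flag lets the settings UI keep the entry visible (so existing configurations still resolve) while steering users away from models Fireworks no longer serves.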

Implementation

Edited packages/types/src/providers/fireworks.ts
Ran all typechecks and lints

Screenshots

Before/after screenshots of the Fireworks provider settings (images omitted).

How to Test

Go to Settings -> Providers -> Fireworks and verify that:

- the new models minimax-m2p1, qwen3-235b-a22b, deepseek-v3-0324, deepseek-v3p2, and glm-4p7 appear with their configuration and display names
- existing models show the corrected maxTokens, contextWindow, support flags, pricing, display names, and deprecations
- model metadata and support information is consistent across all models

changeset-bot bot commented Dec 30, 2025

🦋 Changeset detected

Latest commit: 68f027c

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package
Name Type
kilo-code Patch


@otterDeveloper changed the title from "feat (fireworks.ai): add minimax 2.1, glm 4.7, updated other models" to "Fireworks.ai support for minimax 2.1, glm 4.7, update other models" on Dec 31, 2025
@otterDeveloper changed the title from "Fireworks.ai support for minimax 2.1, glm 4.7, update other models" to "Provider Fireworks.ai support for minimax 2.1, glm 4.7, update other models" on Dec 31, 2025
@kevinvandijk
Collaborator

Thank you @otterDeveloper! I'll double-check this locally today and merge if all is fine!
