Add support for max_completion_tokens in OpenAI chat options request #1412

Closed

Conversation

@dafriz (Contributor) commented Sep 25, 2024

Adds support for `max_completion_tokens` in the OpenAI chat options request: an upper bound on the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.

Fixes #1411

Replaces the `max_tokens` field, which is now deprecated.
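For context, the corresponding OpenAI chat completions request body carries the new field in place of the deprecated one. A minimal sketch (the model name and message content are placeholders):

```json
{
  "model": "gpt-4o",
  "messages": [
    { "role": "user", "content": "Hello" }
  ],
  "max_completion_tokens": 256
}
```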
@tzolov (Contributor) commented Oct 1, 2024

Thank you for the contribution, @dafriz. Much appreciated!

@tzolov (Contributor) commented Oct 1, 2024

Rebased and merged at f56ce20.

@tzolov tzolov closed this Oct 1, 2024
@tzolov tzolov added this to the 1.0.0-M3 milestone Oct 1, 2024
@tzolov tzolov self-assigned this Oct 1, 2024
tzolov added a commit that referenced this pull request Oct 1, 2024
…chat options

 - Add new option 'maxCompletionTokens' to spring.ai.openai.chat.options
 - Mark 'maxTokens' as deprecated
 - Update documentation to reflect these changes in OpenAI chat configuration

 Related to #1411 and #1412
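Based on the option name in the commit message above, the new property would be set under `spring.ai.openai.chat.options` in `application.properties`. A sketch, assuming the usual relaxed kebab-case binding and an illustrative value:

```properties
# Upper bound on generated tokens, including reasoning tokens
# (replaces the deprecated max-tokens option)
spring.ai.openai.chat.options.max-completion-tokens=1024
```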