This repository has been archived by the owner on Nov 4, 2024. It is now read-only.

Feature: maxTokens, model, temperature #11

Open
itai590 opened this issue Jun 8, 2024 · 1 comment

Comments


itai590 commented Jun 8, 2024

I'm encountering a "Prompt is too large" error:

> The diff is too large for the OpenAI API. Try reducing the number of staged changes or write your own commit message.

To solve this, please add a maxTokens option to the plugin settings.
It would also be nice to have model and temperature selection, for example:

```json
{
  "model": "gpt-4o",
  "temperature": 0.9,
  "maxTokens": 2048
}
```

@codeproger commented:

Model selection is especially important to me.
