
LLM Model Field Behavior Issues for OpenAI and MistralAI Providers in Configuration Settings #45

kshitij79 (Contributor) opened this issue on Aug 14, 2024

Bug Report 🐛

The LLM model field in the settings interface behaves inconsistently depending on the selected provider: OpenAI only works when the field is left empty, while MistralAI fails with errors or failed connections when the field is left empty.

Expected Behavior

The default LLM model value should be auto-populated when selecting an AI provider.

Current Behavior

With OpenAI selected, requests succeed only when the model field is empty; with MistralAI selected, an empty model field causes errors or failed connections.

Possible Solution

Implement default value auto-population for the LLM model field based on the selected AI provider, and fix the inconsistent default behavior for OpenAI and MistralAI (a sketch is given under Possible Implementation below).

Steps to Reproduce

  1. Go to the settings interface.
  2. Select the OpenAI provider and enter a valid LLM model name.
  3. Observe that requests fail; OpenAI works only when the model field is left empty.
  4. Select the MistralAI provider and leave the model field empty.
  5. Observe that requests fail with errors or failed connections.

Context (Environment)

Application

  • VSCode (All versions)

Detailed Description

Possible Implementation
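
A minimal sketch of the auto-population idea using the VSCode configuration API. The setting keys (`myExtension.aiProvider`, `myExtension.llmModel`) and the default model names are hypothetical placeholders, not this extension's real identifiers:

```typescript
import * as vscode from 'vscode';

// Hypothetical defaults per provider; the real values should come from the
// extension's documentation or each provider's API.
const DEFAULT_MODELS: Record<string, string> = {
  openai: 'gpt-4o',
  mistralai: 'mistral-large-latest',
};

export function activate(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    vscode.workspace.onDidChangeConfiguration(async (e) => {
      // Only react when the provider selection changes.
      if (!e.affectsConfiguration('myExtension.aiProvider')) {
        return;
      }
      const config = vscode.workspace.getConfiguration('myExtension');
      const provider = config.get<string>('aiProvider', '');
      const model = config.get<string>('llmModel', '');
      const fallback = DEFAULT_MODELS[provider.toLowerCase()];
      // Fill in the provider's default so neither provider is left with an
      // empty model field.
      if (!model && fallback) {
        await config.update('llmModel', fallback, vscode.ConfigurationTarget.Global);
      }
    })
  );
}
```

The request path could apply the same fallback whenever the configured model is empty, which would also cover users who clear the field manually.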
