
Feature request: Integration with other apps #14

Open
ahmedashraf443 opened this issue Oct 20, 2024 · 1 comment
Labels
enhancement New feature or request

Comments

ahmedashraf443 commented Oct 20, 2024

Thank you so much for your effort and hard work. I followed your guide to the letter and got it up and running; the curl commands work and I get responses from the models.
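
For reference, the kind of curl call that works for me looks roughly like this (the base URL is the placeholder used later in this thread; the key and model name are just examples):

```bash
# Roughly the shape of the working curl call (placeholder URL, key, and model)
curl https://my_litellm_base.run.app/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-my-virtual-key" \
  -d '{
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```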

Would you know how to set this up with an app like gptme, which requires OPENAI_API_KEY and OPENAI_API_BASE? I assume the base is the same URL as the LiteLLM server, and that the key is the one we use in the Authorization header, but when I try to run any model, gptme can't find it and falls back to the original GPT-4.
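
This is roughly what I am trying, with placeholder values for the base URL and key; the gptme flag and model-name format below are my best guess rather than something I have confirmed:

```bash
# Minimal sketch, assuming gptme reads OPENAI_API_BASE and OPENAI_API_KEY as described above.
# The URL and key are placeholders; the model name must match one configured on the LiteLLM proxy.
export OPENAI_API_BASE="https://my_litellm_base.run.app"   # some clients expect .../v1 instead
export OPENAI_API_KEY="sk-my-virtual-key"                   # the same key used in the Authorization header

# Hypothetical invocation; the flag and model-name format may differ, see `gptme --help`.
gptme --model openai/gpt-4o "Hello"
```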

ahmedashraf443 added the enhancement label on Oct 20, 2024

aginns commented Nov 6, 2024

Some apps expect a different form of OPENAI_API_BASE than you might think. Have you tried both of these?

  1. https://my_litellm_base.run.app
  2. https://my_litellm_base.run.app/chat/completions

I've had no problem using the proxy through my chat apps that require an OpenAI base and key to be set.
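
A quick way to check which form a client expects is to hit the proxy directly; most OpenAI-style clients append /chat/completions to the base themselves, so the plain base (or base plus /v1) usually works. A sketch with placeholder URL and key, assuming the proxy exposes the standard OpenAI-compatible model-listing endpoint (LiteLLM's proxy does):

```bash
# List the models the proxy knows about (placeholder URL and key).
# If this works but the app still misbehaves, the problem is how the client builds its URLs.
curl -s https://my_litellm_base.run.app/v1/models \
  -H "Authorization: Bearer sk-my-virtual-key"
```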
