
Extending YAI's Backend LLM #79

Open
fire17 opened this issue Dec 3, 2023 · 8 comments
Labels
enhancement New feature or request

Comments


fire17 commented Dec 3, 2023

Hi there!
First of all, let me say thanks! Yai is my go-to CLI AI so far, and I like it very much :)

I wanted to share this relatively unknown repo with you, and I would love for you to support it:
https://github.com/Ruu3f/perplexityai

It uses Perplexity as a backend and works fantastically out of the box: free, no API keys needed
(and I have not hit any limits after a lot of usage).

I would really love it if, in the app or even just in yai.json, you could add something like:

```
{
  "openai_key": "openai" OR "perplexity" OR "OTHER"
}
```
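A more generic shape could be sketched like this (key names are purely illustrative, not actual Yai options):

```json
{
  "llm_backend": "perplexity",
  "llm_api_key": ""
}
```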

And ideally make it generic, so it's easier to use other APIs in the future.
I would love guidance on how to integrate @Ruu3f/perplexityai into yai;
I'm willing to put in the pull request if you point me in the right direction @ekkinox

Thanks a lot!
all the best!


ekkinox commented Dec 5, 2023

Hey 👋

Thanks for your interest in Yai :)

From what I can see, there is an unofficial Go lib to interact with Perplexity, but it seems it was archived just today (no idea why).

From the Python example you linked, it seems we'd interact with this engine via websockets.

So this should be possible; I'll mark this issue as an enhancement idea 👍

@ekkinox ekkinox added the enhancement New feature or request label Dec 5, 2023
@jaredmontoya

As far as I know, LiteLLM supports Perplexity AI, so if you merge PR #92, LiteLLM can be used as a proxy for Perplexity AI.
Other backends that support the OpenAI API format, like Ollama, will also become available.
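For example, after starting a LiteLLM proxy locally (it serves an OpenAI-compatible API, by default on port 4000), a yai.json along these lines should route requests through it (the model name and port here are illustrative assumptions, not tested values):

```json
{
  "openai_key": "anything",
  "openai_model": "perplexity/pplx-7b-chat",
  "openai_proxy": "http://localhost:4000"
}
```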


fire17 commented Feb 29, 2024

Thanks @jaredmontoya :)
I hope there'll be a more generic way to include other models.
For example, I just got Groq API keys and would love to use them through Yai.
@ekkinox what do you think?

Since Yai is written in Go, I looked for a Go Groq implementation, but I only found this:
https://github.com/groq/groqapi-go-proto , maybe you can make it work.

But I think the easiest thing would be if you could call external Python functions from Go.
That way it would still be built the way you like, but would have access to all the standard tools/models available.
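A minimal sketch of that shell-out idea, assuming a hypothetical Python wrapper script around the perplexityai library; the snippet substitutes `echo` for the Python call so it runs anywhere:

```go
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// run shells out to an external command and returns its trimmed stdout.
// In the scenario above, the command would be something like
// ("python3", "ask_perplexity.py", prompt), where ask_perplexity.py is a
// hypothetical wrapper around the Python perplexityai library.
func run(name string, args ...string) (string, error) {
	out, err := exec.Command(name, args...).Output()
	if err != nil {
		return "", fmt.Errorf("%s: %w", name, err)
	}
	return strings.TrimSpace(string(out)), nil
}

func main() {
	// echo stands in for the Python call so the sketch is runnable anywhere.
	reply, err := run("echo", "hello from an external process")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(reply)
}
```

The trade-off, as noted in the reply below, is that this adds a Python runtime dependency next to the Go binary.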

Thanks and all the best


jaredmontoya commented Feb 29, 2024

I like that you thought of a way to avoid cramming dependencies into the binary, but calling user-supplied Python code is not that different a solution from just running LiteLLM, which is also written in Python.
It's almost the same, except you lose support for all the LLM backends that the LiteLLM developers constantly add as soon as they appear. I was surprised, but LiteLLM supports Groq too.

LiteLLM is basically a band-aid that works for everything that refuses to adopt the OpenAI API format. In my opinion, everyone should eventually adopt the OpenAI API specification, because of its wide range of software support and the resulting compatibility between everything. And anyone who refuses to do so should not expect much usage of their service.

Adding support for something non-generic only encourages others to make something non-generic as well and just hope that it's also supported by everyone, which then bloats all software that tries to support all available backends.

LiteLLM's goal is to become the sacrifice that gets bloated beyond comprehension, instead of every other project, in order to achieve compatibility with everything.

In a nutshell, I think it would be better:
to push Groq and others to add OpenAI API support, as that's the only thing that is relatively open about OpenAI, and it has everything you could possibly need to use LLMs,

than:
to force all open-source projects that use AI, developed by unpaid people, to add support for every single entitled company's API while also either adding dependencies like Python or increasing binary size.
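For reference, the request shape these OpenAI-compatible backends agree on is small. A sketch in Go that only builds the JSON payload for a chat-completions call (nothing is sent over the network; the model name is just an example):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Minimal subset of the OpenAI chat-completions request body that
// OpenAI-compatible backends (LiteLLM, Ollama, etc.) accept.
type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

// buildRequest marshals a single-turn chat request.
func buildRequest(model, prompt string) ([]byte, error) {
	return json.Marshal(chatRequest{
		Model: model,
		Messages: []chatMessage{
			{Role: "user", Content: prompt},
		},
	})
}

func main() {
	body, err := buildRequest("mistral:7b", "list files modified today")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))
}
```

Any backend that accepts this payload on a `/v1/chat/completions` endpoint works with OpenAI-style clients unchanged, which is the compatibility argument above.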


erkkimon commented Apr 10, 2024

I am also interested in contributing. Ollama's built-in OpenAI compatibility in particular seems like the lowest-hanging fruit. I'm willing to contribute if someone points me (us) in the right direction.

@jaredmontoya

@erkkimon for the time being, you can build this program from source from PR #92 and set http://localhost:11434 as the openai_proxy in ~/.config/yai.json, like this:

```json
{
  "openai_key": "sk-xxxxxxxxx",
  "openai_model": "mistral:7b",
  "openai_proxy": "http://localhost:11434",
  "openai_temperature": 0.2,
  "openai_max_tokens": 1000,
  "user_default_prompt_mode": "exec",
  "user_preferences": ""
}
```

As for custom prompts, you will either have to edit hard-coded strings or extend the existing config system to make prompts configurable.

I don't know if it's just me, but it feels like nothing has changed since the last time I looked at this repository. I hope this project is not abandoned.


jaredmontoya commented Apr 17, 2024

@erkkimon I just found out that lexido added Ollama support, so if you are still interested you can check it out. I would even say that it's better than yai.

@zamo-icm

> @erkkimon for the time being you can build this program from source on PR #92 and set http://localhost:11434 as the openai_proxy in the ~/.config/yai.json like this:
>
> ```json
> {
>   "openai_key": "sk-xxxxxxxxx",
>   "openai_model": "mistral:7b",
>   "openai_proxy": "http://localhost:11434",
>   "openai_temperature": 0.2,
>   "openai_max_tokens": 1000,
>   "user_default_prompt_mode": "exec",
>   "user_preferences": ""
> }
> ```
>
> As for custom prompts, you will either have to edit hard coded strings or extend the existing config system for configuring prompts.
>
> I don't know if it's just me, but it feels like nothing changed since the last time I looked at this repository. I hope this project is not abandoned.

I tried this and it is not working. How did you get it to work?
