
Added support for OpenRouter & Generic Models #203

Merged 2 commits into php-llm:main on Feb 5, 2025

Conversation

rglozman (Contributor)

Added support for OpenRouter, which hosts over 300 models. This brings models from many providers together behind one easy-to-use API.

I created an example using Gemini (a free model hosted by Google), updated the .env file, and extended the README.
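
Roughly, the wiring looks like this (a simplified sketch, not the exact example file; apart from Chain, Chain->call(), and the OpenRouter Client, which appear in the stack trace later in this thread, all names here are placeholders):

```php
<?php
// Hypothetical sketch, NOT the actual examples/chat-gemini-openrouter.php.
// Chain, Chain->call() and the OpenRouter Client class come from the stack
// trace quoted below; everything else (constructor arguments, the GenericModel
// usage, the env variable name, the model identifier, the response accessor)
// is an illustrative assumption.

use PhpLlm\LlmChain\Bridge\OpenRouter\Client;
use PhpLlm\LlmChain\Chain;

$client = new Client($_ENV['OPENROUTER_API_KEY']);     // assumed env var name
$model  = new GenericModel('google/gemini-flash-1.5'); // assumed class and model id
$chain  = new Chain($client, $model);                  // assumed constructor wiring

$response = $chain->call($messages); // Chain->call() as in the stack trace; message setup omitted
echo $response->getContent();        // assumed accessor on the response
```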

chr-hertel (Member)

Hi, thanks for the idea, and I like the GenericModel approach for this platform 👍

I just created an API key for OpenRouter, ran the example you pushed, and got this error:

```
php examples/chat-gemini-openrouter.php
PHP Fatal error:  Uncaught PhpLlm\LlmChain\Exception\RuntimeException: Response does not contain message in /.../llm-chain/src/Bridge/OpenRouter/Client.php:54
Stack trace:
#0 /.../llm-chain/src/Model/Response/AsyncResponse.php(33): PhpLlm\LlmChain\Bridge\OpenRouter\Client->convert()
#1 /.../llm-chain/src/Chain.php(75): PhpLlm\LlmChain\Model\Response\AsyncResponse->unwrap()
#2 /.../llm-chain/examples/chat-gemini-openrouter.php(26): PhpLlm\LlmChain\Chain->call()
#3 {main}
  thrown in /home/christopher/Projects/LlmChain/llm-chain/src/Bridge/OpenRouter/Client.php on line 54
```

Does it work for you?


By the way, this should fix the code style issues in the pipeline:

```shell
PHP_CS_FIXER_IGNORE_ENV=1 vendor/bin/php-cs-fixer fix
```

Or, to run the entire quality pipeline locally:

```shell
make qa
```

rglozman (Contributor, Author) commented Feb 1, 2025

Could you add var_dump($data) just before line 54 in src/Bridge/OpenRouter/Client.php and run it again? It goes inside the if statement, just before the throw new (see the sketch below). The cause could be any of several errors (auth failure, out of credits, servers down, etc.). I am working on a better way to report errors to the user (for all providers); I will make that a separate pull request.
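
For illustration, something like this; the surrounding condition is my guess at the shape of the code, and only the file, line number, and exception message come from your stack trace:

```php
// src/Bridge/OpenRouter/Client.php, around line 54 (sketch; the actual
// condition is assumed, the exception message is from the stack trace).
if (!isset($data['choices'][0]['message'])) {
    var_dump($data); // inspect the raw payload: auth error, rate limit, outage, ...
    throw new RuntimeException('Response does not contain message');
}
```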

Yes, I will run the linter. Thanks for the heads-up.

chr-hertel (Member)

You are right. I just tried it again with a dump on the response: the first time it worked, but the second time it was rejected due to a rate limit.

Error handling is something to improve in general. I would love a more generic approach, but the error responses depend on the specific model and platform, so a combination could be worth exploring 🤔: better default error handling plus an extension point per platform/model that lets bridges fine-tune the specifics. Something like the sketch below.
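
Purely as a sketch of the direction; none of these names or interfaces exist in llm-chain today:

```php
// Hypothetical per-platform error-handling extension point; names and
// signatures are illustrative, not part of llm-chain.
interface ResponseErrorHandler
{
    /** Whether this handler understands error payloads of the given platform. */
    public function supports(string $platform): bool;

    /** Turn a raw error payload into a meaningful, typed exception. */
    public function createException(array $payload): \RuntimeException;
}

final class OpenRouterErrorHandler implements ResponseErrorHandler
{
    public function supports(string $platform): bool
    {
        return 'openrouter' === $platform;
    }

    public function createException(array $payload): \RuntimeException
    {
        // Fall back to a generic message when the payload carries no details.
        $message = $payload['error']['message'] ?? 'Response does not contain message';

        return new \RuntimeException($message);
    }
}
```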

chr-hertel (Member)

So, all in all, thanks for the PR. Happy to merge once the pipeline is green :) 👍

rglozman (Contributor, Author) commented Feb 5, 2025

Should be good to merge now 👍

chr-hertel (Member) left a review comment:
Thanks @rglozman! 👍

chr-hertel merged commit a467199 into php-llm:main on Feb 5, 2025
7 checks passed