Conversation

@ishaan-jaff

This PR adds support for 50+ models with a standard I/O interface using litellm: https://github.com/BerriAI/litellm/

ChatLiteLLM() is integrated into langchain and allows you to call all models using the ChatOpenAI I/O interface
https://python.langchain.com/docs/integrations/chat/litellm

Here's an example of how to use ChatLiteLLM():

from langchain.chat_models import ChatLiteLLM

ChatLiteLLM(model="gpt-3.5-turbo")
ChatLiteLLM(model="claude-2", temperature=0.3)
ChatLiteLLM(model="command-nightly")
ChatLiteLLM(model="replicate/llama-2-70b-chat:2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1")
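To illustrate the "one I/O interface, many providers" idea that litellm is built on, here is a minimal, self-contained sketch of how a unified client might route a model name to its backing provider. This is illustrative only: the function name `route_provider` and its routing rules are hypothetical and are not litellm's actual implementation.

```python
def route_provider(model: str) -> str:
    """Infer the provider from a model-name string, as a unified
    chat client might. Hypothetical sketch; not litellm's real logic."""
    if model.startswith("replicate/"):
        return "replicate"          # e.g. replicate/llama-2-70b-chat:<version>
    if model.startswith("gpt-"):
        return "openai"             # e.g. gpt-3.5-turbo
    if model.startswith("claude"):
        return "anthropic"          # e.g. claude-2
    if model.startswith("command"):
        return "cohere"             # e.g. command-nightly
    return "unknown"


# The caller keeps one interface regardless of the backend:
for m in ["gpt-3.5-turbo", "claude-2", "command-nightly"]:
    print(m, "->", route_provider(m))
```

This is why the PR's examples above only differ in the `model` string: the prompt/response interface stays the same while routing happens behind the scenes.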

@ishaan-jaff
Author

@joshsny @Alec2435 can you please take a look at this PR when possible? Happy to add more docs/tests if this initial commit looks good 😊

@krrishdholakia

bump @joshsny @Alec2435

@inferont

@ishaan-jaff @krrishdholakia I'm open to a fork, #76

I'm not sure @joshsny and @Alec2435 want to maintain this repository.

