Is it possible to use the LMQL server as a drop-in replacement for the OpenAI API, so that it works with LangChain's ChatOpenAI class?
I'm trying to use a LlamaCpp model (loaded with the LMQL server) and I need it to handle both LMQL calls and regular ChatOpenAI calls without loading the model twice.
I hope this is possible in some way! If it's not, would it be possible to implement the OpenAI API and expose two addresses, one for LMQL calls and the other for OpenAI-compatible calls?
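Concretely, this is roughly what I'd like to work, assuming the LMQL server spoke the OpenAI protocol. The URL and model name below are just placeholders, not anything LMQL actually exposes today:

```python
# Hypothetical: point LangChain's ChatOpenAI at a local OpenAI-compatible
# endpoint. The URL is a placeholder for wherever such a server would
# listen; no such LMQL endpoint exists yet.
from langchain.chat_models import ChatOpenAI

chat = ChatOpenAI(
    openai_api_base="http://localhost:8000/v1",  # placeholder endpoint
    openai_api_key="unused-for-local-server",    # local servers ignore the key
    model_name="llama-cpp-local",                # placeholder model id
)
print(chat.predict("Hello!"))
```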
-
I can't give you an easy solution for now, but I can highlight two things that may be of interest here: (1) We are tracking the ability to use a mock OpenAI API server, e.g. the one provided by llama.cpp itself, in #209. Unfortunately, due to limited support for the OpenAI API functionality on their end, it does not work yet, although it is almost there. (2) We have also had internal discussions about exposing a query program as an OpenAI-like Chat API endpoint. However, this has not advanced beyond discussions so far.
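To illustrate what (1) would give you: llama.cpp's bundled server already exposes an OpenAI-compatible chat endpoint on its own, so a client can talk to it directly. A minimal sketch with the official openai client follows; the port and model name are assumptions, and the server's startup flags vary between llama.cpp versions:

```python
# Minimal sketch: talk to llama.cpp's OpenAI-compatible server, started
# with something like `./server -m model.gguf --port 8080` (flags vary
# by llama.cpp version). Port and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="sk-no-key-required",  # llama.cpp does not check the key
)

resp = client.chat.completions.create(
    model="local-model",  # llama.cpp largely ignores this field
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```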
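As for (2), until something official exists you could in principle hand-roll a thin wrapper that puts a query program behind an OpenAI-shaped route. The sketch below is purely hypothetical: `run_my_query` is a stand-in for your actual LMQL query program, not an LMQL API, and the response carries only the fields most chat clients read:

```python
# Hypothetical wrapper exposing a local query program behind an
# OpenAI-style /v1/chat/completions route. `run_my_query` is a
# placeholder, not part of LMQL.
import time

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    model: str
    messages: list[dict]

def run_my_query(prompt: str) -> str:
    # Placeholder: invoke your LMQL query program here instead.
    return f"(echo) {prompt}"

@app.post("/v1/chat/completions")
def chat_completions(req: ChatRequest):
    prompt = req.messages[-1]["content"]  # naive: last message only
    answer = run_my_query(prompt)
    # Minimal subset of the OpenAI response schema.
    return {
        "id": "chatcmpl-local",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": req.model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": answer},
            "finish_reason": "stop",
        }],
    }
```

Served with e.g. `uvicorn app:app --port 8000`, a ChatOpenAI-style client pointed at http://localhost:8000/v1 would then hit this route.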