Option to integrate with Open source LLMs #86
Replies: 3 comments 1 reply
-
Thanks @nirajkmr007! Yes, I'm looking at adding some models from together.ai, as well as more flexibility to plug in any models that match the OpenAI API spec, very soon.
-
@nirajkmr007 This has now been released! You can use tons of open source models now via together.ai or openrouter.ai, among others; any provider that is OpenAI-compatible can be used. Check the release notes here for more details: https://github.com/plandex-ai/plandex/releases/tag/cli%2Fv0.9.0
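A minimal sketch of what "OpenAI-compatible" means in practice: the provider accepts the same `POST /v1/chat/completions` request shape as the OpenAI API, so a tool only needs a configurable base URL to target it. The base URL, model name, and API key below are illustrative assumptions, not Plandex configuration.

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completions request for any compatible provider."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Providers like together.ai and openrouter.ai use bearer-token auth.
            "Authorization": "Bearer YOUR_API_KEY",  # placeholder, not a real key
        },
        method="POST",
    )


# Hypothetical usage: the same request shape works against any compatible base URL.
req = build_chat_request("https://example-provider.invalid/v1", "some-model", "Hello")
```

Swapping providers is then just a matter of changing `base_url` (and the key); the payload and endpoint path stay identical.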
-
I think /v1 is no longer there; try removing it. Anyway, I am also trying to use Ollama with Plandex... Ollama can be reached from WSL/Linux with a Windows firewall rule and the 192.x.x.x IP, etc.
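A sketch of the WSL-to-Windows setup described above, assuming Ollama runs on the Windows host on its default port 11434, a firewall rule allows inbound connections on that port, and WSL uses default NAT networking (where the Windows host IP appears as the `nameserver` entry in `/etc/resolv.conf`). The variable names here are illustrative.

```shell
# Look up the Windows host IP as seen from WSL (assumption: default NAT
# networking, where the nameserver in /etc/resolv.conf is the Windows host).
WIN_HOST=$(awk '/^nameserver/ {print $2; exit}' /etc/resolv.conf)

# Ollama exposes its OpenAI-compatible endpoints under /v1; some clients
# append /v1 themselves, in which case it must be stripped from the base URL.
BASE_URL="http://${WIN_HOST}:11434/v1"

# Sanity-check connectivity before pointing any tool at it, e.g.:
#   curl "$BASE_URL/models"
echo "$BASE_URL"
```

If a client complains about a doubled path like `/v1/v1/...`, that is the symptom the comment above describes: drop `/v1` from the configured base URL.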
-
Great project! Excited to try it out more.
Do you have plans to integrate this with open-source LLMs, giving more options than just the OpenAI API?