Replies: 1 comment
LLxprt Code isn't a router. The intent is to add subagents, with each subagent having its own config. Presently the web search is hard-coded to Google, but that will be configurable. So in essence it will be similar, but more direct and directly controlled. Basically, imagine if Claude Code supported this directly vs hacking it in through some proxy...that's what we're doing.
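To make the idea concrete, here is a purely hypothetical sketch of what per-subagent config could look like; none of these keys, names, or values are real LLxprt Code settings, they just illustrate "each subagent has a different config, including which web search it uses":

```json
{
  "subagents": {
    "researcher": {
      "provider": "openrouter",
      "model": "perplexity/sonar",
      "webSearch": "perplexity"
    },
    "coder": {
      "provider": "anthropic",
      "model": "claude-sonnet-4"
    }
  }
}
```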
In Claude Code Router, there are many parameters for routing requests to different types of models (https://github.com/musistudio/claude-code-router?tab=readme-ov-file#router):

- `default` is for regular LLMs (235B or higher, can go up to 1T)
- `background` is for SLMs (80B or below, ideally around 30B)
- `think` is for reasoning models, or hybrid models in reasoning mode (more likely for LLMs rather than SLMs)
- `longContext` is for situations when the context gets too large; since 131K+ context is the norm, put 200K/262K/1M models here
- `webSearch`, for which people usually recommend Perplexity or Google models; an alternative is `:online` in OpenRouter

Could LLXPRT have a sort of "router", considering that Gemini may very well be doing it "under the hood"? Is it even worth considering for multi-modal models vs thinking models vs lighter SLMs?
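For reference, these slots map to the `Router` section of Claude Code Router's config file (per the README linked above), each taking a `"provider,model"` pair. The specific provider/model values below are illustrative placeholders, not recommendations from this thread:

```json
{
  "Router": {
    "default": "openrouter,qwen/qwen3-235b-a22b",
    "background": "ollama,qwen2.5-coder:latest",
    "think": "deepseek,deepseek-reasoner",
    "longContext": "openrouter,google/gemini-2.5-pro",
    "webSearch": "openrouter,perplexity/sonar"
  }
}
```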