chore(assistant): Remove model context length override #11
Motivation and Context
The LLM client hard-codes the context length of the prompt. With current models, this default of 4,096 tokens quickly leads to the prompt being truncated, so the model loses part of the existing diagram's context when making changes.
Details
This change removes the `numCtx` field from the `LlmClient`, allowing the model's default context length to be used; see the sketch below.
Screenshots (if appropriate)
How were the changes tested
Changes were tested by running the dev server locally against an Ollama instance serving a model with a 32k-token context length. Prior to this change, the context length was stuck at 4,096 tokens.
Checklist
`npm run precommit` from the `ui` directory