
Conversation

@jhunzik jhunzik commented Aug 7, 2025

Motivation and Context

The LLM client hard-codes the prompt's context length. With current models, this default of 4,096 tokens quickly causes the prompt to be truncated, so the model does not see the full context of the existing diagram when making changes.

Details

This change removes the numCtx field from the LlmClient so that the model's default context length is used instead of a hard-coded value.
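
A minimal sketch of what the change amounts to on the wire, assuming the client talks to Ollama's REST /api/chat endpoint (this is an illustration, not the project's actual LlmClient; the model name is a placeholder):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class OllamaChatSketch {
        public static void main(String[] args) throws Exception {
            // Before this change the request body would also carry
            //   "options": { "num_ctx": 4096 }
            // which pinned the context window at 4,096 tokens. Omitting it
            // lets Ollama fall back to the model's own default.
            String body = """
                {
                  "model": "llama3.1",
                  "messages": [{"role": "user", "content": "Update the diagram"}],
                  "stream": false
                }
                """;

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:11434/api/chat"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());
        }
    }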

Does this close any open issues? If so, kindly link the relevant issues.

Screenshots (if appropriate)

How were the changes tested?

Changes were tested by running the dev server locally against an Ollama instance serving a model with a 32k-token context length. Prior to this change, the context length was stuck at 4k tokens.
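
One way to sanity-check what the served model advertises is to ask Ollama for the model's metadata; depending on the Ollama version, the context length appears under model_info (e.g. a "*.context_length" key) or in the modelfile parameters. A hypothetical verification helper, not part of this PR (the model name is a placeholder):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ShowModelInfo {
        public static void main(String[] args) throws Exception {
            // POST /api/show returns the model's metadata as JSON.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:11434/api/show"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString("{\"model\": \"llama3.1\"}"))
                    .build();

            String body = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString())
                    .body();
            System.out.println(body); // inspect for the model's context length
        }
    }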

Checklist

  • The PR follows the branch and commit styles outlined in the CONTRIBUTING doc
  • (UI changes only) Linting and formatting checks pass locally with the new changes (npm run precommit from the ui directory)

…o allow the full context length supported by the model
@a-asaad a-asaad merged commit f898e48 into codice:main Aug 7, 2025
1 check passed
@jhunzik jhunzik deleted the context-length-removal branch August 7, 2025 20:56
@a-asaad a-asaad added the bug Something isn't working label Aug 18, 2025