fix: Use entire file as context in "Optimize Code" prompt #459
+634
−327
✨ Description
Currently, we only use the snippet of code displayed to the user as context for the LLM prompt. This limits the amount of information the LLM can reason with. If we pass the entire file (when we have it) to the LLM prompt, this should improve its reasoning and give users the chance to ask follow-up questions about portions of the code that are in the same file but not in the profiling snippet. A rough sketch of the idea is shown below.
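Below is a minimal, hypothetical sketch of what the prompt-construction change could look like; the names (`build_optimize_code_prompt`, `snippet`, `file_content`) and the prompt wording are illustrative assumptions, not the actual identifiers or templates in this PR.

```python
from typing import Optional


def build_optimize_code_prompt(snippet: str, file_content: Optional[str] = None) -> str:
    """Build the "Optimize Code" prompt, preferring the entire file as context.

    If the full file content is available, include it so the LLM can reason
    about (and answer follow-up questions on) code outside the profiled
    snippet; otherwise fall back to the snippet alone.
    (Hypothetical helper for illustration only.)
    """
    if file_content:
        context = (
            "Here is the entire file containing the code to optimize:\n"
            f"```\n{file_content}\n```\n\n"
            "The profiled snippet within that file is:\n"
            f"```\n{snippet}\n```"
        )
    else:
        context = f"Here is the code to optimize:\n```\n{snippet}\n```"

    return (
        "You are an assistant that suggests performance optimizations.\n\n"
        f"{context}\n\n"
        "Suggest concrete optimizations for the snippet, using the full file "
        "for additional context where available."
    )
```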
Here's an example of a short code snippet in a large file:
When asked about code in a different part of the same file, the LLM reports it can only reason about the current snippet:
With the changes in this PR, the LLM can now reason about the entire file. Here's a snippet that's part of a larger file:
And here is the LLM's response to asking about a specific part of the file not found in the snippet:
🧪 How to test?
The existing unit tests should pass.