fix systemprompt bug #210
Conversation
fix systemprompt bug
Reviewer's Guide

Refactors message assembly and send flows to ensure system prompts are always included and to streamline state handling, preventing stale message arrays.

Sequence diagram for sending a chat message with system prompt

```mermaid
sequenceDiagram
    actor User
    participant LLMChatInterface
    participant llmChatService
    User->>LLMChatInterface: Submit chat message
    LLMChatInterface->>LLMChatInterface: buildLLMMessages([...messages, userMessage])
    LLMChatInterface->>llmChatService: sendMessage(selectedProvider, messagesWithSystemPrompt, ...)
    llmChatService-->>LLMChatInterface: Response chunk(s)
    LLMChatInterface->>LLMChatInterface: Update messages state
```
Class diagram for Message and buildLLMMessages changes

```mermaid
classDiagram
    class Message {
        +id: string
        +session_id: string
        +content: string
        +sender: string
        +timestamp: string
    }
    class LLMChatInterface {
        +buildLLMMessages(userMessages: Message[]): Message[]
    }
    LLMChatInterface --> Message : uses
```
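Based on the class diagram, a minimal sketch of what `buildLLMMessages` could look like after the fix. This is not the PR's actual code: the duplicate-system-message guard, the placeholder `id`, and the extra `systemPrompt`/`sessionId` parameters are assumptions for illustration.

```typescript
// Hypothetical sketch of buildLLMMessages; Message shape follows the
// class diagram above, everything else is illustrative.
interface Message {
  id: string;
  session_id: string;
  content: string;
  sender: string;
  timestamp: string;
}

function buildLLMMessages(
  userMessages: Message[],
  systemPrompt?: string,
  sessionId?: string,
): Message[] {
  // Only prepend a system message when a prompt is configured and one
  // is not already present in the conversation.
  if (!systemPrompt || userMessages.some((m) => m.sender === 'system')) {
    return userMessages;
  }
  const systemMessage: Message = {
    id: 'system-prompt', // placeholder id for illustration
    session_id: sessionId ?? '',
    content: systemPrompt,
    sender: 'system',
    timestamp: new Date().toISOString(),
  };
  return [systemMessage, ...userMessages];
}
```

Returning a new array with the system message first keeps the function pure, so the caller can pass the result straight to `sendMessage` without mutating component state.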
Hey @boston008 - I've reviewed your changes - here's some feedback:
- Instead of prepending the system prompt content onto the first user message, consider sending it as a distinct system message with a `system` sender to preserve clear message semantics and avoid confusing the LLM or downstream logic.
- Revert to using the functional state update form (`setMessages(prev => [...prev, userMessage])`) when appending messages to avoid potential stale state issues under rapid consecutive updates.
- Remove or conditionally gate the `console.debug` statements to prevent internal logs from cluttering production consoles.
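The second and third suggestions can be sketched outside React with a stand-in for `setState`. The `makeState` helper, the `Updater` type, and the `DEBUG` flag are illustrative, not the PR's code; the real component would use React's `setMessages` and an environment check.

```typescript
// Stand-in for React state to show why the functional form matters:
// each updater receives the latest state, so rapid consecutive
// appends cannot clobber one another.
type Updater<T> = (prev: T) => T;

function makeState<T>(initial: T) {
  let state = initial;
  const set = (update: Updater<T>) => { state = update(state); };
  return { get: () => state, set };
}

const messages = makeState<string[]>([]);

// Functional form: always derives from the latest value. The
// non-functional form (spreading a captured snapshot) could drop
// the first append under batched or rapid updates.
messages.set((prev) => [...prev, 'first']);
messages.set((prev) => [...prev, 'second']);

// Gate debug output so internal logs stay out of production consoles.
const DEBUG = true; // in the app, something like process.env.NODE_ENV !== 'production'
if (DEBUG) console.debug('messages:', messages.get());
```

With React's real `setMessages`, the same functional form guards against stale closures when two messages are appended in the same render cycle.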
@boston008 it'd be great if you could use the status to indicate the PR's state. That said, if it's still a WIP, you can switch it to a draft instead of keeping it open; that way I'll know when it's ready for review. Thanks!
Hey @boston008 - I've reviewed your changes and they look great!
Prompt for AI Agents
Please address the comments from this code review:
## Individual Comments
### Comment 1
<location> `web/src/pages/chat/llm-chat-interface.tsx:402` </location>
<code_context>

```diff
+  session_id: sessionId || '',
   content: systemPrompt,
-  sender: 'system' as const,
+  sender: 'user',
   timestamp: new Date().toISOString(),
 };
```

</code_context>
<issue_to_address>
System prompt message is assigned sender 'user' instead of 'system'.
Using 'user' as the sender for a system prompt may lead to misclassification in logic or UI. Consider using 'system' to clearly distinguish system messages.
</issue_to_address>
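A sketch of the corrected message construction from the diff above, restoring the `'system'` sender the comment asks for. The `makeSystemMessage` helper name and its id scheme are illustrative, not part of the PR.

```typescript
// Illustrative helper: build the system-prompt message with sender
// 'system' rather than 'user', keeping message semantics clear for
// the LLM and any sender-based UI logic.
interface Message {
  id: string;
  session_id: string;
  content: string;
  sender: 'system' | 'user' | 'assistant';
  timestamp: string;
}

function makeSystemMessage(systemPrompt: string, sessionId?: string): Message {
  return {
    id: `system-${Date.now()}`, // illustrative id scheme
    session_id: sessionId ?? '',
    content: systemPrompt,
    sender: 'system' as const, // not 'user', per the review comment
    timestamp: new Date().toISOString(),
  };
}
```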
The system prompt was not sent to the LLM in the chat.
Summary by Sourcery
Fix the system prompt not being sent to the LLM by enhancing the message-building logic to conditionally prepend the system prompt and updating sendMessage invocations accordingly.
Enhancements: