
Conversation

@boston008 (Contributor) commented Jul 17, 2025

fix system prompt bug
The system prompt was not being sent to the LLM in the chat.

Summary by Sourcery

Fix the system prompt not being sent to the LLM by enhancing the message-building logic to conditionally prepend the system prompt and updating sendMessage invocations accordingly.

Enhancements:

  • Integrate system prompt into the LLM message sequence by updating sendMessage calls to use buildLLMMessages
  • Refactor buildLLMMessages to guard against missing or duplicate system prompts and prepend a constructed system prompt message (see the sketch after this list)
  • Simplify state updates by standardizing setMessages to use the previous state callback
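
As a rough illustration, the refactored helper might look like the following sketch. The Message shape matches the class diagram in the reviewer's guide below; the duplicate-detection id and the assumption that systemPrompt and sessionId are component state in llm-chat-interface.tsx are mine, not the PR's exact code.

```typescript
// Sketch only: mirrors the behavior described above, not the PR's exact code.
interface Message {
  id: string;
  session_id: string;
  content: string;
  sender: string;
  timestamp: string;
}

// Assumed to be component state/props in llm-chat-interface.tsx.
declare const systemPrompt: string | undefined;
declare const sessionId: string | undefined;

const SYSTEM_PROMPT_ID = 'system-prompt'; // hypothetical id used for duplicate detection

function buildLLMMessages(userMessages: Message[]): Message[] {
  // Early return when no system prompt is configured.
  if (!systemPrompt) return userMessages;

  // Detect an already-prepended system prompt by id to prevent duplication.
  if (userMessages.some((m) => m.id === SYSTEM_PROMPT_ID)) return userMessages;

  const systemPromptMessage: Message = {
    id: SYSTEM_PROMPT_ID,
    session_id: sessionId || '', // session_id fallback noted in the change list
    content: systemPrompt,
    sender: 'user', // the PR uses 'user' here; the follow-up review suggests 'system'
    timestamp: new Date().toISOString(),
  };
  return [systemPromptMessage, ...userMessages];
}
```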

fix systemprompt bug
sourcery-ai bot (Contributor) commented Jul 17, 2025

Reviewer's Guide

Refactors message assembly and send flows to ensure system prompts are always included and to streamline state handling, preventing stale message arrays.

Sequence diagram for sending a chat message with system prompt

```mermaid
sequenceDiagram
    actor User
    participant LLMChatInterface
    participant llmChatService
    User->>LLMChatInterface: Submit chat message
    LLMChatInterface->>LLMChatInterface: buildLLMMessages([...messages, userMessage])
    LLMChatInterface->>llmChatService: sendMessage(selectedProvider, messagesWithSystemPrompt, ...)
    llmChatService-->>LLMChatInterface: Response chunk(s)
    LLMChatInterface->>LLMChatInterface: Update messages state
```

Class diagram for Message and buildLLMMessages changes

```mermaid
classDiagram
    class Message {
        +id: string
        +session_id: string
        +content: string
        +sender: string
        +timestamp: string
    }
    class LLMChatInterface {
        +buildLLMMessages(userMessages: Message[]): Message[]
    }
    LLMChatInterface --> Message: uses
```

File-Level Changes

  • Simplify message state handling and avoid stale state in send flows (web/src/pages/chat/llm-chat-interface.tsx):
      • Unified setMessages calls to use functional updates without explicit type annotation
      • Computed messagesForLLM before calling sendMessage in handleSend to capture the correct message array
  • Ensure the system prompt is included in every LLM request (web/src/pages/chat/llm-chat-interface.tsx):
      • Replaced direct updatedMessages arguments with buildLLMMessages results in llmChatService.sendMessage calls
      • Applied buildLLMMessages in both the tool result continuation and user send flows
  • Refactor buildLLMMessages to handle system prompt insertion robustly (web/src/pages/chat/llm-chat-interface.tsx; a sketch of the resulting send flow follows this list):
      • Added return type annotation and early return when systemPrompt is absent
      • Detected existing system prompt by id to prevent duplication
      • Fixed systemPrompt message fields (session_id fallback, sender as 'user') and updated prepend logic
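
Taken together, the send flow described above plausibly looks like this sketch. The names follow the sequence diagram; messages, setMessages, selectedProvider, and llmChatService are assumed to be in scope in the component, and exact signatures may differ.

```typescript
// Sketch of the user send flow in llm-chat-interface.tsx (assumed names).
const handleSend = async (content: string) => {
  const userMessage: Message = {
    id: crypto.randomUUID(),
    session_id: sessionId || '',
    content,
    sender: 'user',
    timestamp: new Date().toISOString(),
  };

  // Append via a functional update so rapid sends don't clobber each other.
  setMessages((prev) => [...prev, userMessage]);

  // Compute the payload before sending so the system prompt is prepended
  // exactly once and the just-typed user message is included.
  const messagesForLLM = buildLLMMessages([...messages, userMessage]);

  await llmChatService.sendMessage(selectedProvider, messagesForLLM);
};
```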

sourcery-ai bot (Contributor) left a comment

Hey @boston008 - I've reviewed your changes - here's some feedback:

  • Instead of prepending the system prompt content onto the first user message, consider sending it as a distinct system message with a system sender to preserve clear message semantics and avoid confusing the LLM or downstream logic.
  • Revert to using the functional state update form (setMessages(prev => [...prev, userMessage])) when appending messages to avoid potential stale state issues under rapid consecutive updates.
  • Remove or conditionally gate the console.debug statements to prevent internal logs from cluttering production consoles. (A sketch applying these suggestions follows.)
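
Applied together, the three suggestions might look like the fragment below. This reuses the hypothetical names from the earlier sketches, and the import.meta.env.DEV check assumes a Vite-style build, which is an assumption about the project.

```typescript
// 1. Send the prompt as a distinct system message, not a 'user' message.
const systemPromptMessage: Message = {
  id: SYSTEM_PROMPT_ID,
  session_id: sessionId || '',
  content: systemPrompt ?? '',
  sender: 'system',
  timestamp: new Date().toISOString(),
};

// 2. Append with the functional update form to avoid stale state.
setMessages((prev) => [...prev, userMessage]);

// 3. Gate debug logging so it stays out of production consoles
//    (import.meta.env.DEV assumes Vite; adjust to the project's setup).
if (import.meta.env.DEV) {
  console.debug('messagesForLLM', messagesForLLM);
}
```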

buildLLMMessages
@iFurySt (Member) commented Jul 17, 2025

@boston008 it'd be great if you could use the status to indicate where the PR stands. If it's still a WIP, you can switch it to draft instead of keeping it open; that way I'll know when you're ready for review. Thanks!

@boston008 boston008 marked this pull request as draft July 23, 2025 17:12
@boston008 boston008 marked this pull request as ready for review August 12, 2025 05:26
sourcery-ai bot (Contributor) left a comment

Hey @boston008 - I've reviewed your changes and they look great!

Prompt for AI Agents
Please address the comments from this code review:
## Individual Comments

### Comment 1
<location> `web/src/pages/chat/llm-chat-interface.tsx:402` </location>
<code_context>

```diff
+      session_id: sessionId || '',
       content: systemPrompt,
-      sender: 'system' as const,
+      sender: 'user',
       timestamp: new Date().toISOString(),
     };
```

</code_context>

<issue_to_address>
System prompt message is assigned sender 'user' instead of 'system'.

Using 'user' as the sender for a system prompt may lead to misclassification in logic or UI. Consider using 'system' to clearly distinguish system messages.
</issue_to_address>

