Enhance LLM selection UI and update translations for improved clarity #12112
base: develop
Conversation
@wasnertobias Your PR description needs attention before it can be reviewed.

Issues Found

How to Fix
This check validates that your PR description follows the PR template. A complete description helps reviewers understand your changes and speeds up the review process.
Pull request overview
This PR improves the LLM selection experience by aligning wording (“Cloud AI” / “On-premise AI”) and by displaying the user’s current selection inside the LLM selection modal.
Changes:
- Updates user settings translations to use “Cloud AI” / “On-premise AI” wording instead of “external/internal LLMs”.
- Extends the LLM selection modal service/component to accept and display the current selection (adds a “Current” badge).
- Updates unit tests for the modal and LLM usage settings component to cover passing/setting the current selection.
Reviewed changes
Copilot reviewed 12 out of 12 changed files in this pull request and generated 2 comments.
Summary per file
| File | Description |
|---|---|
| src/main/webapp/i18n/en/userSettings.json | Renames LLM usage status strings to Cloud/On-premise wording. |
| src/main/webapp/i18n/de/userSettings.json | Same as EN, German equivalents. |
| src/main/webapp/i18n/en/llmSelectionPopup.json | Adds translation for “Current” selection badge. |
| src/main/webapp/i18n/de/llmSelectionPopup.json | Adds German translation for “Current” badge. |
| src/main/webapp/app/logos/llm-selection-popup.service.ts | open() now emits the current selection to the modal via openModal$. |
| src/main/webapp/app/logos/llm-selection-popup.component.ts | Stores and uses currentSelection when opening the modal. |
| src/main/webapp/app/logos/llm-selection-popup.component.html | Shows “Current” badge for the active choice. |
| src/main/webapp/app/logos/llm-selection-popup.component.scss | Adds styling hook for the “Current” badge. |
| src/main/webapp/app/logos/llm-selection-popup.component.spec.ts | Adds tests ensuring currentSelection is set on open. |
| src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.ts | Passes current decision into modal via open(currentChoice). |
| src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.spec.ts | Updates/extends tests to validate passing current choice. |
| src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.html | Switches to new translation keys for Cloud/On-premise accepted text. |
Comments suppressed due to low confidence (1)
src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.ts:36
`choice` is a string union (`LLMSelectionChoice`) and will always be truthy when it resolves (including `'none'`). The current `if (choice)` guard is therefore ineffective, and the switch silently ignores `'none'`. Make the intent explicit (e.g., add a `case 'none': return;` and/or use an exhaustive switch) to match other call sites (e.g., request-feedback-button) and avoid future unhandled options.
```typescript
async openSelectionModal(): Promise<void> {
    const currentChoice = this.mapDecisionToChoice(this.currentLLMSelectionDecision());
    const choice = await this.llmModalService.open(currentChoice);
    if (choice) {
        // Map the Choice to the Enum
        let decision: LLMSelectionDecision;
        switch (choice) {
            case 'cloud':
```
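A minimal sketch of the explicit handling this comment suggests, assuming `LLMSelectionChoice` is `'cloud' | 'local' | 'no_ai' | 'none'` and reusing the enum members referenced elsewhere in this review; it is not the PR's actual code:

```typescript
// Sketch only: the union members and enum values are assumed from the review context above.
async openSelectionModal(): Promise<void> {
    const currentChoice = this.mapDecisionToChoice(this.currentLLMSelectionDecision());
    const choice = await this.llmModalService.open(currentChoice);
    switch (choice) {
        case 'cloud':
            this.updateLLMSelectionDecision(LLMSelectionDecision.CLOUD_AI);
            break;
        case 'local':
            this.updateLLMSelectionDecision(LLMSelectionDecision.LOCAL_AI);
            break;
        case 'no_ai':
            this.updateLLMSelectionDecision(LLMSelectionDecision.NO_AI);
            break;
        case 'none':
            // Explicitly do nothing, so the intent is visible and future options are not silently dropped.
            return;
    }
}
```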
```scss
.badge-current {
    background: var(--success);
}
```
Copilot AI, Feb 6, 2026
.badge-current only sets a background color, but it is not included in the shared badge styling selector (.badge, .badge-experimental, .badge-recommended). As a result, the “Current” badge will miss padding/border-radius/font-size/text color and look inconsistent. Include .badge-current in the shared selector or duplicate the shared badge styles for it.
```typescript
it('should handle cloud choice', async () => {
    (llmModalService.open as jest.Mock).mockResolvedValue('cloud');
    const updateSpy = jest.spyOn(component, 'updateLLMSelectionDecision');

    await component.openSelectionModal();

    expect(llmModalService.open).toHaveBeenCalledOnce();
    expect(llmModalService.open).toHaveBeenCalledWith('cloud');
    expect(updateSpy).toHaveBeenCalledWith(LLMSelectionDecision.CLOUD_AI);
```
Copilot AI, Feb 6, 2026
These openSelectionModal tests call component.openSelectionModal() without triggering ngOnInit()/fixture.detectChanges(). With the new implementation, currentLLMSelectionDecision() is still undefined, so llmModalService.open(...) will be called with undefined (not 'cloud') unless the component is initialized first. Initialize the component state (call component.ngOnInit() or fixture.detectChanges()) before asserting the argument, and adjust the repeated expectations below accordingly.
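A sketch of the initialize-first variant this comment describes, assuming the spec's existing setup stores a decision that `mapDecisionToChoice` maps to `'cloud'` during `ngOnInit()`; the names are taken from the snippets above, not from the actual spec file:

```typescript
it('should handle cloud choice', async () => {
    (llmModalService.open as jest.Mock).mockResolvedValue('cloud');
    const updateSpy = jest.spyOn(component, 'updateLLMSelectionDecision');

    // Assumption: ngOnInit() populates currentLLMSelectionDecision from the test setup,
    // so the modal receives the mapped current choice instead of undefined.
    component.ngOnInit();

    await component.openSelectionModal();

    expect(llmModalService.open).toHaveBeenCalledOnce();
    expect(llmModalService.open).toHaveBeenCalledWith('cloud');
    expect(updateSpy).toHaveBeenCalledWith(LLMSelectionDecision.CLOUD_AI);
});
```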
@wasnertobias Test coverage could not be fully measured because some tests failed. Please check the workflow logs for details.
Walkthrough

The PR resolves inconsistent naming and missing visual feedback in the LLM selection flow. Translation keys are renamed from "external/internal" to "cloud/on-premise," the modal now receives and displays the current selection as a visual badge, and supporting tests and service signatures are updated accordingly.

Changes
Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant Settings as LLM Usage<br/>Settings Component
    participant Service as LLM Selection<br/>Modal Service
    participant Modal as LLM Selection<br/>Modal Component

    User->>Settings: Click "Change LLM"
    Settings->>Settings: Map current LLMSelectionDecision<br/>to choice string
    Settings->>Service: open(currentChoice)
    Service->>Service: Emit currentChoice<br/>via openModalSubject
    Service->>Modal: openModal$ stream
    Modal->>Modal: Set currentSelection
    Modal->>Modal: Render badge on<br/>matching option
    Modal->>User: Display modal with<br/>current selection highlighted
    User->>Modal: Select new option
    Modal->>Service: emitChoice(selected)
    Service-->>Settings: Return LLMSelectionChoice
    Settings->>Settings: Map choice back to<br/>LLMSelectionDecision
    Settings->>Settings: updateLLMSelectionDecision()
```
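As a rough sketch of the service hand-off shown in the diagram; the subject names, types, and return mechanism here are assumptions rather than the PR's actual implementation:

```typescript
// Hypothetical sketch of the modal service flow illustrated above; real field names may differ.
import { Injectable } from '@angular/core';
import { Subject, firstValueFrom } from 'rxjs';

type LLMSelectionChoice = 'cloud' | 'local' | 'no_ai' | 'none';

@Injectable({ providedIn: 'root' })
export class LlmSelectionPopupService {
    // Emits the caller's current choice so the modal can highlight it with the "Current" badge.
    private readonly openModalSubject = new Subject<LLMSelectionChoice | undefined>();
    readonly openModal$ = this.openModalSubject.asObservable();

    // Resolves once the modal component reports the user's pick.
    private readonly choiceSubject = new Subject<LLMSelectionChoice>();

    open(currentChoice?: LLMSelectionChoice): Promise<LLMSelectionChoice> {
        this.openModalSubject.next(currentChoice);
        return firstValueFrom(this.choiceSubject);
    }

    emitChoice(choice: LLMSelectionChoice): void {
        this.choiceSubject.next(choice);
    }
}
```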
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes

🚥 Pre-merge checks: ✅ Passed checks (5 passed)
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In
`@src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.spec.ts`:
- Line 106: The tests call component.openSelectionModal() before initializing
the component, so currentLLMSelectionDecision stays undefined and
llmModalService.open receives undefined; fix by either calling
component.ngOnInit() (or fixture.detectChanges()) before openSelectionModal() in
the tests at the failing assertions (tests around openSelectionModal /
mapDecisionToChoice) or change the expected assertion to the default
(expect(llmModalService.open).toHaveBeenCalledWith(undefined)); update the six
assertions at the indicated locations (lines asserting open with 'cloud') to
follow the same pattern as the later tests that call ngOnInit.
In `@src/main/webapp/app/logos/llm-selection-popup.component.scss`:
- Around line 251-253: The `.badge-current` class is missing the shared badge
styles; update the shared selector that currently targets `.badge,
.badge-experimental, .badge-recommended` to also include `.badge-current` so the
"Current" badge receives the common padding, border-radius, font-size,
font-weight, and color rules (ensure you modify the selector where the shared
styles are defined, not the single-line `.badge-current { background:
var(--success); }` rule).
🧹 Nitpick comments (2)
src/main/webapp/app/logos/llm-selection-popup.component.ts (1)
27-27: Consider using an Angular Signal for `currentSelection`.

The coding guidelines mandate Angular Signals for component state. The new `currentSelection` field is a plain class field. While the existing fields (`isVisible`, `isOnPremiseEnabled`) also use the legacy pattern, new additions are a good opportunity to start migrating. For example: `currentSelection = signal<LLMSelectionChoice | undefined>(undefined);`

This would also eliminate the need for manual `cdr.detectChanges()` on line 36, since signals automatically participate in change detection. That said, a full migration of this file is out of scope for this PR.
As per coding guidelines, "Use Angular Signals for component state and obtain dependencies via inject(); legacy decorator-based state patterns and constructor-based dependency injection are prohibited."
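A brief illustration of what the signal-based field could look like, assuming the `openModal$` stream described in this review; the import paths and symbol names are guesses, not the PR's code:

```typescript
// Illustrative sketch only. LlmSelectionPopupService and LLMSelectionChoice are assumed
// to be exported from this PR's llm-selection-popup service file; names may differ.
import { Component, inject, signal } from '@angular/core';
import { takeUntilDestroyed } from '@angular/core/rxjs-interop';
import { LlmSelectionPopupService, LLMSelectionChoice } from 'app/logos/llm-selection-popup.service';

@Component({
    selector: 'jhi-llm-selection-popup',
    template: '',
})
export class LlmSelectionPopupComponent {
    private readonly popupService = inject(LlmSelectionPopupService);

    // Signal-based state: change detection picks this up without manual cdr.detectChanges().
    currentSelection = signal<LLMSelectionChoice | undefined>(undefined);

    constructor() {
        this.popupService.openModal$
            .pipe(takeUntilDestroyed())
            .subscribe((choice) => this.currentSelection.set(choice));
    }
}
```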
src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.ts (1)
32-48: Consider simplifying the reverse mapping.

The `decision` variable and per-case `updateLLMSelectionDecision` calls are repetitive. You could eliminate the intermediate variable:

♻️ Optional simplification

```diff
 if (choice) {
-    // Map the Choice to the Enum
-    let decision: LLMSelectionDecision;
-    switch (choice) {
-        case 'cloud':
-            decision = LLMSelectionDecision.CLOUD_AI;
-            this.updateLLMSelectionDecision(decision);
-            break;
-        case 'local':
-            decision = LLMSelectionDecision.LOCAL_AI;
-            this.updateLLMSelectionDecision(decision);
-            break;
-        case 'no_ai':
-            decision = LLMSelectionDecision.NO_AI;
-            this.updateLLMSelectionDecision(decision);
-            break;
-    }
+    const choiceToDecision: Partial<Record<LLMSelectionChoice, LLMSelectionDecision>> = {
+        cloud: LLMSelectionDecision.CLOUD_AI,
+        local: LLMSelectionDecision.LOCAL_AI,
+        no_ai: LLMSelectionDecision.NO_AI,
+    };
+    const decision = choiceToDecision[choice];
+    if (decision !== undefined) {
+        this.updateLLMSelectionDecision(decision);
+    }
 }
```

This also pairs well with `mapDecisionToChoice`: both directions share a single source of truth if you extract the map to a module-level constant.
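One possible shape for that shared module-level constant; the helper names here are hypothetical, and the choice/decision types are assumed from this PR:

```typescript
// Hypothetical module-level lookup shared by both mapping directions;
// LLMSelectionChoice and LLMSelectionDecision are assumed from this PR's files.
const CHOICE_TO_DECISION: Partial<Record<LLMSelectionChoice, LLMSelectionDecision>> = {
    cloud: LLMSelectionDecision.CLOUD_AI,
    local: LLMSelectionDecision.LOCAL_AI,
    no_ai: LLMSelectionDecision.NO_AI,
};

function mapChoiceToDecision(choice: LLMSelectionChoice): LLMSelectionDecision | undefined {
    return CHOICE_TO_DECISION[choice];
}

function mapDecisionToChoice(decision: LLMSelectionDecision | undefined): LLMSelectionChoice | undefined {
    if (decision === undefined) {
        return undefined;
    }
    // Reverse lookup over the same map keeps both directions in sync.
    const entry = Object.entries(CHOICE_TO_DECISION).find(([, value]) => value === decision);
    return entry?.[0] as LLMSelectionChoice | undefined;
}
```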
```typescript
await component.openSelectionModal();

expect(llmModalService.open).toHaveBeenCalledOnce();
expect(llmModalService.open).toHaveBeenCalledWith('cloud');
```
Incorrect assertions — ngOnInit is never called, so the modal receives undefined.
In these tests, neither fixture.detectChanges() nor component.ngOnInit() is called before openSelectionModal(). The currentLLMSelectionDecision signal retains its initial undefined value, so mapDecisionToChoice(undefined) returns undefined, and llmModalService.open is invoked with undefined — not 'cloud'.
The new tests at lines 160–197 correctly call component.ngOnInit() first and have accurate assertions. These six lines should follow the same pattern, or simply assert the default:
🐛 Fix the assertions
```diff
- expect(llmModalService.open).toHaveBeenCalledWith('cloud');
+ expect(llmModalService.open).toHaveBeenCalledWith(undefined);
```

Apply at lines 106, 116, 126, 136, 146, and 156. Alternatively, add `component.ngOnInit();` before `openSelectionModal()` in each test and keep `'cloud'`.
Also applies to: 116-116, 126-126, 136-136, 146-146, 156-156
🤖 Prompt for AI Agents
In
`@src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.spec.ts`
at line 106, The tests call component.openSelectionModal() before initializing
the component, so currentLLMSelectionDecision stays undefined and
llmModalService.open receives undefined; fix by either calling
component.ngOnInit() (or fixture.detectChanges()) before openSelectionModal() in
the tests at the failing assertions (tests around openSelectionModal /
mapDecisionToChoice) or change the expected assertion to the default
(expect(llmModalService.open).toHaveBeenCalledWith(undefined)); update the six
assertions at the indicated locations (lines asserting open with 'cloud') to
follow the same pattern as the later tests that call ngOnInit.
```scss
.badge-current {
    background: var(--success);
}
```
Bug: .badge-current is missing shared badge styles (padding, border-radius, font-size, color, etc.).
The shared badge properties at lines 228–237 are defined on .badge, .badge-experimental, .badge-recommended but .badge-current is not included. As a result, the "Current" badge will render without padding, border-radius, white text color, font-size, or font-weight.
🐛 Proposed fix: add `.badge-current` to the shared selector
```diff
 .badge,
 .badge-experimental,
-.badge-recommended {
+.badge-recommended,
+.badge-current {
     padding: 4px 12px;
     border-radius: 20px;
     font-size: 13px;
     font-weight: 600;
     color: var(--white);
     white-space: nowrap;
 }
```

🤖 Prompt for AI Agents
}🤖 Prompt for AI Agents
In `@src/main/webapp/app/logos/llm-selection-popup.component.scss` around lines
251 - 253, The `.badge-current` class is missing the shared badge styles; update
the shared selector that currently targets `.badge, .badge-experimental,
.badge-recommended` to also include `.badge-current` so the "Current" badge
receives the common padding, border-radius, font-size, font-weight, and color
rules (ensure you modify the selector where the shared styles are defined, not
the single-line `.badge-current { background: var(--success); }` rule).
Summary
Fixes #12111
Checklist
General
Server
Client
I added `authorities` to all new routes and checked the course groups for displaying navigation elements (links, buttons).

Changes affecting Programming Exercises
Motivation and Context
Description
Steps for Testing
Prerequisites:
Exam Mode Testing
Prerequisites:
Testserver States
You can manage test servers using Helios. Check environment statuses in the environment list. To deploy to a test server, go to the CI/CD page, find your PR or branch, and trigger the deployment.
Review Progress
Performance Review
Code Review
Manual Tests
Exam Mode Test
Performance Tests
Test Coverage
Warning: Client tests failed. Coverage could not be fully measured. Please check the workflow logs.
Last updated: 2026-02-06 15:46:11 UTC
Screenshots
Summary by CodeRabbit
Release Notes
New Features
Translations