
Conversation


@wasnertobias wasnertobias commented Feb 6, 2026

Summary

Fixes #12111

Checklist

General

Server

  • Important: I implemented the changes with very good performance and avoided unnecessary or overly complex database calls.
  • I strictly followed the principle of data economy for all database calls.
  • I strictly followed the server coding and design guidelines and the REST API guidelines.
  • I added multiple integration tests (Spring) related to the features (with a high test coverage).
  • I added pre-authorization annotations according to the guidelines and checked the course groups for all new REST Calls (security).
  • I documented the Java code using JavaDoc style.

Client

  • Important: I implemented the changes with very good performance, avoided unnecessary REST calls, and made sure the UI stays responsive even with large data (e.g. using paging).
  • I strictly followed the principle of data economy for all client-server REST calls.
  • I strictly followed the client coding guidelines.
  • I strictly followed the AET UI-UX guidelines.
  • Following the theming guidelines, I specified colors only in the theming variable files and checked that the changes look consistent in both the light and the dark theme.
  • I added multiple integration tests (Jest) related to the features (with a high test coverage), while following the test guidelines.
  • I added authorities to all new routes and checked the course groups for displaying navigation elements (links, buttons).
  • I documented the TypeScript code using JSDoc style.
  • I added multiple screenshots/screencasts of my UI changes.
  • I translated all newly inserted strings into English and German.

Changes affecting Programming Exercises

  • High priority: I tested all changes and their related features with all corresponding user types on a test server configured with the integrated lifecycle setup (LocalVC and LocalCI).
  • I tested all changes and their related features with all corresponding user types on a test server configured with LocalVC and Jenkins.

Motivation and Context

Description

Steps for Testing

Prerequisites:

  • 1 Instructor
  • 2 Students
  • 1 Programming Exercise with Complaints enabled
  1. Log in to Artemis
  2. Navigate to Course Administration
  3. ...

Exam Mode Testing

Prerequisites:

  • 1 Instructor
  • 2 Students
  • 1 Exam with a Programming Exercise
  1. Log in to Artemis
  2. Participate in the exam as a student
  3. Make sure that the UI of the programming exercise in the exam mode stays unchanged. You can use the exam mode documentation as reference.
  4. ...

Testserver States

You can manage test servers using Helios. Check environment statuses in the environment list. To deploy to a test server, go to the CI/CD page, find your PR or branch, and trigger the deployment.

Review Progress

Performance Review

  • I (as a reviewer) confirm that the client changes (in particular related to REST calls and UI responsiveness) are implemented with very good performance, even for very large courses with more than 2000 students.
  • I (as a reviewer) confirm that the server changes (in particular related to database calls) are implemented with very good performance, even for very large courses with more than 2000 students.

Code Review

  • Code Review 1
  • Code Review 2

Manual Tests

  • Test 1
  • Test 2

Exam Mode Test

  • Test 1
  • Test 2

Performance Tests

  • Test 1
  • Test 2

Test Coverage

Warning: Client tests failed. Coverage could not be fully measured. Please check the workflow logs.

Last updated: 2026-02-06 15:46:11 UTC

Screenshots

Summary by CodeRabbit

Release Notes

  • New Features

    • Added a visual indicator (badge) showing your currently selected AI option within the selection interface
    • AI selection modal now displays your existing choice when opened, making it easier to review or change your preference
  • Translations

    • Updated terminology for clarity: "External" now labeled as "Cloud AI" and "Internal" as "On-Premise AI"
    • Added "Current Selection" label support in multiple languages

@wasnertobias wasnertobias self-assigned this Feb 6, 2026
@wasnertobias wasnertobias requested a review from krusche as a code owner February 6, 2026 15:40
Copilot AI review requested due to automatic review settings February 6, 2026 15:40
@github-project-automation github-project-automation bot moved this to Work In Progress in Artemis Development Feb 6, 2026
@github-actions github-actions bot added client Pull requests that update TypeScript code. (Added Automatically!) core Pull requests that affect the corresponding module labels Feb 6, 2026

github-actions bot commented Feb 6, 2026

@wasnertobias Your PR description needs attention before it can be reviewed:

Issues Found

  1. No checkboxes are checked in the PR description
  2. Motivation/Context section is missing or needs improvement
  3. Description section is missing or needs improvement
  4. Screenshots are missing but this PR contains visual/UI changes

How to Fix

  • Check the boxes that apply to your changes in the Checklist section
  • Add a brief motivation/context explaining why this change is needed.
  • Add a real description in the Description section detailing what changed.
  • Include before/after images.

This check validates that your PR description follows the PR template. A complete description helps reviewers understand your changes and speeds up the review process.

Note: This description validation is an experimental feature. If you observe false positives, please send a DM with a link to the wrong comment to Patrick Bassner on Slack. Thank you!


Copilot AI left a comment


Pull request overview

This PR improves the LLM selection experience by aligning wording (“Cloud AI” / “On-premise AI”) and by displaying the user’s current selection inside the LLM selection modal.

Changes:

  • Updates user settings translations to use “Cloud AI” / “On-premise AI” wording instead of “external/internal LLMs”.
  • Extends the LLM selection modal service/component to accept and display the current selection (adds a “Current” badge).
  • Updates unit tests for the modal and LLM usage settings component to cover passing/setting the current selection.

Reviewed changes

Copilot reviewed 12 out of 12 changed files in this pull request and generated 2 comments.

Show a summary per file
  • src/main/webapp/i18n/en/userSettings.json: Renames LLM usage status strings to Cloud/On-premise wording.
  • src/main/webapp/i18n/de/userSettings.json: Same as EN, German equivalents.
  • src/main/webapp/i18n/en/llmSelectionPopup.json: Adds translation for “Current” selection badge.
  • src/main/webapp/i18n/de/llmSelectionPopup.json: Adds German translation for “Current” badge.
  • src/main/webapp/app/logos/llm-selection-popup.service.ts: open() now emits the current selection to the modal via openModal$.
  • src/main/webapp/app/logos/llm-selection-popup.component.ts: Stores and uses currentSelection when opening the modal.
  • src/main/webapp/app/logos/llm-selection-popup.component.html: Shows “Current” badge for the active choice.
  • src/main/webapp/app/logos/llm-selection-popup.component.scss: Adds styling hook for the “Current” badge.
  • src/main/webapp/app/logos/llm-selection-popup.component.spec.ts: Adds tests ensuring currentSelection is set on open.
  • src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.ts: Passes current decision into modal via open(currentChoice).
  • src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.spec.ts: Updates/extends tests to validate passing the current choice.
  • src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.html: Switches to new translation keys for Cloud/On-premise accepted text.
Comments suppressed due to low confidence (1)

src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.ts:36

  • choice is a string union (LLMSelectionChoice) and will always be truthy when it resolves (including 'none'). The current if (choice) guard is therefore ineffective, and the switch silently ignores 'none'. Make the intent explicit (e.g., add a case 'none': return; and/or use an exhaustive switch) to match other call sites (e.g., request-feedback-button) and avoid future unhandled options.
    async openSelectionModal(): Promise<void> {
        const currentChoice = this.mapDecisionToChoice(this.currentLLMSelectionDecision());
        const choice = await this.llmModalService.open(currentChoice);

        if (choice) {
            // Map the Choice to the Enum
            let decision: LLMSelectionDecision;
            switch (choice) {
                case 'cloud':
                    decision = LLMSelectionDecision.CLOUD_AI;
                    this.updateLLMSelectionDecision(decision);
                    break;
                // ... analogous cases for 'local' and 'no_ai'
            }
        }
    }
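Copilot's suggestion can be sketched as a standalone exhaustive switch. The union members and enum names below are stand-ins inferred from the quoted code ('none' is added per the comment), not the PR's actual declarations:

```typescript
// Sketch only: local stand-ins for the app's types.
type LLMSelectionChoice = 'cloud' | 'local' | 'no_ai' | 'none';

enum LLMSelectionDecision {
    CLOUD_AI = 'CLOUD_AI',
    LOCAL_AI = 'LOCAL_AI',
    NO_AI = 'NO_AI',
}

function mapChoiceToDecision(choice: LLMSelectionChoice): LLMSelectionDecision | undefined {
    switch (choice) {
        case 'cloud':
            return LLMSelectionDecision.CLOUD_AI;
        case 'local':
            return LLMSelectionDecision.LOCAL_AI;
        case 'no_ai':
            return LLMSelectionDecision.NO_AI;
        case 'none':
            // Explicit: 'none' is handled deliberately, not silently ignored.
            return undefined;
        default: {
            // Exhaustiveness check: this line fails to compile if a new
            // union member is added without a corresponding case.
            const unhandled: never = choice;
            return unhandled;
        }
    }
}
```

Because every union member is handled and the `default` arm assigns to `never`, adding a future option forces a compile error instead of an unhandled fall-through.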


Comment on lines +251 to +253
.badge-current {
background: var(--success);
}

Copilot AI Feb 6, 2026


.badge-current only sets a background color, but it is not included in the shared badge styling selector (.badge, .badge-experimental, .badge-recommended). As a result, the “Current” badge will miss padding/border-radius/font-size/text color and look inconsistent. Include .badge-current in the shared selector or duplicate the shared badge styles for it.

Comment on lines 100 to 107
it('should handle cloud choice', async () => {
(llmModalService.open as jest.Mock).mockResolvedValue('cloud');
const updateSpy = jest.spyOn(component, 'updateLLMSelectionDecision');

await component.openSelectionModal();

expect(llmModalService.open).toHaveBeenCalledOnce();
expect(llmModalService.open).toHaveBeenCalledWith('cloud');
expect(updateSpy).toHaveBeenCalledWith(LLMSelectionDecision.CLOUD_AI);

Copilot AI Feb 6, 2026


These openSelectionModal tests call component.openSelectionModal() without triggering ngOnInit()/fixture.detectChanges(). With the new implementation, currentLLMSelectionDecision() is still undefined, so llmModalService.open(...) will be called with undefined (not 'cloud') unless the component is initialized first. Initialize the component state (call component.ngOnInit() or fixture.detectChanges()) before asserting the argument, and adjust the repeated expectations below accordingly.


github-actions bot commented Feb 6, 2026

@wasnertobias Test coverage could not be fully measured because some tests failed. Please check the workflow logs for details.


coderabbitai bot commented Feb 6, 2026

Walkthrough

The PR resolves inconsistent naming and missing visual feedback in the LLM selection flow. Translation keys are renamed from "external/internal" to "cloud/on-premise," the modal now receives and displays the current selection as a visual badge, and supporting tests and service signatures are updated accordingly.

Changes

  • LLM Usage Settings Component (src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.html, .component.ts, .component.spec.ts): Updated to use the renamed translation keys (cloudAccepted, onPremiseAccepted), added logic to map the current LLMSelectionDecision to a choice string before passing it to the modal service, and enhanced tests to verify the modal opens with the correct current selection parameter.
  • LLM Selection Modal Component (src/main/webapp/app/logos/llm-selection-popup.component.ts, .component.html, .component.scss, .component.spec.ts): Added a currentSelection field and subscription logic, updated the template to conditionally render a "current selection" badge on AI options, added the CSS class .badge-current with the success color, and updated tests to handle the LLMSelectionChoice type and verify currentSelection behavior.
  • LLM Selection Modal Service (src/main/webapp/app/logos/llm-selection-popup.service.ts): Modified the open() method signature to accept an optional currentSelection parameter and changed the openModalSubject type from Subject to Subject<LLMSelectionChoice | undefined> to emit the current selection value.
  • Translation Files (src/main/webapp/i18n/en/llmSelectionPopup.json, i18n/de/llmSelectionPopup.json, i18n/en/userSettings.json, i18n/de/userSettings.json): Added "currentSelection" label translations in the llmSelectionPopup files; refactored userSettings keys by removing externalAccepted/Declined and internalAccepted/Declined, replacing them with cloudAccepted and onPremiseAccepted.

Sequence Diagram

sequenceDiagram
    participant User
    participant Settings as LLM Usage<br/>Settings Component
    participant Service as LLM Selection<br/>Modal Service
    participant Modal as LLM Selection<br/>Modal Component
    
    User->>Settings: Click "Change LLM"
    Settings->>Settings: Map current LLMSelectionDecision<br/>to choice string
    Settings->>Service: open(currentChoice)
    Service->>Service: Emit currentChoice<br/>via openModalSubject
    Service->>Modal: openModal$ stream
    Modal->>Modal: Set currentSelection
    Modal->>Modal: Render badge on<br/>matching option
    Modal->>User: Display modal with<br/>current selection highlighted
    User->>Modal: Select new option
    Modal->>Service: emitChoice(selected)
    Service-->>Settings: Return LLMSelectionChoice
    Settings->>Settings: Map choice back to<br/>LLMSelectionDecision
    Settings->>Settings: updateLLMSelectionDecision()
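The open/emit round trip in the diagram above can be sketched without Angular or RxJS. The method names (open, emitChoice) mirror the walkthrough, but the class name and the Promise-based wiring below are assumptions for illustration only:

```typescript
// Hedged sketch: the settings component awaits open(), the modal component
// receives the current selection and later reports the user's pick back.
type LLMSelectionChoice = 'cloud' | 'local' | 'no_ai';

class LlmSelectionPopupServiceSketch {
    private onOpenListener?: (current: LLMSelectionChoice | undefined) => void;
    private resolveChoice?: (choice: LLMSelectionChoice | undefined) => void;

    /** The modal component registers here to learn the caller's current selection. */
    onOpen(listener: (current: LLMSelectionChoice | undefined) => void): void {
        this.onOpenListener = listener;
    }

    /** Settings side: open the modal, forwarding the current selection. */
    open(currentSelection?: LLMSelectionChoice): Promise<LLMSelectionChoice | undefined> {
        this.onOpenListener?.(currentSelection);
        return new Promise((resolve) => (this.resolveChoice = resolve));
    }

    /** Modal side: report the user's pick back to the awaiting caller. */
    emitChoice(choice: LLMSelectionChoice | undefined): void {
        this.resolveChoice?.(choice);
    }
}
```

In the real service this round trip runs over an openModal$ Subject; the modal would render its "Current" badge from the value it receives on open.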

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

🚥 Pre-merge checks (5 passed)

  • Description Check: ✅ Passed. Check skipped because CodeRabbit’s high-level summary is enabled.
  • Title Check: ✅ Passed. The title accurately describes the main changes: enhancing the LLM selection UI and updating translations for consistency and clarity, which directly addresses the PR objectives.
  • Linked Issues Check: ✅ Passed. All coding requirements from issue #12111 are met: consistent naming (external→cloud, internal→on-premise) implemented across settings and modal, and the modal now displays the currently selected option via the currentSelection field.
  • Out of Scope Changes Check: ✅ Passed. All changes are directly related to the linked issue objectives: renaming translation keys for consistency, adding the currentSelection UI indicator, and updating component logic to pass and display the current selection.
  • Docstring Coverage: ✅ Passed. No functions found in the changed files to evaluate docstring coverage; skipping the check.





@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🤖 Fix all issues with AI agents
In
`@src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.spec.ts`:
- Line 106: The tests call component.openSelectionModal() before initializing
the component, so currentLLMSelectionDecision stays undefined and
llmModalService.open receives undefined; fix by either calling
component.ngOnInit() (or fixture.detectChanges()) before openSelectionModal() in
the tests at the failing assertions (tests around openSelectionModal /
mapDecisionToChoice) or change the expected assertion to the default
(expect(llmModalService.open).toHaveBeenCalledWith(undefined)); update the six
assertions at the indicated locations (lines asserting open with 'cloud') to
follow the same pattern as the later tests that call ngOnInit.

In `@src/main/webapp/app/logos/llm-selection-popup.component.scss`:
- Around line 251-253: The `.badge-current` class is missing the shared badge
styles; update the shared selector that currently targets `.badge,
.badge-experimental, .badge-recommended` to also include `.badge-current` so the
"Current" badge receives the common padding, border-radius, font-size,
font-weight, and color rules (ensure you modify the selector where the shared
styles are defined, not the single-line `.badge-current { background:
var(--success); }` rule).
🧹 Nitpick comments (2)
src/main/webapp/app/logos/llm-selection-popup.component.ts (1)

27-27: Consider using an Angular Signal for currentSelection.

The coding guidelines mandate Angular Signals for component state. The new currentSelection field is a plain class field. While the existing fields (isVisible, isOnPremiseEnabled) also use the legacy pattern, new additions are a good opportunity to start migrating. For example:

currentSelection = signal<LLMSelectionChoice | undefined>(undefined);

This would also eliminate the need for manual cdr.detectChanges() on line 36 since signals automatically participate in change detection.

That said, a full migration of this file is out of scope for this PR.

As per coding guidelines, "Use Angular Signals for component state and obtain dependencies via inject(); legacy decorator-based state patterns and constructor-based dependency injection are prohibited."

src/main/webapp/app/core/user/settings/llm-usage-settings/llm-usage-settings.component.ts (1)

32-48: Consider simplifying the reverse mapping.

The decision variable and per-case updateLLMSelectionDecision calls are repetitive. You could eliminate the intermediate variable:

♻️ Optional simplification
     if (choice) {
-        // Map the Choice to the Enum
-        let decision: LLMSelectionDecision;
-        switch (choice) {
-            case 'cloud':
-                decision = LLMSelectionDecision.CLOUD_AI;
-                this.updateLLMSelectionDecision(decision);
-                break;
-            case 'local':
-                decision = LLMSelectionDecision.LOCAL_AI;
-                this.updateLLMSelectionDecision(decision);
-                break;
-            case 'no_ai':
-                decision = LLMSelectionDecision.NO_AI;
-                this.updateLLMSelectionDecision(decision);
-                break;
-        }
+        const choiceToDecision: Partial<Record<LLMSelectionChoice, LLMSelectionDecision>> = {
+            cloud: LLMSelectionDecision.CLOUD_AI,
+            local: LLMSelectionDecision.LOCAL_AI,
+            no_ai: LLMSelectionDecision.NO_AI,
+        };
+        const decision = choiceToDecision[choice];
+        if (decision !== undefined) {
+            this.updateLLMSelectionDecision(decision);
+        }
     }

This also pairs well with mapDecisionToChoice — both directions share a single source of truth if you extract the map to a module-level constant.
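That extraction could look like the following self-contained sketch; the constant name and helper signatures are illustrative, not the PR's actual identifiers:

```typescript
// Hedged sketch: one module-level map drives both mapping directions.
const DECISION_BY_CHOICE = {
    cloud: 'CLOUD_AI',
    local: 'LOCAL_AI',
    no_ai: 'NO_AI',
} as const;

type Choice = keyof typeof DECISION_BY_CHOICE; // 'cloud' | 'local' | 'no_ai'
type Decision = (typeof DECISION_BY_CHOICE)[Choice]; // 'CLOUD_AI' | 'LOCAL_AI' | 'NO_AI'

// Forward direction: modal choice -> persisted decision.
function choiceToDecision(choice: Choice): Decision {
    return DECISION_BY_CHOICE[choice];
}

// Reverse direction: persisted decision -> modal choice, derived from the same map.
function decisionToChoice(decision: Decision): Choice | undefined {
    return (Object.keys(DECISION_BY_CHOICE) as Choice[]).find((c) => DECISION_BY_CHOICE[c] === decision);
}
```

Deriving both directions from one `as const` object means adding a new option touches a single place, and the Choice/Decision types stay in sync automatically.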

await component.openSelectionModal();

expect(llmModalService.open).toHaveBeenCalledOnce();
expect(llmModalService.open).toHaveBeenCalledWith('cloud');


⚠️ Potential issue | 🔴 Critical

Incorrect assertions — ngOnInit is never called, so the modal receives undefined.

In these tests, neither fixture.detectChanges() nor component.ngOnInit() is called before openSelectionModal(). The currentLLMSelectionDecision signal retains its initial undefined value, so mapDecisionToChoice(undefined) returns undefined, and llmModalService.open is invoked with undefined — not 'cloud'.

The new tests at lines 160–197 correctly call component.ngOnInit() first and have accurate assertions. These six lines should follow the same pattern, or simply assert the default:

🐛 Fix the assertions
-            expect(llmModalService.open).toHaveBeenCalledWith('cloud');
+            expect(llmModalService.open).toHaveBeenCalledWith(undefined);

Apply at lines 106, 116, 126, 136, 146, and 156. Alternatively, add component.ngOnInit(); before openSelectionModal() in each test and keep 'cloud'.

Also applies to: 116-116, 126-126, 136-136, 146-146, 156-156


Comment on lines +251 to +253
.badge-current {
background: var(--success);
}


⚠️ Potential issue | 🟠 Major

Bug: .badge-current is missing shared badge styles (padding, border-radius, font-size, color, etc.).

The shared badge properties at lines 228–237 are defined on .badge, .badge-experimental, .badge-recommended but .badge-current is not included. As a result, the "Current" badge will render without padding, border-radius, white text color, font-size, or font-weight.

🐛 Proposed fix: add `.badge-current` to the shared selector
 .badge,
 .badge-experimental,
-.badge-recommended {
+.badge-recommended,
+.badge-current {
     padding: 4px 12px;
     border-radius: 20px;
     font-size: 13px;
     font-weight: 600;
     color: var(--white);
     white-space: nowrap;
 }

@github-project-automation github-project-automation bot moved this from Work In Progress to Ready For Review in Artemis Development Feb 6, 2026


Projects

Status: Ready For Review

Development

Successfully merging this pull request may close these issues.

LLM usage selection - inconsistent naming and UI improvements

1 participant