
feat(lightspeed): integrate lightspeed backend #2501

Merged

Conversation

Contributor

@karthikjeeyar karthikjeeyar commented Nov 7, 2024

Fixes

https://issues.redhat.com/browse/RHDHPAI-26
https://issues.redhat.com/browse/RHDHPAI-38

Description

This PR replaces the Backstage proxy backend with the lightspeed backend and supports the following features:

  1. Chat history
  2. Multiple Chats
  3. Create new Chat conversation
  4. Delete chat conversation
  5. Search conversation history
  6. Auto-scroll to the bottom while a message is streaming.

Screenshots

image

Search results:

searchResult

welcome prompt:

image

Video:

Lightspeed_demo.mov

How to test:

  1. Follow the lightspeed backend plugin README to integrate the lightspeed backend into your Backstage instance.

Unit tests:

   PASS  src/hooks/__tests__/useDeleteConversation.test.tsx
  useDeleteConversation
    ✓ calls deleteConversation API and updates cache correctly (20 ms)
    ✓ invalidates cache on success if invalidateCache is true (8 ms)
    ✓ reverts cache on error (4 ms)
   PASS  src/hooks/__tests__/useConversationMessages.test.tsx
  useFetchConversations
    ✓ should return conversations data when fetch is successful (73 ms)
    ✓ should handle loading state (5 ms)
    ✓ should handle errors (56 ms)
  useConversationMesages
    ✓ should initialize conversations with the given conversationId (5 ms)
    ✓ should update current conversation when conversationId changes (6 ms)
    ✓ should call onComplete when streaming is done (7 ms)
    ✓ should handle the invalid json error (41 ms)
    ✓ should handle input prompt and update conversations with user and bot messages (4 ms)
    ✓ should update last bot message in conversation after API response (8 ms)
    ✓ should surface API error if last bot message failed (76 ms)
    ✓ should have scrollToBottomRef defined (55 ms)
 PASS  src/utils/__tests__/lightspeed-chatbot-utils.test.ts
  getTimestampVariablesString
    ✓ should add a leading zero if the number is less than 10 (2 ms)
    ✓ should return the number as a string if it is 10 or greater
  getTimestamp
    ✓ should format a given timestamp correctly
    ✓ should handle single-digit day, month, hour, minute, and second (1 ms)
    ✓ should handle end-of-year timestamps correctly
    ✓ should handle the beginning of the epoch (0 timestamp) (1 ms)
    ✓ should handle timestamps with daylight saving time shifts
  splitJsonStrings
    ✓ should return the entire string in an array if no `}{` pattern is found (1 ms)
    ✓ should split a concatenated JSON string into individual JSON strings
    ✓ should handle a JSON string with multiple concatenated objects correctly (1 ms)
    ✓ should handle a JSON string with edge case of empty objects
    ✓ should handle a JSON string with nested braces correctly
  createMessage
    ✓ should create a user message with default values (1 ms)
    ✓ should create a bot message with custom values
  createUserMessage
    ✓ should create a user message with default name if name is not provided
    ✓ should create a user message with provided name
  createBotMessage
    ✓ should create a bot message with provided properties (1 ms)
  getMessageData
    ✓ should return content and timestamp from message.kwargs
    ✓ should handle missing kwargs properties gracefully (33 ms)
  getCategorizeMessages
    ✓ categorizes messages correctly (37 ms)
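For readers of the test output above, the utility behavior those names describe can be sketched roughly as follows. These are hypothetical reimplementations inferred only from the test descriptions, not the plugin's actual code; the `splitJsonStrings` sketch assumes the `}{` boundary never occurs inside a string value.

```typescript
// Hypothetical reimplementations inferred from the test names above;
// the plugin's actual utilities may differ.

// Pad single-digit numbers with a leading zero, e.g. 9 -> "09".
const getTimestampVariablesString = (n: number): string =>
  n < 10 ? `0${n}` : `${n}`;

// Streaming responses can arrive as concatenated JSON objects
// ("{...}{...}"). Split on the "}{" boundary and restore the braces
// so each piece parses on its own. Assumes "}{" never appears inside
// a string value.
const splitJsonStrings = (input: string): string[] => {
  if (!input.includes('}{')) return [input];
  const parts = input.split('}{');
  return parts.map((part, i) => {
    if (i === 0) return `${part}}`;
    if (i === parts.length - 1) return `{${part}`;
    return `{${part}}`;
  });
};
```

For example, `splitJsonStrings('{"a":1}{"b":2}')` yields `['{"a":1}', '{"b":2}']`, and a string with no `}{` boundary comes back as a one-element array, matching the tests above.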


changeset-bot bot commented Nov 7, 2024

⚠️ No Changeset found

Latest commit: 4ad232d

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types

Click here to learn what changesets are, and how to add one.

Click here if you're a maintainer who wants to add a changeset to this PR
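For context, a changeset is a small markdown file under `.changeset/` that names the affected package and a semver bump type. A minimal example might look like the following (the package name and bump type below are assumptions for illustration, not taken from this PR):

```md
---
'@janus-idp/backstage-plugin-lightspeed': minor
---

Replace the Backstage proxy backend with the lightspeed backend.
```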

@karthikjeeyar karthikjeeyar changed the title feat(lightspeed): Integrate lightspeed backend feat(lightspeed): integrate lightspeed backend Nov 7, 2024
@karthikjeeyar karthikjeeyar force-pushed the integrate-lightspeed-backend branch 5 times, most recently from 4a00065 to d3edc1d Compare November 8, 2024 16:04
@yangcao77
Contributor

Tested on local Backstage and it worked for me.
Screenshot 2024-11-08 at 1 52 34 PM

Contributor

@yangcao77 yangcao77 left a comment


Generally looks good to me from the backend perspective; just one comment about switching to the fetchApi that Backstage provides.

Other than that, I'm wondering if we should get a review from UX as well?
I noticed in the screen recording @karthikjeeyar uploaded that the model list takes a fairly long time to display for a new chat session. Do we need to fetch the list on every new session?
Also, the chat session only shows up after the first AI response is received. Should a session be created as soon as a new conversation_id has been generated and responded to, even without any summary?

@karthikjeeyar
Copy link
Contributor Author

Other than that, I'm wondering if we should get a review from UX as well?

Yes, I have already shared the PR with April in our team channel and updated it based on her and Ben's comments; tagging @aprilma419 again for a +1 on the PR.

I noticed in the screen recording @karthikjeeyar uploaded that the model list takes a fairly long time to display for a new chat session. Do we need to fetch the list on every new session?

No, it is not fetched on every new session; models are fetched only on initial load. The conversations in the sidebar load immediately because I have cached that request. I can add caching to the models fetch call as well, since the list will not change frequently.
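One way to cache a rarely changing request like the model list is to memoize the fetch promise so it is issued once and reused across chat sessions. The sketch below is framework-agnostic and the names are illustrative; in the plugin itself this would more likely be done with a long `staleTime` on the react-query query, as with the conversations request.

```typescript
// A minimal sketch of caching an infrequently-changing fetch (e.g. the
// model list). Names are illustrative, not the plugin's actual API.

type Fetcher<T> = () => Promise<T>;

const cachedOnce = <T>(fetcher: Fetcher<T>): Fetcher<T> => {
  let cached: Promise<T> | undefined;
  return () => {
    if (!cached) {
      // Store the in-flight promise so concurrent callers share it.
      cached = fetcher().catch(err => {
        cached = undefined; // drop the cache so a later call can retry
        throw err;
      });
    }
    return cached;
  };
};

// Usage: wrap the models fetcher once, then call it from any session.
// const getModels = cachedOnce(() => lightspeedApi.fetchModels());
```

The error path matters: caching a rejected promise would pin a transient network failure forever, so the cache is cleared on rejection to allow a retry.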

Also, the chat session only shows up after the first AI response is received. Should a session be created as soon as a new conversation_id has been generated and responded to, even without any summary?

As we discussed in Slack, the summary will not be re-analyzed once it is created and stored in the DB, right? So I wait for the first AI response to complete and generate the summary at the end of the first exchange between the human and the AI.


sonarcloud bot commented Nov 15, 2024

Contributor

@yangcao77 yangcao77 left a comment


Changes look good to me.

@openshift-ci openshift-ci bot added the lgtm label Nov 15, 2024
@openshift-merge-bot openshift-merge-bot bot merged commit 7c8e029 into janus-idp:main Nov 15, 2024
8 checks passed
Contributor

@rohitkrai03 rohitkrai03 left a comment


/approve
