
LogSense AI

LogSense AI is a generative AI chatbot built on Streamlit that effectively addresses user queries by utilizing a vector-based document database. It retrieves the most relevant documents using cosine similarity search for each user inquiry. By employing the Retrieval-Augmented Generation (RAG) framework, the retrieved documents serve as context for generating responses through a large language model (LLM) using LangChain. Currently, the documents stored in the database consist of Knowledge Base Articles (KBAs) related to SAP Commerce Cloud, sourced from SAP's support portal. Please note that these documents are accessible only within SAP's network and are not included in this repository.
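The retrieval step described above can be sketched as follows. This is a minimal illustration with toy in-memory embeddings; the actual application uses a vector database and LangChain, and none of these function names come from the repository:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve_top_k(query_vec, doc_vecs, docs, k=2):
    """Rank documents by similarity to the query embedding and keep the top k."""
    scores = [cosine_similarity(query_vec, v) for v in doc_vecs]
    ranked = sorted(zip(scores, docs), key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in ranked[:k]]

def build_rag_prompt(question, context_docs):
    """Assemble the retrieved documents into the context section of an LLM prompt."""
    context = "\n---\n".join(context_docs)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
```

In the RAG framework, the prompt produced by `build_rag_prompt` is what gets sent to the LLM, so the model answers from the retrieved KBAs rather than from its training data alone.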

Table of Contents

  • Architecture
  • Setup Instructions
  • Usage
  • Screenshots
  • Customizations
  • Deployment
  • Future Improvements
  • Contributing
  • License

Architecture

The application can connect to the LLM either through an SAP-provided proxy or directly through LangChain. The architecture diagram below illustrates the connection through the SAP proxy: Architecture Diagram

Setup Instructions

After cloning the repository, follow these steps to set up the project:

  1. Set up your local development environment:

    • Download and install JetBrains PyCharm or your preferred IDE.
    • The following instructions focus on PyCharm, but most IDEs provide similar features.
  2. Open the project:

    • In PyCharm, navigate to File -> Open and select the cloned repository folder.
  3. Set up a local virtual environment:

    • Go to Settings > Project: LogSenseAI > Python Interpreter > Add Interpreter.
    • Choose Add Local Interpreter > Virtualenv Environment.
      1. Select Environment -> New.
      2. Set Base Interpreter to your installed Python version (e.g., Python 3.x).
      3. Click OK.
  4. Install dependencies:

    • Run the following commands in your terminal:
      pip install "generative-ai-hub-sdk[all]==1.2.2" --extra-index-url https://int.repositories.cloud.sap/artifactory/api/pypi/proxy-deploy-releases-hyperspace-pypi/simple/
      pip install -r requirements.txt
    • If you prefer to connect directly to the LLM via OpenAI instead of using the SAP proxy, you can skip installing generative-ai-hub-sdk and install the required LangChain libraries instead.
  5. Run the application:

    streamlit run app.py
  6. Access the user interface: Open your web browser and navigate to http://localhost:8500/ (or whichever port the terminal reports; Streamlit's default is 8501).

Usage

Once the application is running, you can interact with LogSense AI by entering your queries in the provided interface. The chatbot will retrieve relevant information from the document database and generate responses based on the context.

Screenshots

Here are some screenshots showcasing working deployments of the application.

  • Startup: 1-startup

  • Chat interface selection (see Future Improvements for details of the LOG FILE interaction type): 2-interaction-types

  • [CHAT interaction type] A query whose answer is available from the stored documents. Note that the response also links to the source documents: 3-response from context

  • [CHAT interaction type] A query whose answer is not available from the stored documents:

    • If the flag app.allowWithoutContextResults is false, i.e., answers outside the stored context are not allowed: 4 1-out-of-context

    • If the flag app.allowWithoutContextResults is true: 4 2-out-of-context

Customizations

You can easily customize the application in the following ways:

  • The scripts folder contains Python scripts for tasks such as downloading documents from the SAP support portal, inserting document embeddings into the database, and managing document data. These scripts can be modified to accommodate different types of documents.
  • The chatbot can be generalized to handle various queries depending on the document types stored in the database.
  • Modify the properties.yml file to adjust flags and properties to suit your requirements (e.g., enabling calls to the LLM without context when the context-based call does not yield a relevant response, or hana.embeddings.contextSections to choose which document sections are sent to the LLM as context).
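For illustration, the two settings mentioned above might appear in properties.yml roughly as follows. The surrounding structure, nesting, and values are assumptions for this sketch, not the repository's actual file:

```yaml
app:
  allowWithoutContextResults: false   # block answers when no relevant document is found
hana:
  embeddings:
    contextSections: ["symptom", "resolution"]   # hypothetical document section names
```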

Deployment

Here are a few options for deploying the application:

Future Improvements

  1. SharePoint integration: Users will be able to share SharePoint links to their own documents, which will be stored in the database and used by the LLM to answer queries.
  2. Log file upload interface: A planned alternate chat interface that will let users upload a log file, which will be analysed to extract error logs and display them in a dropdown. Users will be able to select specific logs for analysis, and the application will use the LLM to display detailed results for the selected logs. This feature is accessed via the LOG FILE interaction type; it is still in development, and mock logs are currently shown for every uploaded file. Screenshots for this feature:
  • Landing screen: 5-log-interaction

  • Dropdown with logs fetched from uploaded file: 6-logs-from-file

  • Chat bot response for the selected log: 7-logs-response
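The planned flow above (upload a file, extract its error entries, offer them in a dropdown) could be sketched with a simple parser like the one below. The function name and the assumption that error lines contain the token "ERROR" are illustrative, not taken from the repository; the Streamlit wiring is shown only in comments:

```python
def extract_error_logs(log_text: str) -> list[str]:
    """Collect lines that look like ERROR-level entries from a raw log dump."""
    return [line.strip() for line in log_text.splitlines() if "ERROR" in line]

# Inside the Streamlit app this would roughly be:
#   uploaded = st.file_uploader("Upload a log file")
#   if uploaded:
#       errors = extract_error_logs(uploaded.read().decode("utf-8"))
#       selected = st.selectbox("Select a log entry to analyse", errors)
#       # ...send `selected` plus retrieved context to the LLM...
```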

Contributing

We welcome contributions to enhance LogSense AI! Please fork the repository and submit a pull request for any new features or bug fixes.

License

This project is licensed under the MIT License. See the LICENSE file for more details.
