# LogSense AI

LogSense AI is a generative AI chatbot built on Streamlit that answers user queries using a vector-based document database. For each user query, it retrieves the most relevant documents via cosine similarity search. Following the Retrieval-Augmented Generation (RAG) pattern, the retrieved documents serve as context for generating responses through a large language model (LLM) using LangChain. Currently, the documents stored in the database are Knowledge Base Articles (KBAs) related to SAP Commerce Cloud, sourced from SAP's support portal. Please note that these documents are accessible only within SAP's network and are not included in this repository.
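The retrieval step can be sketched as follows. This is a minimal stdlib-only illustration of cosine-similarity search, not the project's actual code: the toy 2-D embeddings and the `retrieve` helper are stand-ins for the vector database lookup described above.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_embedding, documents, top_k=3):
    """Return the top_k document texts ranked by cosine similarity to the query.

    `documents` is a list of (text, embedding) pairs; in LogSense AI the
    embeddings would come from the vector database of KBA documents.
    """
    ranked = sorted(
        documents,
        key=lambda doc: cosine_similarity(query_embedding, doc[1]),
        reverse=True,
    )
    return [text for text, _ in ranked[:top_k]]

# Toy 2-D embeddings for illustration only.
docs = [
    ("KBA: fixing OutOfMemoryError", [0.9, 0.1]),
    ("KBA: configuring SSL certificates", [0.1, 0.9]),
]
print(retrieve([1.0, 0.0], docs, top_k=1))  # → ['KBA: fixing OutOfMemoryError']
```

In the real application the query embedding would come from the same embedding model used to index the KBAs, and the similarity search would run inside the vector database rather than in Python.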
## Table of Contents

- Architecture
- Setup Instructions
- Usage
- Screenshots
- Customizations
- Deployment
- Future Improvements
- Contributing
- License
## Architecture

The application can connect to the LLM either through an SAP-provided proxy or directly via LangChain. Below is the architecture diagram illustrating the connection through the SAP proxy:
## Setup Instructions

After cloning the repository, follow these steps to set up the project:
1. **Set up your local development environment:**
   - Download and install JetBrains PyCharm IDE or your preferred IDE.
   - The following instructions focus on PyCharm, but most IDEs provide similar features.

2. **Open the project:**
   - In PyCharm, navigate to `File -> Open` and select the cloned repository folder.

3. **Set up a local virtual environment:**
   - Go to `Settings > Project: LogSenseAI > Python Interpreter > Add Interpreter`.
   - Choose `Add Local Interpreter > Virtualenv Environment`.
   - Select `Environment -> New`.
   - Set `Base Interpreter` to your installed Python version (e.g., Python 3.x).
   - Click `OK`.

4. **Install dependencies:**
   - Run the following commands in your terminal:

     ```shell
     pip install "generative-ai-hub-sdk[all]==1.2.2" --extra-index-url https://int.repositories.cloud.sap/artifactory/api/pypi/proxy-deploy-releases-hyperspace-pypi/simple/
     pip install -r requirements.txt
     ```

   - If you prefer to connect directly to the LLM via OpenAI instead of using the SAP proxy, you can skip the installation of `generative-ai-hub-sdk` and install the required LangChain libraries instead.

5. **Run the application:**

   ```shell
   streamlit run app.py
   ```

6. **Access the user interface:** Open your web browser and navigate to http://localhost:8500/ (or the appropriate port if different).
## Usage

Once the application is running, you can interact with LogSense AI by entering your queries in the provided interface. The chatbot will retrieve relevant information from the document database and generate responses based on the context.
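Conceptually, each query goes through a retrieve-then-generate pipeline like the sketch below. The prompt wording, the `retriever` and `llm` callables, and the stand-in values are illustrative only, not the project's actual LangChain chain.

```python
def build_rag_prompt(query, context_docs):
    """Assemble the LLM prompt from retrieved documents (the RAG pattern)."""
    context = "\n\n".join(context_docs)
    return (
        "Answer the question using only the context below. "
        "If the context is not relevant, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

def answer(query, retriever, llm):
    """retriever: query -> list of document texts; llm: prompt -> response text."""
    context_docs = retriever(query)
    prompt = build_rag_prompt(query, context_docs)
    return llm(prompt)

# Stand-in components for demonstration.
fake_retriever = lambda q: ["KBA 12345: restart the node to clear the cache."]
fake_llm = lambda prompt: "Based on KBA 12345, restart the node."
print(answer("How do I clear the cache?", fake_retriever, fake_llm))
```

In the application, `retriever` corresponds to the cosine-similarity search over the document database and `llm` to the model reached through the SAP proxy or directly via LangChain.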
## Screenshots

Here are some screenshots showcasing working deployments of the application.
- Chat interface selection (see Future Improvements for details of the LOG FILE interaction type):
- [CHAT interaction type] A query for which a response is available from the stored documents. Note that the response also shares links to the source documents:
- [CHAT interaction type] A query for which no response is available from the stored documents:
## Customizations

You can easily customize the application in the following ways:
- The `scripts` folder contains Python scripts for tasks such as downloading documents from the SAP support portal, inserting document embeddings into the database, and managing document data. These scripts can be modified to accommodate different types of documents.
- The chatbot can be generalized to handle various queries depending on the document types stored in the database.
- Modify the `properties.yml` file to adjust various flags and properties to suit your requirements (e.g., enabling calls to the LLM without context when the call with context does not produce a relevant response, or `hana.embeddings.contextSections` to choose which document sections are sent to the LLM as part of the context).
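As an illustration, such settings could look like the fragment below. Apart from `hana.embeddings.contextSections`, which is named above, the key names and values here are hypothetical; check them against the `properties.yml` shipped in the repository.

```yaml
# Hypothetical sketch of properties.yml -- verify keys against the repository.
llm:
  fallbackWithoutContext: true        # retry the LLM without context if the
                                      # contextual answer is not relevant
hana:
  embeddings:
    contextSections: "Symptom,Cause,Resolution"  # document sections sent to the
                                                 # LLM as part of the context
```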
## Deployment

Here are a few options for deploying the application:
- Streamlit Community Cloud
- Cloud Foundry: Relevant files for deployment through SAP BTP are included in the repository.
## Future Improvements

- SharePoint integration: Users will be able to share SharePoint links to their own documents, which will be stored in the database and used by the LLM to answer queries.
- Log file upload interface: A planned alternate chat interface that allows users to upload a log file, which will be analyzed to extract error logs and display them in a dropdown. Users will be able to select specific logs for analysis, and the application will use the LLM to display detailed results for the selected logs. This feature can be accessed via the LOG FILE interaction type. It is still in development; currently, mock logs are shown for every uploaded file. Screenshots for this feature:
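The error-extraction step of the planned log file feature could follow a simple filter like the sketch below. This is not the application's implementation, only an illustration of pulling ERROR lines from an uploaded file to populate the dropdown; the sample log is invented.

```python
def extract_error_logs(log_text, level="ERROR"):
    """Return lines from a log file that contain the given severity level."""
    return [line.strip() for line in log_text.splitlines() if level in line]

# Invented sample log for demonstration.
sample_log = """\
2024-05-01 10:00:01 INFO  Startup complete
2024-05-01 10:00:05 ERROR NullPointerException in CartService
2024-05-01 10:00:09 ERROR Timeout calling payment provider
"""

for entry in extract_error_logs(sample_log):
    print(entry)  # entries the user could pick from the dropdown
```

Each selected entry would then be sent to the LLM, together with retrieved KBA context, for detailed analysis.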
## Contributing

We welcome contributions to enhance LogSense AI! Please fork the repository and submit a pull request for any new features or bug fixes.
## License

This project is licensed under the MIT License. See the LICENSE file for more details.