- Docker: Ensure Docker is installed and running.
- Nvidia GPU: Required to run the model (used by Ollama for RAG).
- Windows: The operating system this setup was tested on.
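A quick way to confirm these prerequisites before building anything is to check the Docker daemon and the GPU driver from a terminal; this is just a sanity check, not part of the setup itself:

```bash
# Verify Docker is installed and the daemon is reachable
docker --version
docker info

# Verify the Nvidia driver sees the GPU (requires nvidia-smi on the host)
nvidia-smi
```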
Run the following commands to clone the necessary repositories:
git clone git@github.com:Kreative-Performative-Individuals/KB.git
git clone git@github.com:Kreative-Performative-Individuals/smart-industrial-database.git
git clone git@github.com:Kreative-Performative-Individuals/data-preprocessing-.git
git clone git@github.com:Kreative-Performative-Individuals/RAG5.git
git clone git@github.com:Kreative-Performative-Individuals/KPI-Engine.git
git clone git@github.com:Kreative-Performative-Individuals/frontend.git
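The commands above use SSH URLs, so they assume an SSH key is registered with your GitHub account. If that is not the case, the same repositories can be cloned over HTTPS instead, for example:

```bash
# HTTPS equivalent of the SSH clone above (repeat for each repository)
git clone https://github.com/Kreative-Performative-Individuals/KB.git
```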
- Build and start all services: `docker compose up --build`
- Stop all services: `docker compose down`
- Rebuild and start again (the first build may take some time): `docker compose up --build`
After the initial setup, verify that all services can communicate with each other (a quick check is sketched below).
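One way to check that the stack is healthy is to list the services and follow their logs. The service name used below is a placeholder; substitute the names defined in docker-compose.yml:

```bash
# Show all services from the compose file and their current state and ports
docker compose ps

# Follow the logs of a single service (replace "frontend" with the service you want to inspect)
docker compose logs -f frontend
```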
To test the GUI, follow these steps:
- Navigate to the `frontend` folder: `cd frontend`
- Build the frontend Docker image: `docker build -t frontend .`
- Run the container: `docker run -d --name frontend -p 3000:3000 frontend`
- Open your browser and visit http://localhost:3000 to interact with the GUI.
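If the page does not load, a quick command-line smoke test can help narrow the problem down to the container or the browser; `curl` is assumed to be available on the host:

```bash
# Confirm the container is running and port 3000 is mapped
docker ps --filter name=frontend

# Fetch only the response headers; an HTTP status line means the app is serving
curl -I http://localhost:3000
```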
- Resource-Heavy Components:
  - If you do not need RAG, comment out the configurations related to RAG and Ollama; they are heavy on both disk space and computation. Alternatively, start only the services you need (see the sketch after this list).
  - Running Ollama requires an Nvidia GPU, and it may not function correctly on Linux, even with proprietary drivers.
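As an alternative to commenting out the RAG and Ollama entries, `docker compose up` accepts an explicit list of services, so you can bring up only the lightweight parts of the stack. The service names below are placeholders, not the project's real ones; use the names actually defined in docker-compose.yml:

```bash
# Start only the named services (plus their declared dependencies), skipping RAG/Ollama
docker compose up --build database kpi-engine frontend
```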
- Database Initialization:
  - If the database fails to start, rerun the compose command so the initialization script executes properly: `docker compose up --build`
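If rerunning the build does not help, the container logs usually show why initialization failed. A common cause with database images is that the init script only runs against an empty data volume, so wiping the volume forces it to run again; note that this deletes any existing data. The service name "database" below is a placeholder for whatever the compose file defines:

```bash
# Inspect the database service's startup logs for initialization errors
docker compose logs database

# Remove containers AND named volumes so the init script runs again on a fresh volume
# (this destroys existing database data)
docker compose down -v
docker compose up --build
```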
Happy coding! 🚀