This project is a chat interface for Ollama, built with Astro and packaged in a Docker container for easy deployment.
- Docker
- Kubernetes cluster (optional, for Kubernetes deployment)
- Local Node.js environment
- Clone the repository:

  ```bash
  git clone <repository-url>
  cd ollama-chat-interface
  ```
- Create and activate the node environment:

  ```bash
  ./create-nodeenv.sh
  source local-ollama-nodeenv/bin/activate
  ```
- Install dependencies:

  ```bash
  npm install
  ```
- Run the development server:

  ```bash
  npm run dev
  ```
- Open http://localhost:3000 in your browser.
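The dev server is only the front end; it needs a running Ollama instance to chat with, which by default listens on http://localhost:11434. How this app locates Ollama (environment variable, config file, or hard-coded URL) is not shown here, but getting a local instance up looks like the following, where `llama3` is purely an example model:

```bash
# Start the Ollama API server (default address: http://localhost:11434)
ollama serve

# In another shell, pull a model for the interface to chat with
# (llama3 is just an example; use whichever model you prefer)
ollama pull llama3
```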
You can use the `docker-build-image.sh`, `docker-run-container.sh`, and `docker-stop-container.sh` scripts, or do it manually:
- Build the Docker image (an illustrative Dockerfile sketch follows after these steps):

  ```bash
  docker build -t ollama-chat-interface .
  ```
- Run the Docker container (see the note below on reaching Ollama from inside the container):

  ```bash
  docker run -d -p 3000:3000 --name ollama-chat-interface ollama-chat-interface
  ```
- Access the application at http://localhost:3000.
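The repository's actual Dockerfile is not shown here. Purely for orientation, a container build for an Astro app using the Node adapter in standalone mode might look roughly like the sketch below; the base image, adapter, and entry path are assumptions, and port 3000 is taken from the run command above.

```dockerfile
# Illustrative sketch only -- not the repository's actual Dockerfile.
# Assumes the Astro Node adapter in standalone mode, which serves ./dist/server/entry.mjs.
FROM node:20-alpine
WORKDIR /app

# Install dependencies first so they are cached independently of source changes
COPY package*.json ./
RUN npm ci

# Copy the source and build the Astro app
COPY . .
RUN npm run build

# The Node adapter reads HOST/PORT from the environment; 3000 matches the run command above
ENV HOST=0.0.0.0 PORT=3000
EXPOSE 3000
CMD ["node", "./dist/server/entry.mjs"]
```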
To stop the container:

```bash
docker stop ollama-chat-interface
```
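One note on connectivity: inside the container, `localhost` refers to the container itself, so if Ollama runs on the host the app needs the host's address. The example below is a hedged sketch: `host.docker.internal` and the `--add-host` mapping are standard Docker features, but `OLLAMA_API_URL` is a hypothetical variable name, so check how this app actually configures its Ollama endpoint.

```bash
# Sketch: point the container at an Ollama instance running on the host.
# OLLAMA_API_URL is an assumed name -- substitute whatever this app actually reads.
docker run -d -p 3000:3000 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_API_URL=http://host.docker.internal:11434 \
  --name ollama-chat-interface \
  ollama-chat-interface
```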
- Ensure your Kubernetes cluster is set up and `kubectl` is configured.
You can use the `kubernetes-apply-manifest.sh` script or run the commands manually:
- Apply the Kubernetes configurations (a sketch of what these manifests might contain follows after this list):

  ```bash
  kubectl apply -f kubernetes-deployment.yaml
  kubectl apply -f kubernetes-service.yaml
  ```
- Check the status of your deployment and service:

  ```bash
  kubectl get deployments
  kubectl get services
  ```
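The repository's `kubernetes-deployment.yaml` and `kubernetes-service.yaml` are not reproduced here. Purely as an illustration of what minimal manifests for this app might contain, assuming the `ollama-chat-interface` image and port 3000 from the Docker steps above:

```yaml
# Illustrative sketch only -- not the repository's actual manifests.
# Image name and port come from the Docker steps; labels, replica count,
# and service settings are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ollama-chat-interface
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ollama-chat-interface
  template:
    metadata:
      labels:
        app: ollama-chat-interface
    spec:
      containers:
        - name: ollama-chat-interface
          image: ollama-chat-interface:latest
          imagePullPolicy: IfNotPresent
          ports:
            - containerPort: 3000
---
apiVersion: v1
kind: Service
metadata:
  name: ollama-chat-interface
spec:
  selector:
    app: ollama-chat-interface
  ports:
    - port: 3000
      targetPort: 3000
```

Once the pods are running, `kubectl port-forward service/ollama-chat-interface 3000:3000` (using the service name assumed above) makes the app reachable at http://localhost:3000.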