
Commit 940bdd4

Authored by Koh Meng Hui (meng-hui)

fix: 503 when private gpt gets ollama service (#2104)

When running private-gpt with an external Ollama API, the ollama service returns 503 on startup because the ollama service (traefik) might not be ready yet.

- Add a healthcheck to the ollama service that tests the connection to the external Ollama.
- Make the private-gpt-ollama service depend on the ollama service being service_healthy.

Co-authored-by: Koh Meng Hui <kohmh@duck.com>
1 parent 5851b02 commit 940bdd4

File tree: 1 file changed, +8 −1 lines


docker-compose.yaml

Lines changed: 8 additions & 1 deletion
@@ -29,7 +29,8 @@ services:
       - ollama-cuda
       - ollama-api
     depends_on:
-      - ollama
+      ollama:
+        condition: service_healthy
 
   # Private-GPT service for the local mode
   # This service builds from a local Dockerfile and runs the application in local mode.
@@ -60,6 +61,12 @@ services:
   # This will route requests to the Ollama service based on the profile.
   ollama:
     image: traefik:v2.10
+    healthcheck:
+      test: ["CMD", "sh", "-c", "wget -q --spider http://ollama:11434 || exit 1"]
+      interval: 10s
+      retries: 3
+      start_period: 5s
+      timeout: 5s
     ports:
       - "8080:8080"
     command:
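Assembled from the two hunks above, the relevant part of the resulting docker-compose.yaml would read roughly as follows. This is a sketch: the `private-gpt-ollama` service name is taken from the commit message, and keys not touched by the diff are elided.

```yaml
services:
  private-gpt-ollama:
    # ... unchanged keys elided ...
    depends_on:
      # Wait until the ollama service's healthcheck passes before starting,
      # instead of starting as soon as the container is created.
      ollama:
        condition: service_healthy

  ollama:
    image: traefik:v2.10
    healthcheck:
      # Probe the upstream Ollama API; the check fails if it is unreachable,
      # so dependents are held back until Ollama actually responds.
      test: ["CMD", "sh", "-c", "wget -q --spider http://ollama:11434 || exit 1"]
      interval: 10s
      retries: 3
      start_period: 5s
      timeout: 5s
    ports:
      - "8080:8080"
    # ... command and remaining keys elided ...
```

With `condition: service_healthy`, Compose defers starting `private-gpt-ollama` until the healthcheck reports healthy, which is what prevents the 503 during startup.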
