v0.0.3
First Release: https://pypi.org/project/oshepherd/0.0.3/
- Basic API behavior: serves one HTTP endpoint mirroring the original Ollama server's
  `generate()` endpoint. It receives the incoming request parameters, queues a message with
  those parameters in RabbitMQ through Celery, waits for the Celery task to finish, extracts
  the returned response from the Redis backend, and then responds to the HTTP Ollama client
  with the response produced by a remote Ollama server.
- Basic WORKER behavior: responds to messages queued in RabbitMQ using Celery, fires a
  `generate()` request against the local Ollama server within the worker instance (also using
  the Ollama Python package client), and returns the response to the Redis backend.