- Git
- Git LFS (https://git-lfs.com/)
- (Optional) pyenv (https://github.com/pyenv/pyenv#getting-pyenv)
- Python >=3.9.16 (`pyenv install`)
- poetry (`pip install poetry`)
- Python packages (`poetry install`)
- Dependencies for the custom tool curl_rss_tool (html2text + sumy)
  - html2text (`sudo apt install html2text`)
  - nltk data (`poetry run python -m nltk.downloader all`)
- Ollama
  - llama3 model, or other models used
Create a `.env.local` file, then update it with your API keys:

```shell
cp .env.example .env.local
```

This project is set up as a dual CLI/API framework, allowing you to use the same core functionality through both a command-line interface and a REST API. It uses FastAPI for the API layer and Click for the CLI layer, with shared functions in the `src/core` directory.
- Start an Ollama server

  ```shell
  ollama serve
  ```

- Start the API server in a separate terminal

  ```shell
  poetry run start --reload
  ```

- Access the API endpoints using a web browser, curl, or a REST client:

  ```shell
  curl http://127.0.0.1:3000/hello
  curl http://127.0.0.1:3000/hello/Doge
  ```

- Start the Ollama server in a separate terminal

  ```shell
  ollama serve
  ```

- Run the script from a shell

  ```shell
  # poetry run hello
  Hello World!
  # poetry run hello Doge
  Hello Doge!
  ```

Run the tests:

```shell
poetry run test
```

The project is set up to deploy with Serverless on AWS using the [default] profile. To install Serverless, see https://www.serverless.com/framework/docs/getting-started#installation
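The Serverless configuration itself isn't shown in this README; a minimal `serverless.yml` for this kind of FastAPI service might look like the following sketch (the service name and handler path are illustrative assumptions, not the project's actual values):

```yaml
service: dual-cli-api        # illustrative service name

provider:
  name: aws
  runtime: python3.9
  profile: default           # the [default] AWS profile mentioned above

functions:
  api:
    handler: src/api/main.handler   # hypothetical handler, e.g. a Mangum-wrapped FastAPI app
    events:
      - httpApi: '*'                # route all HTTP traffic to this function
```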
```shell
# Deploy changes
serverless deploy

# View deployed endpoints and resources
serverless info

# Invoke a deployed function (pass its name with --function)
serverless invoke --function <function-name>
```