practical-llm-pocs/langchain-poc

[Pipeline Status badge] [Coverage Status badge]

Quick Start

Dependencies

  • Git
  • Git LFS (https://git-lfs.com/)
  • (Optional) pyenv (https://github.com/pyenv/pyenv#getting-pyenv)
  • Python >=3.9.16 (pyenv install)
  • poetry (pip install poetry)
  • Python packages (poetry install)
  • Dependencies for the custom tool curl_rss_tool (html2text + sumy); see the sketch after this list
    • html2text (sudo apt install html2text)
    • nltk data (poetry run python -m nltk.downloader all)
  • Ollama
    • llama3 model (or any other models the project uses)
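
As a rough illustration of why the nltk data is needed: sumy's tokenizer relies on nltk resources when summarizing the text produced by curl_rss_tool. Below is a minimal sketch of that usage (the function name and summary length are assumptions, not the repo's actual tool code):

# Minimal sketch: summarize plain text with sumy (assumed usage, not the repo's curl_rss_tool code).
from sumy.parsers.plaintext import PlaintextParser
from sumy.nlp.tokenizers import Tokenizer
from sumy.summarizers.lsa import LsaSummarizer

def summarize(text: str, sentence_count: int = 3) -> str:
    # Tokenizer("english") needs the nltk data downloaded in the step above.
    parser = PlaintextParser.from_string(text, Tokenizer("english"))
    summarizer = LsaSummarizer()
    return " ".join(str(sentence) for sentence in summarizer(parser.document, sentence_count))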

Env

Create a .env.local file, then update it with your API keys.

cp .env.example .env.local
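
If you need these values in Python, a minimal sketch of loading them, assuming python-dotenv is used and that the code reads .env.local (the variable name here is hypothetical):

# Minimal sketch: load API keys from .env.local with python-dotenv (assumed approach).
import os
from dotenv import load_dotenv

load_dotenv(".env.local")
api_key = os.getenv("SOME_API_KEY")  # hypothetical variable name; use the keys from your file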

Development

Dual CLI/API framework

This project is set up as a dual CLI/API framework, allowing you to use the same core functionality through both a command-line interface and a REST API. It uses FastAPI for the API layer and Click for the CLI layer, with shared functions in the src/core directory.
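
As a rough sketch of that layout (the module path and function name are assumptions based on the hello examples below, not necessarily the repo's actual structure), a shared core function might look like:

# src/core/hello.py (hypothetical path): shared logic used by both the CLI and the API.
def build_greeting(name: str = "World") -> str:
    return f"Hello {name}!"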

API Examples:

  1. Start an Ollama server
ollama serve
  2. Start the API server in a separate terminal
poetry run start --reload
  3. Access the API endpoints using a web browser, curl, or a REST client:
curl http://127.0.0.1:3000/hello
curl http://127.0.0.1:3000/hello/Doge
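
A minimal sketch of what the API layer could look like, assuming FastAPI routes for /hello and /hello/{name}, a uvicorn-based start entry point on port 3000, and a hypothetical module path; the repo's actual code may differ:

# Hypothetical API layer: exposes the shared greeting logic over REST with FastAPI.
import sys

import uvicorn
from fastapi import FastAPI

app = FastAPI()

@app.get("/hello")
@app.get("/hello/{name}")
def hello(name: str = "World") -> dict:
    # Same greeting logic as the CLI layer, returned as JSON.
    return {"message": f"Hello {name}!"}

def start() -> None:
    # `poetry run start --reload` suggests a script entry point roughly like this.
    reload = "--reload" in sys.argv[1:]
    uvicorn.run("src.api.main:app", host="127.0.0.1", port=3000, reload=reload)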

CLI Examples:

  1. Start the Ollama server in a separate terminal
ollama serve
  2. Run the scripts from a shell
# poetry run hello
Hello World!

# poetry run hello Doge
Hello Doge!
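
The CLI layer could be wired roughly like this, assuming a Click command registered as the `hello` script entry point and reusing the same core logic (a sketch, not the repo's actual code):

# Hypothetical CLI layer: the `poetry run hello` entry point built with Click.
import click

@click.command()
@click.argument("name", default="World")
def hello(name: str) -> None:
    # Same greeting logic as the API layer, printed to stdout.
    click.echo(f"Hello {name}!")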

Testing

poetry run test
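
The `test` script likely maps to a small entry point; a minimal sketch, assuming pytest and a tests/ directory (both assumptions):

# Hypothetical `test` entry point: run the pytest suite and propagate its exit code.
import sys

import pytest

def test() -> None:
    sys.exit(pytest.main(["tests", *sys.argv[1:]]))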

Deployment

Serverless

The project is set up to deploy to AWS with the Serverless Framework, using the [default] AWS profile. To install Serverless, see https://www.serverless.com/framework/docs/getting-started#installation
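
For the API to run on AWS Lambda, the FastAPI app is typically wrapped in an ASGI adapter; a minimal sketch, assuming Mangum is used and reusing the hypothetical module path from the API sketch above (the repo may expose its handlers differently):

# Hypothetical Lambda handler: wrap the FastAPI app so Serverless can invoke it.
from mangum import Mangum
from src.api.main import app  # hypothetical module path

handler = Mangum(app)  # referenced from serverless.yml as the function handler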

# Deploy changes
serverless deploy

# View deployed endpoints and resources
serverless info

# Invoke a deployed function by name
serverless invoke --function <function-name>
