# Nio LLM


Your own little LLM in your Matrix chatroom.

## Usage

This project is split into two parts: the client and the server.

The server simply downloads an LLM and starts a llama-cpp-python server (which mimics an OpenAI server).

The client connects to the Matrix server and queries the llama-cpp-python server to create Matrix messages.
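As a rough sketch of what that client-side query looks like, the snippet below builds the OpenAI-style chat-completion payload that would be POSTed to the llama-cpp-python server for an incoming room message. The endpoint URL, model id, and message contents here are illustrative assumptions, not values taken from this project.

```python
import json

# Hypothetical local endpoint; llama-cpp-python exposes an
# OpenAI-compatible API, conventionally under /v1/chat/completions.
ENDPOINT = "http://localhost:8000/v1/chat/completions"

# Request body the client could send for one Matrix message
# (model id and prompts are placeholders, not the project's actual values).
payload = {
    "model": "local-model",
    "messages": [
        {"role": "system", "content": "You are a helpful Matrix chatbot."},
        {"role": "user", "content": "Hello!"},
    ],
}

# Serialize as it would go over the wire.
body = json.dumps(payload)
print(body)
```

The reply from the server follows the OpenAI response shape, so the generated text would be read from `choices[0]["message"]["content"]` before being sent back to the room.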

## Special thanks