LLM story writer with a focus on high-quality long output based on a user-provided prompt.
LLM prompt augmentation with RAG, integrating external custom data from a variety of sources and allowing you to chat with those documents.
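A minimal sketch of what this kind of RAG-style prompt augmentation can look like; the chunking scheme and the naive keyword-overlap retrieval below are illustrative assumptions, not any particular repository's code:

```python
# Illustrative RAG-style prompt augmentation (hypothetical sketch).
# Splits documents into chunks, picks the chunks most relevant to the
# question by simple word overlap, and prepends them to the prompt
# that would be sent to a local LLM.

def split_into_chunks(text: str, chunk_size: int = 200) -> list[str]:
    words = text.split()
    return [" ".join(words[i:i + chunk_size]) for i in range(0, len(words), chunk_size)]

def relevance(chunk: str, query: str) -> int:
    # Naive score: how many query words appear in the chunk.
    chunk_words = set(chunk.lower().split())
    return sum(1 for w in query.lower().split() if w in chunk_words)

def build_augmented_prompt(documents: list[str], query: str, top_k: int = 3) -> str:
    chunks = [c for doc in documents for c in split_into_chunks(doc)]
    top_chunks = sorted(chunks, key=lambda c: relevance(c, query), reverse=True)[:top_k]
    context = "\n\n".join(top_chunks)
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"

if __name__ == "__main__":
    docs = ["Local LLaMA models can run fully offline on consumer hardware."]
    print(build_augmented_prompt(docs, "Can LLaMA run offline?"))
```

Real implementations typically replace the keyword overlap with embedding-based similarity search over a vector store, but the overall flow (chunk, retrieve, prepend, generate) is the same.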
Use your open-source local model from the terminal.
(Work in Progress) A cross-platform desktop client for offline LLaMA-CPU.
MathMate | Multi-Modal AI for Mathematical Learning
50-line local LLM assistant in Python with Streamlit and GPT4All
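A hedged sketch of what such an assistant might look like, assuming the gpt4all Python bindings and Streamlit's chat widgets; the model filename is an assumption, and any GGUF model supported by GPT4All would work:

```python
# Hypothetical minimal local LLM assistant with Streamlit and GPT4All.
import streamlit as st
from gpt4all import GPT4All

@st.cache_resource
def load_model():
    # Downloads the model on first run, then reuses it across reruns.
    return GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # assumed model filename

st.title("Local LLM assistant")
model = load_model()

prompt = st.chat_input("Ask the local model something")
if prompt:
    with st.chat_message("user"):
        st.write(prompt)
    with st.chat_message("assistant"):
        with model.chat_session():
            reply = model.generate(prompt, max_tokens=256)
        st.write(reply)
```

Saved as app.py, this would be launched with `streamlit run app.py` and runs entirely on the local machine.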