Releases: mgonzs13/llama_ros
2.5.0
llama_cli
- launch: command to launch LLMs from the command line
- prompt: command to generate responses from a prompt
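A hypothetical usage sketch of the two llama_cli commands above; the `ros2 llama` verb form and the argument shapes are assumptions, so check `ros2 llama --help` in your installation for the real syntax:

```shell
# Hypothetical llama_cli invocations (2.5.0); argument forms are assumed.
if command -v ros2 >/dev/null 2>&1; then
  ros2 llama launch llama_params.yaml   # launch an LLM (assumed args)
  ros2 llama prompt "What is ROS 2?"    # generate a response (assumed args)
else
  # Keep the sketch runnable on machines without ROS 2 installed.
  echo "ros2 not available; skipping"
fi
```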
2.4.2
llama.cpp updated
New localmentor repo
2.4.0
llama.cpp updated to the new GGML_CUDA build flag
New LLAMA_CUDA environment variable to indicate whether llama.cpp is compiled with CUDA
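A minimal sketch of setting the environment variable introduced in 2.4.0; the value `1` and the build command shown in the comment are assumptions, not documented values:

```shell
# Tell llama_ros that llama.cpp was compiled with CUDA (2.4.0).
# The value "1" is an assumption; the release notes only name the variable.
export LLAMA_CUDA=1
echo "LLAMA_CUDA=$LLAMA_CUDA"   # prints: LLAMA_CUDA=1

# A CUDA-enabled build might then look like this (assumed flag, not run here):
#   colcon build --cmake-args -DGGML_CUDA=ON
```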
2.3.2
llama.cpp updated + more llava YAML files
2.3.1
Params and prompt files renamed
2.3.0
New bringup format using YAML parameter files
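A hypothetical illustration of what a YAML parameter file for the 2.3.0 bringup format could look like; every key name below is an illustrative assumption, so consult the repo's llama_bringup params files for the real schema:

```shell
# Write an example params file (2.3.0 bringup format); key names are assumed.
cat > /tmp/llama_params.yaml <<'EOF'
# illustrative parameters only (assumed names)
n_ctx: 2048
n_gpu_layers: 0
model: llama-model.gguf
EOF

# Count the key lines written above.
grep -c ':' /tmp/llama_params.yaml   # prints: 3
```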