Releases: mgonzs13/llama_ros
2.4.0
llama.cpp updated; CUDA builds now use the new GGML_CUDA flag
New LLAMA_CUDA environment variable to set when llama.cpp is compiled with CUDA
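A minimal sketch of using the new environment variable, assuming it is read at build/launch time and that "1" is an accepted value (the release notes do not state the expected values):

```python
import os

# Hedged sketch: per the 2.4.0 notes, llama_ros checks a LLAMA_CUDA
# environment variable when llama.cpp is compiled with CUDA. The value
# "1" here is an assumption, not documented in the release notes.
os.environ["LLAMA_CUDA"] = "1"
print(os.environ.get("LLAMA_CUDA"))  # → 1
```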
2.3.2
llama.cpp updated; more llava YAML files added
2.3.1
Params and prompt files renamed
2.3.0
New bringup format using YAML files with params
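The notes do not show the new YAML format itself; a minimal hypothetical fragment, reusing only parameter names that appear elsewhere in these release notes (file layout and all values are assumptions):

```yaml
# Hypothetical bringup YAML; keys borrowed from the create_llama_launch
# parameter list in 2.2.0, values illustrative only.
model: "llama-2-7b.Q4_K_M.gguf"
system_prompt: "You are a helpful assistant."
system_prompt_type: "ChatML"
```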
2.2.2
Stop words fixed: the escaped two-character sequence "\\n" in stop words is now replaced with an actual newline ("\n")
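The fix above can be sketched as follows; the function name is illustrative and not from llama_ros:

```python
# Hedged sketch of the 2.2.2 fix: stop words read from params/YAML may
# contain the literal two-character sequence backslash + n; replacing it
# with a real newline lets them match the model's output.
def normalize_stop_words(stop_words):
    return [w.replace("\\n", "\n") for w in stop_words]

print(normalize_stop_words(["###", "User:\\n"]))  # → ['###', 'User:\n']
```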
2.2.0
New parameters and configurations for create_llama_launch:
- system_prompt
- system_prompt_file
- system_prompt_type
- model
- lora_base
- mmproj
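The parameter names above come from the release notes; a hedged sketch of passing them, with all values being illustrative assumptions rather than package defaults:

```python
# Hedged sketch: keys are the create_llama_launch parameters listed in
# the 2.2.0 notes; values are illustrative assumptions.
llama_launch_params = {
    "system_prompt": "You are a helpful assistant.",
    "system_prompt_file": "",        # alternatively, load the prompt from a file
    "system_prompt_type": "ChatML",  # assumed prompt-format identifier
    "model": "llama-2-7b.Q4_K_M.gguf",
    "lora_base": "",                 # base model for LoRA adapters
    "mmproj": "",                    # multimodal projector (llava models)
}
print(sorted(llama_launch_params))
```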