Releases: mgonzs13/llama_ros

2.5.3

09 Jul 19:21

llama.cpp updated (new detokenize function)

2.5.2

05 Jul 16:14

llama.cpp updated

2.5.1

04 Jul 18:17

llama.cpp updated and minor fixes for llama_cli

2.5.0

03 Jul 21:04

New llama_cli tool:

  • launch: command to launch LLMs
  • prompt: command to generate responses using a prompt
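
The two subcommands above can be sketched as shell invocations. This is a hedged usage sketch, not verified syntax: the `ros2 llama` command form, the params file name `my_params.yaml`, and the prompt text are assumptions, and a sourced ROS 2 environment would be needed to actually run them, so the commands are only stored and printed here.

```shell
# Hypothetical invocations of the two llama_cli subcommands listed above.
# Argument syntax and file name are assumptions; a sourced ROS 2
# environment is required to run them, so they are printed, not executed.
launch_cmd='ros2 llama launch my_params.yaml'        # launch an LLM from a params file
prompt_cmd='ros2 llama prompt "Do you know ROS 2?"'  # generate a response from the running LLM
printf '%s\n%s\n' "$launch_cmd" "$prompt_cmd"
```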

2.4.2

03 Jul 10:54

llama.cpp updated
new localmentor repo added

2.4.1

02 Jul 07:56

llama.cpp updated

2.4.0

29 Jun 20:36

llama.cpp updated to the new GGML_CUDA build flag
LLAMA_CUDA env var to set if llama.cpp is compiled with CUDA
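
A minimal sketch of using that environment variable before building. The value `1` and the colcon invocation are assumptions; check the repo for the exact value it expects.

```shell
# Minimal sketch: signal that llama.cpp was compiled with CUDA.
# The value "1" is an assumption, not a documented requirement.
export LLAMA_CUDA=1
# Rebuild the workspace so the flag takes effect (colcon invocation assumed):
#   colcon build --packages-select llama_ros
echo "LLAMA_CUDA=$LLAMA_CUDA"
```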

2.3.2

20 Jun 20:17

llama.cpp updated + more llava YAML files

2.3.1

18 Jun 09:02

params and prompt files renamed

2.3.0

17 Jun 11:34

New bringup format using YAML files with params