Releases: mgonzs13/llama_ros
4.2.0
Changelog from version 4.1.8 to 4.2.0:
212f5f1 new version 4.2.0
40121ba new line removed from llava
46f3f05 new reranking and embeddings
1292f11 llama.cpp updated
c68761a cleaning chat llama ros
7f9d184 minor fixes to goal in execute
43173bb do not reset image in llava
490f4a0 minor fix in README
b88d44b minor fixes
d309f5c minor fix in llava
56cd9d0 llama.cpp updated
4f203c5 demos and examples fixed in README
8292411 fixing python imports in demos
566a736 fixing rag demo
8a01839 chatllama_tools_node renamed to chatllama_tools_demo_node
b8ed82a updating langchain versions
4abf2d0 sorting python imports
bfa6e2a updating chroma version
8c191ee moving get_metadata service - embedding and rerank models will not have get_metadata service
51fff6c fixing rerank by setting normalization to -1
2eab6d6 LangChain Tools on Chat (#12)
b037ad3 llama.cpp updated
17b8719 phi-4 added
d1a24c4 new embedding models
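The 4.2.0 entries above mention new reranking and embedding support alongside the LangChain integration. Below is a minimal sketch of how that might be exercised from Python; the class names (LlamaROSEmbeddings, LlamaROSReranker) and their parameters are assumptions inferred from the changelog, not taken from these notes, so check the llama_ros README and demos for the actual API.

```python
# Sketch only: assumes an embedding/reranking model has already been launched
# with llama_ros and that the LangChain wrapper classes below exist as named.
import rclpy
from llama_ros.langchain import LlamaROSEmbeddings, LlamaROSReranker

rclpy.init()  # llama_ros clients communicate with the running llama node over ROS 2

# Embed documents through the standard LangChain Embeddings interface
embeddings = LlamaROSEmbeddings()
vectors = embeddings.embed_documents([
    "llama_ros integrates llama.cpp into ROS 2",
    "ROS 2 nodes communicate over topics and services",
])

# Rerank retrieved documents against a query; in LangChain this would typically
# be plugged into a ContextualCompressionRetriever as a document compressor
reranker = LlamaROSReranker(top_n=2)

rclpy.shutdown()
```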
4.1.8
4.1.7
Changelog from version 4.1.6 to 4.1.7:
f0a0af9 new version 4.1.7
efd8c97 fixing license comments
8275fe4 ifndef guard names fixed
eff180d new llama logs
60187e4 huggingface-hub upgraded to 0.27.0
44bbb71 Falcon3 example added
9c2ac6d llama.cpp updated
beb4a22 llama.cpp updated
741b01e vendor C++ standard set to 17
39db082 llama.cpp updated + new penalize sampling
8bf89d9 Update README.md
2b9a0df workflow names fixed
4.1.6
4.1.5
4.1.4
4.1.3
4.1.2
4.1.1
- minor fixes to langchain integration
- find_stop_word fixed by trimming string
- llama.cpp b4154