Thanks for providing the GGUF models, Mixtral is a legend! Planning to try local inference with koboldcpp #14
Dunkelicht started this conversation in General
Replies: 1 comment
-
Thanks for the support. llama.cpp's GGUF format has now become the mainstream format for local LLM inference. iohub/collama
-
Finally, a Chinese Mixtral model! Downloading it now.
I saw earlier that TheBloke also recommends
https://github.com/LostRuins/koboldcpp
for local inference. It ships as a single exe and is easy to set up. Once the download finishes I'll give it a try, starting with the q4_k quantization. Note that koboldcpp maintains its own fork of llama.cpp, so it will need some extra testing.
Hoping this project takes off, keep up the good work!
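As a rough sketch of the setup described above, a typical koboldcpp launch against a downloaded q4_k GGUF file might look like the following. The model filename is hypothetical, and the exact flags should be confirmed against your koboldcpp version's help output, since its CLI evolves with its llama.cpp fork:

```shell
# Minimal sketch: launch koboldcpp against a local Q4_K GGUF file.
# The model filename below is a placeholder; verify the available
# options with `python koboldcpp.py --help` for your version.
python koboldcpp.py --model mixtral-zh.Q4_K_M.gguf --contextsize 4096 --port 5001
```

After it starts, the web UI and API are served on the chosen local port, so no separate server setup is needed for single-machine testing.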