Model not supported, name: fuzimingcha_v1.0, format: pytorch, size: 6, quantization: 8-bit #1430
Comments
Have you registered it as a custom model?
I did try registering it as a custom model, but then hit a different error. A fix using the `AutoModel` class is given in https://github.com/irlab-sdu/fuzi.mingcha/issues/13, but the xinference source code loads models with the `AutoModelForCausalLM` class, which is why this error occurs.
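For context, ChatGLM-style checkpoints (which Fuzi-Mingcha inherits from ChatGLM-6B) register their architecture with `AutoModel` via custom modeling code, so loading them through `AutoModelForCausalLM` fails. A minimal sketch of the load path the linked issue describes; the repo id below is an assumption for illustration, not a confirmed path:

```python
from transformers import AutoModel, AutoTokenizer

# Hypothetical local path or Hugging Face repo id for the fuzimingcha_v1.0 weights.
model_path = "SDUIRLab/fuzi-mingcha-v1_0"

# trust_remote_code=True lets transformers execute the ChatGLM-style
# modeling code shipped inside the checkpoint repository.
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# AutoModel resolves to the checkpoint's custom ChatGLM class;
# AutoModelForCausalLM would raise because the architecture is not
# registered under the causal-LM auto-mapping.
model = AutoModel.from_pretrained(model_path, trust_remote_code=True)
model = model.half().cuda().eval()
```

This is why a registered custom model still fails inside xinference: registration changes the model metadata, but not which auto-class the loader calls.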
This issue is stale because it has been open for 7 days with no activity.
This issue was closed because it has been inactive for 5 days since being marked as stale.
Deployment of the Fuzi-Mingcha legal LLM (based on ChatGLM-6B) fails.
xinference=0.10.3
The error is as follows; it occurs with every quantization method.