Running llamafactory-cli webui throws an error #7520
Comments
Same here. It was working fine two hours ago; I suspect something broke on their server side.
I fixed it. Run the following command and give it a try.
Same here: identical server configuration and drivers, same CUDA + PyTorch, same install procedure. The error started showing up yesterday.
Thanks, that solved it.
Turn off your proxy.
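In practice, "turn off the proxy" usually means clearing or bypassing the proxy environment variables that Gradio's localhost reachability check inherits. A minimal sketch, assuming the proxy is configured through the standard shell variables (your setup may differ):

```bash
# Exempt localhost from the proxy so Gradio's reachability check succeeds
export no_proxy=localhost,127.0.0.1
export NO_PROXY=localhost,127.0.0.1

# Or drop the proxy entirely for this shell session
unset http_proxy https_proxy HTTP_PROXY HTTPS_PROXY

llamafactory-cli webui
```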
Thanks 😀
It looks like a package in the latest release is broken. My older LLaMA-Factory install (downloaded March 26) still works, but a fresh setup from today would not, until this command fixed it.
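One generic way to test the "a freshly installed package broke it" theory is to copy the dependency versions from the still-working environment into the broken one. A sketch; the file name is arbitrary, and the offending package is not identified in this thread:

```bash
# In the environment where the March 26 install still works:
pip freeze > known-good-requirements.txt

# In the broken environment, force the same dependency versions:
pip install -r known-good-requirements.txt
```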
Thanks, that solved it!
Nice one!
Thanks
fixed |
Reminder
System Info
Running `llamafactory-cli webui` fails with: ValueError: When localhost is not accessible, a shareable link must be created. Please set share=True or check your proxy settings to allow access to localhost. Any advice would be appreciated.
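The error message itself suggests two routes: make localhost reachable again (see the proxy notes in the comments above) or let Gradio create a shareable link. A sketch of the second route, assuming your LLaMA-Factory version reads the GRADIO_SHARE environment variable and forwards it to Gradio's launch(share=...) (verify against your installed version):

```bash
# Ask Gradio to tunnel a public share link instead of requiring localhost
GRADIO_SHARE=1 llamafactory-cli webui
```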
Reproduction
Others
No response