Single-machine multi-GPU training #101
Comments
Yes, it is supported; all the usual single-machine multi-GPU strategies work.
Hi, I'd like to ask whether finetuning can run on a single machine with multiple GPUs. I couldn't find any GPU-related parameter in the FineTuner class.
Single-machine multi-GPU finetuning is supported through the accelerate package; you need to follow accelerate's usage to enable it.
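For reference, an accelerate-driven launch typically looks like the commands below. The script name `finetune.py` is a hypothetical entry point (any script that builds and runs a FineTuner); adjust it and the GPU count to your setup.

```shell
# One-time interactive setup: choose "multi-GPU" and the number of GPUs
accelerate config

# Launch the (hypothetical) finetuning script using the saved config
accelerate launch finetune.py

# Or specify the process count explicitly without a saved config
accelerate launch --multi_gpu --num_processes 2 finetune.py
```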
Hi, could you share some demo code for single-machine multi-GPU finetuning? Thanks!
@yangxudong Here is a reference case for you; I hope it helps. I've only briefly tried uniem, so some of my understanding may be off. Assuming you can already run single-machine finetuning:
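A minimal sketch of what such an entry-point script could look like, assuming uniem's `FineTuner.from_pretrained`/`run` API; the model and dataset names are illustrative only, so verify them against your installed version:

```python
# finetune.py -- hypothetical entry point; run with `accelerate launch finetune.py`
from datasets import load_dataset
from uniem.finetuner import FineTuner

# Any dataset of sentence-pair records works; this one is illustrative only
dataset = load_dataset('shibing624/nli_zh', 'STS-B')

# FineTuner uses accelerate internally, so launching this script via
# `accelerate launch` distributes training across the local GPUs
finetuner = FineTuner.from_pretrained('moka-ai/m3e-small', dataset=dataset)
finetuner.run(epochs=3, batch_size=64, lr=3e-5)
```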
At this point it should run. You may still hit PyTorch version issues, and distributed runs can have pitfalls of their own; deal with them as they come up.
🐛 Bug description
Does finetuning the model support single-machine multi-GPU training?
Python Version
None