The MAML method does not seem to support multi-GPU training #59
ypy516478793
started this conversation in General

The MAML method can be trained on a single GPU, but training in parallel on multiple GPUs raises an error. The specific error is as follows:

Replies: 4 comments
-
Hi, thanks for the feedback. We are working on this issue and will get back to you as soon as possible.
-
Hi, regarding the multi-GPU issue you reported with the MAML method: we have confirmed that the problem does exist. Supporting multi-GPU training would require fairly substantial changes to the code, so we plan to fix these larger issues in a future update.
-
OK, thank you!
-
MAML can now be trained on multiple GPUs. One remaining issue is that, under DistributedDataParallel, MAML cannot be used together with SyncBatchNorm; we will analyze how the missing synchronization affects the final results and look for a solution.
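The thread does not show how multi-GPU support was implemented in the end. As a rough illustration of the pieces being discussed, below is a minimal sketch of MAML-style meta-training spread across several GPUs with torch.distributed. It averages the meta-gradients across processes by hand instead of relying on the DistributedDataParallel wrapper, and marks in a comment where the SyncBatchNorm conversion mentioned above would normally go. The toy two-layer model, the random support/query "tasks", the inner learning rate, and the `functional_forward` helper are all hypothetical placeholders, not code from the repository.

```python
"""Minimal sketch of multi-GPU MAML-style meta-training with torch.distributed.
The tiny regression model, random tasks, and manual gradient all-reduce are
illustrative assumptions, not the repository's actual implementation."""
import os
import torch
import torch.nn as nn
import torch.distributed as dist


def functional_forward(x, w1, b1, w2, b2):
    """Forward pass of a 2-layer MLP using explicit ("fast") weights."""
    h = torch.relu(nn.functional.linear(x, w1, b1))
    return nn.functional.linear(h, w2, b2)


def main():
    dist.init_process_group(backend="nccl")       # launched with torchrun
    rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(rank)
    device = torch.device("cuda", rank)

    model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1)).to(device)
    # If the backbone contained BatchNorm, one would normally call
    #   model = nn.SyncBatchNorm.convert_sync_batchnorm(model)
    # before distributed training; per the maintainers, that conversion
    # currently cannot be combined with MAML, so BN statistics stay local
    # to each process here.
    params = list(model.parameters())
    optim = torch.optim.Adam(params, lr=1e-3)
    inner_lr = 0.01

    for step in range(100):
        # Each process samples its own task (support + query set).
        x_s, y_s = torch.randn(8, 8, device=device), torch.randn(8, 1, device=device)
        x_q, y_q = torch.randn(8, 8, device=device), torch.randn(8, 1, device=device)

        # Inner loop: one differentiable gradient step -> fast weights.
        support_loss = nn.functional.mse_loss(model(x_s), y_s)
        grads = torch.autograd.grad(support_loss, params, create_graph=True)
        fast = [p - inner_lr * g for p, g in zip(params, grads)]

        # Outer loop: query loss evaluated through the fast weights.
        query_loss = nn.functional.mse_loss(
            functional_forward(x_q, fast[0], fast[1], fast[2], fast[3]), y_q
        )
        meta_grads = torch.autograd.grad(query_loss, params)

        # Average meta-gradients across GPUs by hand instead of relying on
        # DistributedDataParallel's automatic all-reduce.
        optim.zero_grad()
        for p, g in zip(params, meta_grads):
            dist.all_reduce(g, op=dist.ReduceOp.SUM)
            p.grad = g / dist.get_world_size()
        optim.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Such a script would typically be launched with something like `torchrun --nproc_per_node=2 maml_ddp_sketch.py` (hypothetical filename); each process then trains on its own tasks while sharing averaged meta-gradients, but, as the maintainers note, BatchNorm statistics remain per-process unless SyncBatchNorm can be used.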