I have a dataset that's loaded with `LinkNeighborLoader` for a model that does link prediction. I am trying to wrap this model in `DistributedDataParallel` to speed up training.
My understanding so far is that one of the cornerstones of successfully applying `DistributedDataParallel` is using a `DistributedSampler`, so that each rank trains on a disjoint shard of the data. However, after looking through the source code, it doesn't look like I can apply a `DistributedSampler` to `LinkNeighborLoader`.
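For reference, this is the standard pattern I mean; a generic sketch with a placeholder dataset, assuming the process group is already initialized on each rank:

```python
import torch
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

dataset = TensorDataset(torch.arange(1000))  # placeholder dataset
sampler = DistributedSampler(dataset, shuffle=True)  # shards data across ranks
loader = DataLoader(dataset, batch_size=128, sampler=sampler)

for epoch in range(5):
    sampler.set_epoch(epoch)  # so shuffling differs across epochs
    for (batch,) in loader:
        ...  # forward/backward on the DDP-wrapped model
```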
I am now thinking of partitioning the graph myself so that each process has its own subset of the graph.
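Roughly, this is what I have in mind: keep the full graph on every rank for neighbor sampling, but shard only the seed edges across ranks, which plays the role `DistributedSampler` would otherwise play. The encoder, dataset, and hyperparameters below are just stand-ins for illustration, not my actual setup:

```python
import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn.functional as F
from torch.nn.parallel import DistributedDataParallel as DDP

from torch_geometric.datasets import Planetoid
from torch_geometric.loader import LinkNeighborLoader
from torch_geometric.nn import SAGEConv


class Encoder(torch.nn.Module):
    """Toy two-layer GraphSAGE encoder with a dot-product link decoder."""
    def __init__(self, in_dim, hidden_dim=64):
        super().__init__()
        self.conv1 = SAGEConv(in_dim, hidden_dim)
        self.conv2 = SAGEConv(hidden_dim, hidden_dim)

    def forward(self, x, edge_index, edge_label_index):
        x = self.conv1(x, edge_index).relu()
        x = self.conv2(x, edge_index)
        src, dst = edge_label_index
        return (x[src] * x[dst]).sum(dim=-1)


def run(rank, world_size, data):
    os.environ.setdefault('MASTER_ADDR', 'localhost')
    os.environ.setdefault('MASTER_PORT', '12355')
    dist.init_process_group('nccl', rank=rank, world_size=world_size)

    # Every rank keeps the full graph for neighbor sampling; only the
    # supervision (seed) edges are sharded: rank i takes every
    # world_size-th edge. In practice this would be the train split,
    # not all of edge_index.
    local_seed_edges = data.edge_index[:, rank::world_size]

    loader = LinkNeighborLoader(
        data,
        num_neighbors=[10, 10],
        edge_label_index=local_seed_edges,
        neg_sampling_ratio=1.0,  # one sampled negative per positive edge
        batch_size=256,
        shuffle=True,
    )

    model = Encoder(data.num_features).to(rank)
    model = DDP(model, device_ids=[rank])
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

    model.train()
    for batch in loader:
        batch = batch.to(rank)
        optimizer.zero_grad()
        pred = model(batch.x, batch.edge_index, batch.edge_label_index)
        loss = F.binary_cross_entropy_with_logits(pred, batch.edge_label.float())
        loss.backward()
        optimizer.step()

    dist.destroy_process_group()


if __name__ == '__main__':
    dataset = Planetoid(root='data/Planetoid', name='Cora')  # stand-in dataset
    world_size = torch.cuda.device_count()
    mp.spawn(run, args=(world_size, dataset[0]), nprocs=world_size, join=True)
```

One caveat I can see with this kind of manual sharding: every rank has to run the same number of optimizer steps per epoch, or DDP's gradient all-reduce will hang. Since the `rank::world_size` slicing can leave shard sizes (and hence batch counts) off by one, the shards may need to be padded or truncated to equal length.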
Has anyone had experience successfully wrapping a model that uses `LinkNeighborLoader` with `DistributedDataParallel`?