
pytorch-distributed

Pre-release
@zjykzj released this 09 Feb 12:58
  1. Distributed training using torch.multiprocessing.spawn and nn.parallel.DistributedDataParallel (see the first sketch below)
  2. Mixed precision training using torch.cuda.amp (see the second sketch below)
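
A minimal sketch of the `spawn` + `DistributedDataParallel` pattern named in item 1. The model, data, master address/port, and hyperparameters below are placeholders for illustration, not the repository's actual training code:

```python
import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


def worker(rank, world_size):
    # Each spawned process joins the same process group.
    # MASTER_ADDR/MASTER_PORT are placeholder values for a single machine.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)

    # Wrapping the model in DDP all-reduces gradients across processes
    # during backward, keeping replicas in sync.
    model = nn.Linear(10, 10).cuda(rank)
    ddp_model = DDP(model, device_ids=[rank])
    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)

    # Placeholder batch; a real run would use a DistributedSampler-backed loader.
    inputs = torch.randn(32, 10, device=rank)
    targets = torch.randn(32, 10, device=rank)

    loss = nn.functional.mse_loss(ddp_model(inputs), targets)
    loss.backward()
    optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = torch.cuda.device_count()
    # spawn launches one process per GPU; each receives its rank
    # as the first argument of worker().
    mp.spawn(worker, args=(world_size,), nprocs=world_size, join=True)
```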
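
And a minimal sketch of the `torch.cuda.amp` mixed-precision training step from item 2, again with a placeholder model and data rather than the repository's code:

```python
import torch
import torch.nn as nn

device = "cuda"
model = nn.Linear(10, 10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# GradScaler scales the loss upward so small fp16 gradients
# do not underflow to zero during backward.
scaler = torch.cuda.amp.GradScaler()

inputs = torch.randn(32, 10, device=device)
targets = torch.randn(32, 10, device=device)

for _ in range(10):
    optimizer.zero_grad()
    # autocast runs eligible ops in fp16 and the rest in fp32.
    with torch.cuda.amp.autocast():
        loss = nn.functional.mse_loss(model(inputs), targets)
    scaler.scale(loss).backward()  # backward on the scaled loss
    scaler.step(optimizer)         # unscales gradients, then steps
    scaler.update()                # adapts the scale factor for the next step
```

The two techniques compose: inside a DDP worker, the same `autocast`/`GradScaler` step can replace the plain forward/backward shown in the first sketch.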