
Releases: zjykzj/pytorch-distributed

pytorch-distributed

09 Feb 12:58


pytorch-distributed Pre-release
  1. Distributed training using torch.multiprocessing.spawn and nn.parallel.DistributedDataParallel
  2. Mixed precision training using torch.cuda.amp
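A minimal sketch of the first item, launching one process per rank with torch.multiprocessing.spawn and wrapping the model in DistributedDataParallel. The model, port, and training data are illustrative placeholders, not taken from this repository; the gloo backend is used here so the sketch also runs on CPU, whereas the repo presumably targets nccl on GPUs.

```python
import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


def worker(rank, world_size):
    # Each spawned process joins the same process group.
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29531"  # illustrative free port
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    # Toy model; with nccl you would move it to torch.device(rank) first.
    model = DDP(nn.Linear(10, 1))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)

    x, y = torch.randn(8, 10), torch.randn(8, 1)
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()  # DDP all-reduces gradients across ranks here
    opt.step()

    dist.destroy_process_group()


def main():
    world_size = 2
    # spawn starts world_size processes, each calling worker(rank, world_size).
    mp.spawn(worker, args=(world_size,), nprocs=world_size, join=True)


if __name__ == "__main__":
    main()
```

In practice each rank would also use a DistributedSampler so every process sees a distinct shard of the dataset.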
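For the second item, a minimal mixed-precision loop with torch.cuda.amp: autocast runs the forward pass in float16 where it is numerically safe, and GradScaler scales the loss so small float16 gradients do not underflow. The model and data are illustrative; both contexts are disabled when no GPU is present so the sketch degrades to plain float32 on CPU.

```python
import torch
import torch.nn as nn

use_cuda = torch.cuda.is_available()
device = "cuda" if use_cuda else "cpu"

model = nn.Linear(10, 1).to(device)       # toy model, stands in for the real network
opt = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = torch.cuda.amp.GradScaler(enabled=use_cuda)

x = torch.randn(8, 10, device=device)
y = torch.randn(8, 1, device=device)

for _ in range(3):
    opt.zero_grad()
    with torch.cuda.amp.autocast(enabled=use_cuda):
        loss = nn.functional.mse_loss(model(x), y)
    scaler.scale(loss).backward()  # backward on the scaled loss
    scaler.step(opt)               # unscales gradients, then runs opt.step()
    scaler.update()                # adapts the scale factor for the next step
```

The two techniques compose: inside each DDP worker, the forward/backward/step above replaces the plain float32 loop.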