Closed · RalphMao closed 3 years ago
PR 19 has an issue when actually calling the DistributedDataParallel module. Calling torch.distributed.init_process_group with a dummy master_host works around it.
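For reference, here is a minimal single-process sketch of that workaround. The gloo backend, the 127.0.0.1:29500 address standing in for the dummy master_host, and the toy model are illustrative assumptions, not details taken from PR 19:

```python
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# Initialize the default process group with a dummy master host so that
# DistributedDataParallel can be constructed even in a single-process run.
# The address and port below are placeholders (assumption), not from PR 19.
dist.init_process_group(
    backend="gloo",                       # CPU-friendly backend
    init_method="tcp://127.0.0.1:29500",  # dummy master_host:port
    rank=0,
    world_size=1,
)

model = torch.nn.Linear(10, 10)
ddp_model = DDP(model)  # no longer fails: a default process group now exists

out = ddp_model(torch.randn(2, 10))  # forward pass works as usual
dist.destroy_process_group()
```

With world_size=1 the rendezvous completes immediately, so any reachable local address should work as the dummy master_host.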