Open meijieru opened 5 years ago
```python
import copy

import torch
import torch.nn as nn
import apex

model = nn.Linear(10, 2).cuda()
torch_wrapper = torch.nn.parallel.DistributedDataParallel(model)
apex_wrapper = apex.parallel.DistributedDataParallel(model)

lhs = copy.deepcopy(torch_wrapper).module  # ok
rhs = copy.deepcopy(apex_wrapper).module   # fails: AttributeError: 'DistributedDataParallel' object has no attribute 'module'
```
@mcarilli any ideas on this?

I also face the same problem. I know torch DDP is recommended now, but is there any workaround for this?
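One possible workaround is to deepcopy the underlying module directly (`wrapper.module`) instead of the wrapper, since the wrapper may hold state (process groups, communication hooks) that does not survive copying. A minimal stdlib-only sketch with toy stand-in classes (no apex or torch involved, names are illustrative):

```python
import copy


class ToyModel:
    """Stand-in for an nn.Module holding some parameters."""
    def __init__(self):
        self.weight = [1.0, 2.0]


class ToyWrapper:
    """Stand-in for a DDP-style wrapper exposing the model as `.module`."""
    def __init__(self, module):
        self.module = module


model = ToyModel()
wrapper = ToyWrapper(model)

# Deepcopy the inner module rather than the wrapper itself.
clone = copy.deepcopy(wrapper.module)
assert clone is not model
assert clone.weight == model.weight
```

Whether this is sufficient depends on what wrapper state you need to preserve; if you only need the model weights, copying `.module` (or round-tripping through `state_dict()`) avoids the failing deepcopy of the apex wrapper.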
Check out https://discuss.pytorch.org/t/torch-cuda-amp-vs-nvidia-apex/74994/2