Hi @yukang2017 @yanwei-li, I have already tried this (`find_unused_parameters=True`), but the same error persists. Do you have any solutions?
if dist_train:
    model = nn.parallel.DistributedDataParallel(
        model,
        device_ids=[cfg.LOCAL_RANK % torch.cuda.device_count()],
        find_unused_parameters=True,
    )
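For context, this is a minimal debugging sketch (not from this repo; `model` is a placeholder for the wrapped network) showing how one could list the parameters that never receive a gradient after a backward pass, since those are usually what DDP complains about even with `find_unused_parameters=True`:

```python
import torch

def report_unused_parameters(model: torch.nn.Module) -> None:
    # Parameters whose .grad is still None after backward() never took part
    # in producing the loss; DDP treats these as "unused" parameters.
    unused = [name for name, p in model.named_parameters()
              if p.requires_grad and p.grad is None]
    if unused:
        print(f"{len(unused)} parameters received no gradient:")
        for name in unused:
            print(f"  {name}")
    else:
        print("All trainable parameters received gradients.")

# Usage sketch (inside the training loop, after the first loss.backward()):
# loss.backward()
# report_unused_parameters(model.module if hasattr(model, "module") else model)
```

Running with the environment variable `TORCH_DISTRIBUTED_DEBUG=DETAIL` (PyTorch >= 1.9) can also print which parameters DDP considers unused.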