Open CharlesJu1 opened 1 year ago
In train_util.py, there is a warning message: "Distributed training requires CUDA. Gradients will not be synchronized properly!" But torch.distributed supports distributed training on CPUs. So what does this warning message mean?
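For context on the question: `torch.distributed` does indeed support CPU-only collectives through the `gloo` backend, so gradient synchronization without CUDA is possible in principle. A minimal sketch (assuming PyTorch is installed; single process for simplicity, a multi-process run via `torch.multiprocessing.spawn` uses the same calls):

```python
import os
import torch
import torch.distributed as dist

# The gloo backend runs entirely on CPU; no CUDA is required.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group(backend="gloo", rank=0, world_size=1)

# all_reduce is the collective DDP uses to synchronize gradients.
t = torch.ones(2)
dist.all_reduce(t, op=dist.ReduceOp.SUM)
result = t.tolist()  # world_size=1, so the sum leaves the tensor unchanged

dist.destroy_process_group()
```

So the warning in train_util.py presumably reflects an assumption in this repository's training code (e.g. a CUDA/NCCL-only synchronization path), not a limitation of `torch.distributed` itself.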