openai / guided-diffusion


is distributed training on multiple CPUs possible? #113

Open CharlesJu1 opened 1 year ago

CharlesJu1 commented 1 year ago

In train_util.py, there is a warning message: "Distributed training requires CUDA. Gradients will not be synchronized properly!" But torch.distributed supports distributed training on CPUs. So what does this warning message mean? For reference, a minimal sketch of what I mean by CPU-only distributed training with DistributedDataParallel and the gloo backend is shown below (the model and script name are illustrative, not from this repo).
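
```python
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # Launched via torchrun, which sets RANK / WORLD_SIZE / MASTER_ADDR / MASTER_PORT.
    # The gloo backend runs on CPU; NCCL is the backend that requires GPUs.
    dist.init_process_group(backend="gloo")

    model = torch.nn.Linear(10, 1)   # toy model for illustration, not the repo's UNet
    ddp_model = DDP(model)           # no device_ids when running on CPU

    opt = torch.optim.SGD(ddp_model.parameters(), lr=0.1)
    x = torch.randn(8, 10)
    loss = ddp_model(x).sum()
    loss.backward()                  # DDP all-reduces gradients across ranks here
    opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Run with, e.g., `torchrun --nproc_per_node=2 cpu_ddp_example.py` (hypothetical file name) to get two CPU processes whose gradients are synchronized by DDP.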