Closed CYYJL closed 3 months ago

Hi, if I want to conduct distributed training, how can I implement it in your code?

We currently have no distributed training implemented. If you are interested in doing so, I would advise you to follow the official PyTorch tutorial: https://pytorch.org/tutorials/intermediate/ddp_tutorial.html?utm_source=distr_landing&utm_medium=intermediate_ddp_tutorial
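For anyone landing here later, a minimal single-node DistributedDataParallel sketch along the lines of that tutorial might look as follows. The model, data, port, and hyperparameters are placeholders for illustration, not part of this repository's code; you would swap in the repo's own model and training loop.

```python
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP


def run(rank, world_size):
    # Each spawned process joins the same process group.
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"  # placeholder port
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    # Placeholder model; replace with the repo's model (move to the
    # rank's GPU and use the "nccl" backend for multi-GPU training).
    model = torch.nn.Linear(10, 1)
    ddp_model = DDP(model)

    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss_fn = torch.nn.MSELoss()

    # Placeholder data; in practice use a DistributedSampler so each
    # rank sees a different shard of the dataset.
    inputs = torch.randn(8, 10)
    targets = torch.randn(8, 1)

    for _ in range(3):
        optimizer.zero_grad()
        loss = loss_fn(ddp_model(inputs), targets)
        loss.backward()  # gradients are all-reduced across ranks here
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = 2  # number of processes on this node
    mp.spawn(run, args=(world_size,), nprocs=world_size, join=True)
```

Launching the script spawns one process per rank; DDP synchronizes gradients during `backward()`, so each rank ends up with identical model weights after every step.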