megvii-research / mdistiller

The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf

Is Distributed Training Supported? #48

Closed paganpasta closed 1 year ago

paganpasta commented 1 year ago

Hi,

Thank you for the repository; it makes it really easy to try out different approaches. One question: does this repo support distributed training? If so, how do I run it in DDP mode?

Thanks.

Zzzzz1 commented 1 year ago

Not supported yet.
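
For anyone wanting to adapt the trainer themselves in the meantime, below is a minimal sketch of the standard PyTorch DDP pattern. It is not mdistiller-specific: the model, dataset, and hyperparameters are placeholders, and you would substitute the distiller module and data loaders from this repo when adapting it.

```python
# Minimal sketch of the standard PyTorch DDP training pattern.
# Nothing here is mdistiller-specific; model and data are placeholders.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for each process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model; swap in the distiller (teacher + student) here.
    model = torch.nn.Linear(32, 10).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # Placeholder dataset; DistributedSampler shards it across ranks.
    dataset = TensorDataset(torch.randn(1024, 32),
                            torch.randint(0, 10, (1024,)))
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=64, sampler=sampler)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    criterion = torch.nn.CrossEntropyLoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()  # gradients are all-reduced across ranks
            optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Launched with, e.g., `torchrun --nproc_per_node=8 train_ddp.py`. Note that methods like DKD scale the loss with batch-dependent statistics, so per-GPU batch size and learning rate may need retuning relative to the single-GPU configs.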