The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf
Hi,
Thank you for the repository, it makes it really easy to try out the different approaches. One question: does this repo support distributed training? If so, how do I run it in DDP mode?
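For reference, here is a minimal sketch of what I mean by "DDP mode": a standard PyTorch `DistributedDataParallel` setup launched with `torchrun`. The model, data, and script name below are placeholders of my own, not this repo's actual API, so I'm mainly asking whether something equivalent is already wired in.

```python
# Generic DDP training sketch (placeholder model/data, not this repo's code).
# Launch with: torchrun --nproc_per_node=<num_gpus> train_ddp.py
import os

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def main():
    # torchrun sets RANK / LOCAL_RANK / WORLD_SIZE for each process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Dummy model and dataset just to keep the sketch self-contained.
    model = nn.Linear(32, 10).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    dataset = TensorDataset(torch.randn(1024, 32), torch.randint(0, 10, (1024,)))
    sampler = DistributedSampler(dataset)  # shards the data across ranks
    loader = DataLoader(dataset, batch_size=64, sampler=sampler)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    criterion = nn.CrossEntropyLoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # keeps shuffling consistent across ranks
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()  # gradients are all-reduced across ranks here
            optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```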
Thanks.