megvii-research / mdistiller

The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf

Training ResNet weights on custom dataset for further distilling #46

Closed MR-hyj closed 1 year ago

MR-hyj commented 1 year ago

Given a custom object detection dataset in COCO format, I would like to train a ResNet101 as a teacher model and a ResNet18 as a student model for later distillation. But I don't see a way to turn off distillation in this repo.

Is there any way to train the detection models alone, without distillation?

Zzzzz1 commented 1 year ago

You could use detectron2 to train the teacher model.
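For reference, a minimal sketch of what training the teacher with detectron2 might look like for a COCO-format dataset. The dataset names, file paths, class count, and solver settings below are placeholders to adapt to your data; the config file is the standard Faster R-CNN ResNet-101 FPN config from the detectron2 model zoo:

```python
import os

from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.data.datasets import register_coco_instances
from detectron2.engine import DefaultTrainer

# Register the custom COCO-format dataset (paths are placeholders).
register_coco_instances("my_dataset_train", {}, "annotations/train.json", "images/train")
register_coco_instances("my_dataset_val", {}, "annotations/val.json", "images/val")

cfg = get_cfg()
# Faster R-CNN with a ResNet-101 FPN backbone as the teacher.
cfg.merge_from_file(model_zoo.get_config_file("COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml"))
cfg.DATASETS.TRAIN = ("my_dataset_train",)
cfg.DATASETS.TEST = ("my_dataset_val",)
# Initialize from COCO-pretrained weights for faster convergence.
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml")
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 20  # set to the number of classes in your dataset
cfg.SOLVER.IMS_PER_BATCH = 8          # adjust to your GPU memory
cfg.SOLVER.BASE_LR = 0.01
cfg.SOLVER.MAX_ITER = 90000
cfg.OUTPUT_DIR = "./output_teacher_r101"

os.makedirs(cfg.OUTPUT_DIR, exist_ok=True)
trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```

The saved checkpoint in `cfg.OUTPUT_DIR` can then serve as the teacher weights for distillation. Note that the detectron2 model zoo ships ResNet-50/101 configs but no ResNet-18, so training the student baseline the same way would require wiring up a custom ResNet-18 backbone config.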