ChenhongyiYang / PGD

[ECCV 2022] Prediction-Guided Distillation for Dense Object Detection
MIT License

Configs of Distilling to a Lightweight Backbone. #2

Closed 0russ closed 2 years ago

0russ commented 2 years ago

Hello, thanks for sharing your excellent work on detection distillation! I am very impressed with your paper, and I found that your distillation method achieves outstanding performance on lightweight backbones. Could you provide the training config for distilling to a lightweight backbone? Thanks a lot!


ChenhongyiYang commented 2 years ago

Hello. Thank you for trying our code. We have updated the configs for MobileNetv2.

Please check: pgd_atss_r101_mb2_1x.py

0russ commented 2 years ago

> Hello. Thank you for trying our code. We have updated the configs for MobileNetv2.
>
> Please check: pgd_atss_r101_mb2_1x.py

Thanks for your reply! But I still have two questions:

  1. The mbv2 student and the r101 teacher have different in_channel numbers at the FPN. How do you handle this mismatch?
  2. Is the distillation setting for fcos_mbv2 the same as for atss_mbv2? Could you please update its training config as well?
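On question 1, a common way to bridge a channel-width mismatch in feature distillation (not necessarily what PGD itself does; in MMDetection-style setups both necks usually project to the same FPN `out_channels`, e.g. 256, so no extra layer is needed) is a learned 1x1 convolution "adapter" on the student features. A 1x1 conv is just a per-pixel linear map over channels, so it can be sketched with a plain matrix multiply; the channel sizes below are illustrative:

```python
import numpy as np

def adapt_channels(feat, weight):
    """Align student feature channels to the teacher's width.

    feat:   (C_in, H, W) student feature map
    weight: (C_out, C_in) adapter weights (a 1x1 conv kernel)
    returns (C_out, H, W)
    """
    c_in, h, w = feat.shape
    # A 1x1 convolution is a channel-wise matmul at every spatial location.
    out = weight @ feat.reshape(c_in, h * w)
    return out.reshape(weight.shape[0], h, w)

# Toy example: map a hypothetical 96-channel MobileNetV2 stage to a
# 256-channel teacher feature so an L2 distillation loss can be computed.
rng = np.random.default_rng(0)
student_feat = rng.standard_normal((96, 8, 8))
adapter_w = rng.standard_normal((256, 96)) * 0.01
aligned = adapt_channels(student_feat, adapter_w)
print(aligned.shape)  # (256, 8, 8)
```

In a real training setup the adapter would be an `nn.Conv2d(c_in, c_out, kernel_size=1)` trained jointly with the student, and it is discarded after distillation.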