irfanICMLL / structure_knowledge_distillation

The official code for the paper 'Structured Knowledge Distillation for Semantic Segmentation' (CVPR 2019 oral) and its extension to other tasks.
BSD 2-Clause "Simplified" License

Can't reproduce the results for 'MobileNetv2Plus' #13

Closed idealwei closed 4 years ago

idealwei commented 4 years ago

Hi, I kept the default settings of this code and replaced the student model with MobileNetV2Plus from LightNet. However, the results with distillation are even worse than those without it. Can you share more details about how to train with MobileNetV2Plus? It would be very helpful.

irfanICMLL commented 4 years ago

You can use the original training settings from LightNet, or you can try adjusting the loss weights for your situation. It is weird that the distillation results are worse; we always get better results with the distillation terms.
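
For reference, below is a minimal sketch of what "adjusting the loss weight" could look like for the pixel-wise distillation term. This is not the repo's exact implementation; the function name `distillation_loss`, the weight `lambda_kd`, and the temperature `T` are illustrative placeholders you would tune for your own student (e.g. MobileNetV2Plus).

```python
# A hedged sketch of a weighted pixel-wise distillation loss for segmentation.
# `lambda_kd` and `T` are assumed hyperparameters, not the repo's defaults.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      lambda_kd=10.0, T=1.0, ignore_index=255):
    """Cross-entropy on ground truth plus a pixel-wise KL term to the teacher.

    student_logits, teacher_logits: (N, C, H, W) raw class scores.
    labels: (N, H, W) ground-truth class indices.
    """
    # Standard task loss against the ground truth.
    ce = F.cross_entropy(student_logits, labels, ignore_index=ignore_index)

    # Pixel-wise KD: KL(teacher || student) over the class dimension,
    # with both logit maps softened by the temperature T. The teacher is
    # detached so gradients only flow into the student.
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits.detach() / T, dim=1)
    kd = F.kl_div(log_p_student, p_teacher, reduction='none')
    kd = kd.sum(dim=1).mean() * (T * T)  # sum over classes, mean over pixels

    # lambda_kd controls how strongly the teacher supervises the student;
    # if distillation hurts a new student, lowering it is the first thing to try.
    return ce + lambda_kd * kd
```

If the distilled student underperforms the plain baseline, sweeping `lambda_kd` downward (and checking that the teacher's output resolution matches the student's before computing the KL term) is a reasonable first debugging step.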