irfanICMLL / structure_knowledge_distillation

The official code for the paper 'Structured Knowledge Distillation for Semantic Segmentation' (CVPR 2019 oral) and its extension to other tasks.
BSD 2-Clause "Simplified" License
701 stars 104 forks

Why structure knowledge distillation is effective? #50

Open xiaozhimabing opened 3 years ago

xiaozhimabing commented 3 years ago

I want to know why structured knowledge distillation is effective and how it can be used for regression tasks. Also, how should one choose the intermediate feature maps for pair-wise knowledge distillation? Can anyone help me with these questions?

irfanICMLL commented 3 years ago

Because it considers the correlation among pixels. If the unary (per-pixel) part is hard to learn or cannot be trained effectively, adding the structured KD loss will help training. I tend to choose deeper features, because their abstract semantics make more sense for pair-wise similarity; moreover, their smaller spatial size makes the computation more efficient.
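To make the "correlation among pixels" idea concrete, here is a minimal NumPy sketch of a pair-wise similarity distillation loss in the spirit of the paper: build a cosine-similarity matrix over all spatial positions of a feature map for both teacher and student, then penalize the squared difference between the two matrices. The function names and the toy shapes are illustrative assumptions, not the repo's actual implementation.

```python
import numpy as np

def pairwise_similarity(feat):
    """feat: (C, H, W) feature map -> (H*W, H*W) cosine-similarity matrix."""
    c, h, w = feat.shape
    f = feat.reshape(c, h * w).T                              # (H*W, C), one row per pixel
    f = f / (np.linalg.norm(f, axis=1, keepdims=True) + 1e-8)  # L2-normalize each pixel vector
    return f @ f.T                                             # cosine similarity between all pixel pairs

def pairwise_distillation_loss(student_feat, teacher_feat):
    """Mean squared error between student and teacher similarity matrices."""
    a_s = pairwise_similarity(student_feat)
    a_t = pairwise_similarity(teacher_feat)
    return float(np.mean((a_s - a_t) ** 2))

# Toy example: deep, low-resolution feature maps (as recommended above),
# so the (H*W)^2 similarity matrix stays small.
rng = np.random.default_rng(0)
teacher = rng.standard_normal((16, 8, 8))
student = rng.standard_normal((16, 8, 8))
loss = pairwise_distillation_loss(student, teacher)
```

Note that the cost of the similarity matrix grows as (H*W)^2, which is another reason the deeper, spatially smaller feature maps are the practical choice for this loss.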

xiaozhimabing commented 3 years ago

thank you!!!