irfanICMLL / structure_knowledge_distillation

The official code for the paper 'Structured Knowledge Distillation for Semantic Segmentation' (CVPR 2019 Oral), with extensions to other tasks.
BSD 2-Clause "Simplified" License

Why does the patch size turn out to be half of the original feature map size in CriterionPairWiseforWholeFeatAfterPool? #51

Closed Ssssseason closed 3 years ago

Ssssseason commented 4 years ago

Table 2 shows the method performs best when beta = 2x2. But in run_train_val.sh, pool-scale is 0.5, so in CriterionPairWiseforWholeFeatAfterPool the patch size ends up being half of the original feature map size, not 2x2.

DHuiTnut commented 3 years ago

issue31 might be the answer

Ssssseason commented 3 years ago

> issue31 might be the answer

Not the same question. That issue is about alpha, but what I am asking about is the value of beta.

irfanICMLL commented 3 years ago

Beta 2x2 means we use the feature maps in a 2x2 area as one node. It is implemented by pooling with scale 0.5.
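
For anyone else reading this later, here is a minimal sketch of how a pool-scale hyperparameter can be turned into a pooling step before the pairwise loss. The function name `pool_to_nodes`, the use of max pooling, and the example shapes are illustrative assumptions, not a copy of the repo's implementation; they are only meant to show how a scale of 0.5 and a small grid of pooled nodes relate.

```python
import torch
import torch.nn as nn


def pool_to_nodes(feat, pool_scale=0.5):
    """Aggregate spatial features into nodes before a pairwise loss.

    With pool_scale = 0.5 the kernel covers half of the feature map in
    each dimension, so the pooled output is a 2x2 grid of nodes and each
    node summarizes one quadrant of the original map.
    (Sketch only; kernel/stride handling in the repo may differ.)
    """
    h, w = feat.shape[2], feat.shape[3]
    kh = max(1, int(h * pool_scale))
    kw = max(1, int(w * pool_scale))
    pool = nn.MaxPool2d(kernel_size=(kh, kw), stride=(kh, kw), ceil_mode=True)
    return pool(feat)


# Hypothetical example: a 1x512x64x64 feature map pooled with scale 0.5
feat = torch.randn(1, 512, 64, 64)
print(pool_to_nodes(feat, 0.5).shape)  # torch.Size([1, 512, 2, 2])
```

Under this reading, the kernel being half of the feature map is not a contradiction of Table 2: it is what produces the small grid of pooled nodes that the pairwise term is computed over.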