ArchipLab-LinfengZhang / Object-Detection-Knowledge-Distillation-ICLR2021

The official implementation of ICLR2021 paper "Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors".
MIT License

Softmax is missing in the non-local module #11

Open HIT-cwh opened 2 years ago

HIT-cwh commented 2 years ago

First of all, thank you for sharing your valuable code. As shown in Fig. 2(c) of the paper, a softmax is applied to the matrix product to form the dot-product attention, but it is missing from the NonLocalBlockND of the released code. Moreover, computing this attention matrix may be time-consuming, since its size is large (300x300 or 150x150). Looking forward to your reply. Thanks.
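
For reference, here is a minimal NumPy sketch of where the softmax would sit in dot-product attention as drawn in Fig. 2(c). The function name `nonlocal_attention` and the shapes are hypothetical illustrations, not the repo's PyTorch `NonLocalBlockND` implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def nonlocal_attention(theta, phi, g, use_softmax=True):
    # theta, phi, g: (N, C) embeddings of the N = H*W spatial positions.
    # The attention matrix f is (N, N), which is why it grows to
    # e.g. 300x300 for a 300-position feature map.
    f = theta @ phi.T
    if use_softmax:
        # The softmax normalizes each row, so the output is a convex
        # combination of the value vectors g (dot-product attention).
        # Omitting it leaves unnormalized similarity scores.
        f = softmax(f, axis=-1)
    return f @ g

rng = np.random.default_rng(0)
N, C = 8, 4
theta, phi, g = (rng.standard_normal((N, C)) for _ in range(3))
y = nonlocal_attention(theta, phi, g)
print(y.shape)  # (8, 4)
```

With `use_softmax=True`, each row of the attention matrix sums to 1; with `use_softmax=False` the code matches a non-local block that skips the normalization, as the issue describes.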