arthurdouillard / CVPR2021_PLOP

Official code of CVPR 2021's PLOP: Learning without Forgetting for Continual Semantic Segmentation
https://arxiv.org/abs/2011.11390
MIT License

Why are there attentions in ResNet? #30

Closed: zhaoedf closed this issue 2 years ago

zhaoedf commented 2 years ago

[screenshot: the ResNet code where attentions are returned]

Why? At first I thought this might be related to inplace_abn, so I checked the inplace_abn paper, but I found no relevant information there.

arthurdouillard commented 2 years ago

What I call attentions there are simply the final feature maps of a ResNet block (mod*), taken just before the ReLU.

So here, x = ReLU(att).
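
For illustration, here is a minimal PyTorch sketch of that idea. It is not the repository's actual backbone (which uses inplace_abn and downsampling); the `BasicBlock` below and all its names are hypothetical. The block returns both its post-ReLU output and the pre-ReLU map, so a caller can collect the "attentions" from each block, e.g. for the intermediate-feature distillation that PLOP's loss consumes.

```python
import torch
from torch import nn


class BasicBlock(nn.Module):
    """Simplified residual block (hypothetical, for illustration only).

    Returns both the post-ReLU output and the pre-ReLU feature map,
    i.e. the "attention" in the sense used in this thread.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        # Not in-place, so `att` is left untouched when we apply the ReLU.
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        att = self.bn2(self.conv2(out)) + x  # feature map just before the final ReLU
        return self.relu(att), att           # x = ReLU(att)


# Collect one pre-ReLU "attention" per block while running the backbone.
blocks = nn.ModuleList([BasicBlock(64) for _ in range(3)])
x = torch.randn(2, 64, 32, 32)
attentions = []
for block in blocks:
    x, att = block(x)
    attentions.append(att)  # e.g. fed to a feature-distillation loss
```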

zhaoedf commented 2 years ago

> What I call attentions there are simply the final feature maps of a ResNet block (mod*), taken just before the ReLU.
>
> So here, x = ReLU(att).

Oh, I understand now. Thanks!