zhaoedf closed this issue 2 years ago
What I call attentions there is simply the final feature maps of a resnet block (mod*) just before the ReLU. Here, x = ReLU(att).
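For concreteness, here is a minimal PyTorch sketch of where `att` sits in a residual block. The block layout, channel sizes, and names are illustrative assumptions, not the repo's exact code; it only shows the relationship x = ReLU(att) described above:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResBlock(nn.Module):
    """Toy residual block exposing the pre-ReLU feature maps ("att")."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        att = out + x      # final feature maps of the block, before the ReLU
        x = F.relu(att)    # the block's actual output: x = ReLU(att)
        return x, att

# Usage: the "attentions" are the returned pre-activation maps.
block = ResBlock(16)
x, att = block(torch.randn(1, 16, 8, 8))
assert torch.equal(x, F.relu(att))
```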
Oh, I understand now, thanks!
Why is that? At first I thought it might be related to inplace_abn, but then I checked the inplace_abn paper and found no relevant information.