tengshaofeng / ResidualAttentionNetwork-pytorch

A PyTorch implementation of the Residual Attention Network. This code is based on two projects from

What is the meaning of `softmax` in attention_module.py? #30

Open theodoruszq opened 4 years ago

theodoruszq commented 4 years ago

Hi, I am confused about the term `softmax_blocks`. Shouldn't this term be "soft mask blocks", as in the paper? I also checked the `ResidualBlock` class, and it does not contain any normalization layers.
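For context, the paper (Wang et al., "Residual Attention Network for Image Classification", 2017) uses a *soft mask branch* whose sigmoid output M(x) gates the trunk branch T(x) via attention residual learning, H(x) = (1 + M(x)) * T(x). Below is a minimal sketch of that formula, plus a pre-activation residual block *with* batch normalization for comparison. The class and function names here are hypothetical illustrations, not the repo's actual code, and this repo's `ResidualBlock` may intentionally differ:

```python
import torch
import torch.nn as nn

def attention_residual(trunk, mask_logits):
    """Attention residual learning from the paper:
    H(x) = (1 + M(x)) * T(x), where M(x) = sigmoid(mask_logits) is the
    output of the soft mask branch (values in (0, 1))."""
    return (1 + torch.sigmoid(mask_logits)) * trunk

class PreActResidualBlock(nn.Module):
    """Hypothetical pre-activation residual block (BN -> ReLU -> Conv),
    following He et al. (2016). Shown only to illustrate where
    normalization layers would typically appear; the repo's
    ResidualBlock is structured differently."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(in_ch)
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride,
                               padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False)
        # 1x1 projection shortcut when shape changes, identity otherwise
        self.shortcut = (nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False)
                         if stride != 1 or in_ch != out_ch else nn.Identity())
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.conv1(self.relu(self.bn1(x)))
        out = self.conv2(self.relu(self.bn2(out)))
        return out + self.shortcut(x)
```

Note that because of the `1 +` term, the mask attenuates rather than zeroes out trunk features, so stacking many attention modules does not degrade the signal the way plain masking would.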