tengshaofeng / ResidualAttentionNetwork-pytorch

a pytorch code about Residual Attention Network. This code is based on two projects from

What is the meaning of `softmax` in attention_module.py? #30

Open sydney0zq opened 4 years ago

sydney0zq commented 4 years ago

Hi, I am confused about the term `softmax_blocks` in attention_module.py. Shouldn't these be called "soft mask blocks", as in the paper? I also checked the `ResidualBlock` class, and it does not contain any normalization layers.
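For context, the "soft mask" branch in the Residual Attention Network paper (Wang et al., 2017) down-samples the feature map, processes it, up-samples back, and ends in a sigmoid so the mask M(x) lies in [0, 1]; the attention module then combines it with the trunk branch as (1 + M(x)) * T(x). Below is a minimal sketch of such a branch; the channel count, single pooling level, and layer choices are illustrative assumptions, not the exact configuration of this repo's `softmax*_blocks`:

```python
import torch
import torch.nn as nn

class SoftMaskBranch(nn.Module):
    """Minimal sketch of the 'soft mask' branch from the Residual
    Attention Network paper. The final activation is a sigmoid, so
    'soft mask' matches the paper's terminology even though the repo
    names these layers `softmax*_blocks`. One down/up-sampling level
    and 64 channels are illustrative assumptions."""
    def __init__(self, channels=64):
        super().__init__()
        self.down = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.up = nn.Upsample(scale_factor=2, mode='bilinear',
                              align_corners=False)
        # two 1x1 convs followed by sigmoid produce the mask M(x) in [0, 1]
        self.out = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        m = self.up(self.conv(self.down(x)))
        return self.out(m)

x = torch.randn(1, 64, 32, 32)
mask = SoftMaskBranch(64)(x)
# The attention module would combine this with the trunk T(x)
# as (1 + mask) * trunk_output.
print(mask.shape)  # torch.Size([1, 64, 32, 32])
```

The key point for this issue: the closing activation is a sigmoid (an element-wise gate), not a softmax over classes, which is why the paper calls it a soft *mask*.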