tengshaofeng / ResidualAttentionNetwork-pytorch

A PyTorch implementation of Residual Attention Network. This code is based on two projects from

How to generate the masks given in the paper? #7

Closed jain-avi closed 6 years ago

jain-avi commented 6 years ago

I have a trained Residual Attention model, and I want to visualize the masks shown in Figure 1 of the paper. Any idea how the authors do that? @tengshaofeng, if you have already done it, can you share the code to actually visualize the attention masks?

tengshaofeng commented 6 years ago

I have not done that, but I think you can visualize the attention map, such as the output of self.conv1_1_blocks in AttentionModule_stage1_cifar.
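One common way to grab an intermediate activation like that without editing the model code is a forward hook. The sketch below uses a toy stand-in module rather than the repo's real AttentionModule_stage1_cifar (the class ToyAttentionModule and its mask_branch attribute are my inventions for illustration; in the real model you would register the hook on the submodule that produces out_conv1_1_blocks):

```python
import torch
import torch.nn as nn

# Toy stand-in for the repo's attention module. The real
# AttentionModule_stage1_cifar has a soft-mask branch ending in a
# sigmoid; `mask_branch` here plays that role.
class ToyAttentionModule(nn.Module):
    def __init__(self, channels=8):
        super().__init__()
        self.trunk = nn.Conv2d(channels, channels, 3, padding=1)
        self.mask_branch = nn.Sequential(      # stand-in for conv1_1_blocks
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        out_trunk = self.trunk(x)
        out_mask = self.mask_branch(x)
        return (1 + out_mask) * out_trunk

captured = {}

def save_mask(module, inputs, output):
    # Store the soft mask produced by the mask branch.
    captured["mask"] = output.detach()

model = ToyAttentionModule()
model.mask_branch.register_forward_hook(save_mask)

x = torch.randn(1, 8, 16, 16)
_ = model(x)

mask = captured["mask"]           # (1, 8, 16, 16), values in (0, 1)
heatmap = mask.mean(dim=1)[0]     # average over channels -> (16, 16)
```

From there `heatmap` can be upsampled to the input resolution and overlaid on the image (e.g. with matplotlib's `imshow` and a colormap) to reproduce the kind of visualization in Figure 1.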

tengshaofeng commented 6 years ago

For the equation `out = (1 + out_conv1_1_blocks) * out_trunk`, I think the feature before the mask is out_trunk, the attention mask is out_conv1_1_blocks, and the feature after the mask is out.
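That line is the paper's attention residual learning, H(x) = (1 + M(x)) * T(x). A minimal numeric sketch (tensor names and shapes are illustrative, not taken from the repo):

```python
import torch

# Attention residual learning: H(x) = (1 + M(x)) * T(x)
trunk = torch.randn(1, 4, 8, 8)   # trunk features T(x), "before mask"
mask = torch.rand(1, 4, 8, 8)     # soft attention mask M(x) in [0, 1]

out = (1 + mask) * trunk          # feature "after mask", H(x)

# With M(x) = 0 the module is the identity over the trunk, so the
# mask can only strengthen trunk features, never zero them out.
identity_out = (1 + torch.zeros_like(mask)) * trunk
assert torch.equal(identity_out, trunk)
```

The `1 +` term is the design choice worth noting: a plain `mask * trunk` would repeatedly attenuate features as modules stack, while the residual form keeps the trunk signal intact.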