Jongchan / attention-module

Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)"

Element-wise summation #1

Open maria8899 opened 6 years ago

maria8899 commented 6 years ago

Hi, in your paper you say you chose element-wise summation to combine the two branches; however, in your code you are using a product:

https://github.com/Jongchan/attention-module/blob/1a23ae52aa4669ad655c41fc2bb6957ce5d70f6e/MODELS/bam.py#L48
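(For reference, the paper defines the BAM attention map as `M(F) = σ(M_c(F) + M_s(F))` and refines the feature as `F' = F + F ⊗ M(F)`; the linked line instead combines the two branches with a product.)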

Could you elaborate on this? Thanks

splinter21 commented 5 years ago

Look at the ablation study part in the paper: sum gets better performance than prod, so maybe you can modify the prod to a sum.
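A minimal sketch of that change, assuming simplified stand-in gate modules (the names `ChannelGate`/`SpatialGate` and their internals here are illustrative, not the repo's actual definitions):

```python
import torch
import torch.nn as nn


class ChannelGate(nn.Module):
    # Illustrative channel branch: global average pool -> bottleneck MLP,
    # broadcast back over the spatial dimensions.
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        y = self.mlp(x.mean(dim=(2, 3)))        # (B, C)
        return y.view(b, c, 1, 1).expand_as(x)  # (B, C, H, W)


class SpatialGate(nn.Module):
    # Illustrative spatial branch: 1x1 conv down to a single map,
    # broadcast back over the channel dimension.
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x):
        return self.conv(x).expand_as(x)        # (B, C, H, W)


class BAM(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.channel_att = ChannelGate(channels)
        self.spatial_att = SpatialGate(channels)

    def forward(self, x):
        # Paper: M(F) = sigmoid(Mc(F) + Ms(F)) -- element-wise sum
        att = torch.sigmoid(self.channel_att(x) + self.spatial_att(x))
        # The linked line combines the branches with a product instead:
        # att = torch.sigmoid(self.channel_att(x) * self.spatial_att(x))
        return x + x * att                      # F' = F + F * M(F)


x = torch.randn(2, 64, 32, 32)
print(BAM(64)(x).shape)  # torch.Size([2, 64, 32, 32])
```

One practical difference: with the product, a near-zero response in either branch suppresses the combined map everywhere, while the sum lets each branch contribute independently before the sigmoid.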

sankin1770 commented 5 years ago

How accurate is your BAM-based model on CIFAR-100?