tengshaofeng / ResidualAttentionNetwork-pytorch

A PyTorch implementation of Residual Attention Network. This code is based on two projects from …

An Input Size Question #29

Closed: onlyonewater closed this issue 4 years ago

onlyonewater commented 4 years ago

Hi @tengshaofeng, thanks. I have a question, though: in attention_module.py, the input size of the class AttentionModule_stage0 is 112×112, but in the class AttentionModule_stage1 the input size is 56×56. Is there a max-pooling layer used in between? I don't think this is mentioned in the paper.
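For context, here is a minimal sketch of how the spatial size can drop from 112×112 to 56×56 between two attention stages in this kind of network. It assumes the downsampling is done by a stride-2 residual unit placed between the stage-0 and stage-1 attention modules (rather than an explicit MaxPool2d); the class and variable names below are illustrative, not taken from this repository.

```python
import torch
import torch.nn as nn

class SimpleResidualUnit(nn.Module):
    """Simplified residual unit; with stride=2 it halves the spatial size."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, stride=1, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
        )
        # 1x1 projection so the skip connection matches shape when stride > 1
        self.skip = nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False)

    def forward(self, x):
        return torch.relu(self.body(x) + self.skip(x))

# Hypothetical example: a 112x112 feature map (as fed to a stage-0 attention
# module) passes through a stride-2 residual unit and comes out 56x56, the
# size expected by a stage-1 attention module. No max-pooling layer needed.
x = torch.randn(1, 128, 112, 112)
down = SimpleResidualUnit(128, 256, stride=2)
print(down(x).shape)  # torch.Size([1, 256, 56, 56])
```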

onlyonewater commented 4 years ago

Sorry, I have figured out the answer to this question. Please close this issue. Thanks!