tengshaofeng / ResidualAttentionNetwork-pytorch

A PyTorch implementation of Residual Attention Network. This code is based on two projects from

In the AttentionModule_stage1_cifar module, the original paper's structure does not include adding out_trunk after upsampling, right? See the line below. #11

Closed. sankin1770 closed this issue 5 years ago.

sankin1770 commented 5 years ago

out_interp2 = self.interpolation2(out_up_residual_blocks1) + out_trunk
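For context, a minimal sketch of the step being questioned, assuming interpolation2 is a bilinear upsampling layer as its name suggests (tensor shapes are illustrative, not the exact repository code):

```python
import torch
import torch.nn as nn

# Assumption: interpolation2 is a bilinear upsampling layer.
interpolation2 = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True)

out_up_residual_blocks1 = torch.randn(1, 64, 8, 8)    # soft-mask branch features
out_trunk = torch.randn(1, 64, 16, 16)                # trunk branch features

# As written in this repository: upsample, then element-wise add the trunk.
out_interp2 = interpolation2(out_up_residual_blocks1) + out_trunk

# The variant questioned in this issue: upsample only, no trunk addition.
out_interp2_no_skip = interpolation2(out_up_residual_blocks1)
```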

tengshaofeng commented 5 years ago

Hmm, the paper doesn't mention this step. It follows the Caffe version, which in turn borrows from FCN. You can remove it and give it a run; remember to come back and share the results.

manmanCover commented 5 years ago

> Hmm, the paper doesn't mention this step. It follows the Caffe version, which in turn borrows from FCN. You can remove it and give it a run; remember to come back and share the results.

https://github.com/fwang91/caffe/blob/d0f0f9586fbb8ff8b084cc7dbb834d52f6d24bf0/src/caffe/layers/interp.cpp

I looked at the Caffe source code; it seems the interp layer resizes the output to the trunk's spatial size, rather than performing an element-wise sum.
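If that reading is correct, the PyTorch equivalent would be to interpolate the mask-branch output to the trunk's spatial size without summing the two tensors. A hedged sketch, reusing the tensor names from the line quoted in this issue with illustrative shapes:

```python
import torch
import torch.nn.functional as F

out_up_residual_blocks1 = torch.randn(1, 64, 8, 8)    # soft-mask branch output
out_trunk = torch.randn(1, 64, 16, 16)                # trunk branch output

# Reading of the Caffe interp layer suggested above: interpolate the output
# to the trunk's spatial size, with no element-wise sum.
out_interp2 = F.interpolate(
    out_up_residual_blocks1,
    size=out_trunk.shape[2:],      # match the trunk's H x W
    mode='bilinear',
    align_corners=True,
)
```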

tengshaofeng commented 5 years ago

@manmanCover, do you mean the Caffe version actually does not include this addition? If it is removed, how does the model perform?