juntang-zhuang / ShelfNet

implementation for paper "ShelfNet for fast semantic segmentation"
MIT License

shelfnet share weights? #3

Closed freedomsb closed 5 years ago

freedomsb commented 5 years ago

I really appreciate your work. I am trying to re-implement it, but when I look at your PyTorch code I can't find any shared-weights operation, even though it is mentioned in your paper.
I notice that the shared-weights block should appear at every stage on ShelfNet's decoder side, and the in/out channels of the conv2d layers are not the same, so how can the weights of the residual block be shared?

juntang-zhuang commented 5 years ago

Thanks for your interest. The "decoder side" is defined in encoding/models/LadderNet_v6.py, where the module "BasicBlock" is used at many stages. "BasicBlock" is a modification of the standard BasicBlock in ResNet, but it contains only one conv layer (self.conv1). self.conv1 is applied twice in the forward pass, and that is where the weight sharing happens.
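To illustrate the idea, here is a minimal sketch of such a block (not the repo's exact code; the real BasicBlock in encoding/models/LadderNet_v6.py also includes dropout and other details, and the class/parameter names below are assumed):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedBasicBlock(nn.Module):
    """Residual block whose single conv layer is applied twice in
    forward(), so both applications share the same weights.
    Sketch only -- illustrative, not the author's exact implementation."""

    def __init__(self, planes):
        super().__init__()
        # One conv with equal in/out channels so it can be reused.
        self.conv1 = nn.Conv2d(planes, planes, kernel_size=3,
                               padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(planes)
        self.bn2 = nn.BatchNorm2d(planes)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        # self.conv1 reused here: this second call shares its weights
        # with the first call above.
        out = self.bn2(self.conv1(out))
        return F.relu(out + x)
```

Because the block reuses one conv, it only works when the input and output channel counts match; the channel changes between decoder stages would happen outside the block (e.g. in separate 1x1 convs), which is presumably why the shared conv itself can keep equal in/out channels.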

juntang-zhuang commented 5 years ago

Just to clarify: although Bottleneck is defined in the same file, it is in fact never used.

freedomsb commented 5 years ago

Ok, got it. I misunderstood your concept of a "shared-weights residual block": the weights are shared inside each residual block, not across residual blocks. Thank you for your quick reply~