greatlog / SWAHR-HumanPose

Bottom-up Human Pose Estimation

Is it necessary to add nn.ReLU(True) in the fuse_layers? #3

Closed · jin-s13 closed this issue 3 years ago

jin-s13 commented 3 years ago

Thanks for releasing the code. Comparing SWAHR-HumanPose with HigherHRNet, I noticed that some nn.ReLU layers are added in the fuse_layers. What happens if we do not modify the original backbone network? Will it affect the final performance?

greatlog commented 3 years ago

No, it will not. The ReLU layer is used as a placeholder here and acts the same as 'None' in the original version. However, since a 'None' object cannot be included in a JIT model, we replace it with a ReLU layer.
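
For illustration, here is a minimal sketch of that placeholder pattern (a hypothetical TinyFuse module, not the repo's actual fuse_layers code): TorchScript cannot script an nn.ModuleList that contains None entries, so a parameter-free nn.ReLU stands in for "no transform". Because the features reaching the fuse stage have already passed through a ReLU and are non-negative, the extra ReLU leaves them unchanged.

```python
import torch
import torch.nn as nn

# Hypothetical minimal module illustrating the placeholder trick; names and
# shapes are made up for this example, not taken from SWAHR-HumanPose.
class TinyFuse(nn.Module):
    def __init__(self, channels: int = 32):
        super().__init__()
        self.fuse_layers = nn.ModuleList([
            nn.ReLU(True),                                 # placeholder, was None
            nn.Conv2d(channels, channels, 1, bias=False),  # an actual fuse op
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = x
        for layer in self.fuse_layers:  # TorchScript unrolls this loop
            out = layer(out)
        return out

# Scripting succeeds because every entry in fuse_layers is a Module.
scripted = torch.jit.script(TinyFuse())
y = scripted(torch.relu(torch.randn(1, 32, 8, 8)))  # post-ReLU input, as in HRNet
```

An nn.Identity module would serve the same purpose; nn.ReLU is equivalent here only because its inputs are already non-negative (ReLU is idempotent), so it adds no parameters and changes no values.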

jin-s13 commented 3 years ago

Thanks for the clarification.