Closed: jin-s13 closed this issue 3 years ago.
No, the ReLU layer is only used as a placeholder here and acts the same as `None` in the original version. However, since a `None` object cannot be included in a JIT model, we replace it with a ReLU layer.
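For readers less familiar with the issue, here is a minimal, hypothetical sketch of the pattern being described (not the repository's actual code, and the class and channel names are illustrative): the original HigherHRNet-style fuse layer keeps `None` in the slot where no transform is needed and skips it in `forward()`, while the JIT-friendly variant stores a placeholder `nn.ReLU` there instead, so every slot of the `nn.ModuleList` holds a real module and the model can be scripted with `torch.jit.script`.

```python
import torch
from torch import nn


class ToyFuseLayer(nn.Module):
    """Toy stand-in for one fuse layer, assuming the pattern discussed above."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.fuse_layer = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),  # placeholder standing in for `None`
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # No `is not None` check is needed: every entry is a callable module,
        # which is what makes the module scriptable.
        for layer in self.fuse_layer:
            x = layer(x)
        return x


scripted = torch.jit.script(ToyFuseLayer())        # scripts without error
print(scripted(torch.randn(1, 32, 8, 8)).shape)    # torch.Size([1, 32, 8, 8])
```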
Thanks for the clarification.
Thanks for releasing the code. Comparing SWAHR-HumanPose with HigherHRNet, I noticed that some nn.ReLU layers are added to the fuse_layers. I wonder what happens if we do not modify the original backbone network. Will it affect the final performance?