Closed huanghoujing closed 6 years ago
Hi, Tong Xiao.
In the `forward` function of `ResNet` or `InceptionNet`, I am confused by the following lines:

```python
if self.norm:
    x = F.normalize(x)
elif self.has_embedding:
    x = F.relu(x)
```
Why should `normalize` and `relu` be exclusive to each other? Should the `elif` here be `if` instead?
Waiting for your response. Thanks a lot!
Normalization makes the features unit vectors. Applying ReLU afterwards would violate that constraint.
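To make the point concrete, here is a small sketch with plain-Python stand-ins for `F.normalize` and `F.relu` (the helper names are illustrative, not from the repo): applying ReLU after L2 normalization zeroes the negative entries, so the result is no longer a unit vector.

```python
import math

def l2_normalize(v):
    # mirrors F.normalize: scale the vector to unit L2 norm
    n = math.sqrt(sum(a * a for a in v))
    return [a / n for a in v]

def relu(v):
    # mirrors F.relu: zero out negative entries
    return [max(a, 0.0) for a in v]

x = [3.0, -4.0]
u = l2_normalize(x)                           # [0.6, -0.8]
r = relu(u)                                   # [0.6, 0.0]
print(math.sqrt(sum(a * a for a in u)))       # 1.0  (unit norm)
print(math.sqrt(sum(a * a for a in r)))       # 0.6  (norm broken by ReLU)
```

This is why the two branches are written as mutually exclusive: whichever of the two transforms runs last determines the property of the output, and running both would defeat the purpose of normalization.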
Thank you! I understand it now.