cvqluu / Angular-Penalty-Softmax-Losses-Pytorch

Angular penalty loss functions in Pytorch (ArcFace, SphereFace, Additive Margin, CosFace)
MIT License

Normalizing the layer weights in loss function has no effect #13

Closed. ehalit closed this issue 3 years ago.

ehalit commented 3 years ago

Hello!

The parameters of the fully connected layer in the loss function are normalized in the following way:

for W in self.fc.parameters():
    W = F.normalize(W, p=2, dim=1)

However, the weights of self.fc are not affected by this operation; I verified this by comparing print(self.fc.weight) and print(W). This means the cosine calculation is actually performed with non-normalized vectors.
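For reference, here is a minimal standalone sketch (not the repository's code) showing why the reassignment is a no-op, plus one possible alternative, which I am only assuming matches the intended behaviour: normalizing the weight functionally when the logits are computed, so the stored parameter is untouched and gradients still flow through it.

import torch
import torch.nn as nn
import torch.nn.functional as F

fc = nn.Linear(4, 3, bias=False)

# Reassigning the loop variable only rebinds the local name W;
# the parameter tensor stored inside fc is left unchanged.
for W in fc.parameters():
    W = F.normalize(W, p=2, dim=1)
print(fc.weight.norm(dim=1))  # rows are generally not unit-norm

# Hypothetical alternative: normalize both inputs and weights at call
# time, so the matrix product directly yields cosine similarities.
x = torch.randn(2, 4)
cosine = F.linear(F.normalize(x, p=2, dim=1),
                  F.normalize(fc.weight, p=2, dim=1))
print(cosine)  # entries lie in [-1, 1]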

ehalit commented 3 years ago

This seems to be a duplicate issue; see the original one.