cvqluu / Angular-Penalty-Softmax-Losses-Pytorch

Angular penalty loss functions in Pytorch (ArcFace, SphereFace, Additive Margin, CosFace)
MIT License

Question about last layer in loss function #12

Open brianw0924 opened 3 years ago

brianw0924 commented 3 years ago

I know it has to normalize the weights,

but why do we need this line:

`x = F.normalize(x, p=2, dim=1)`

Why normalize the features as well?

Charlyo commented 2 years ago

The idea is to optimize the embedding for cosine similarity.

https://arxiv.org/pdf/1811.12649.pdf

"we describe below techniques we used to achieve SOTA on the retrieval tasks including L2 normalization of embedding to optimize for cosine similarity"
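A minimal sketch of the underlying identity (plain Python, no PyTorch, with hypothetical example vectors): once both the feature vector `x` and a classifier weight row `w` are L2-normalized, their dot product is exactly the cosine of the angle between them, independent of either vector's original magnitude. This is why the angular-penalty losses normalize both sides before computing logits.

```python
import math

def l2_normalize(v):
    # Divide by the Euclidean norm; mirrors the effect of
    # F.normalize(x, p=2, dim=1) on a single row.
    norm = math.sqrt(sum(c * c for c in v))
    return [c / norm for c in v]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

# Arbitrary example feature vector and weight row (hypothetical values).
x = [3.0, 4.0]
w = [1.0, 0.0]

x_n = l2_normalize(x)
w_n = l2_normalize(w)

# Logit after normalizing both sides: a pure cosine similarity.
logit = dot(x_n, w_n)

# Cosine similarity computed directly from the raw vectors.
cosine = dot(x, w) / (math.sqrt(dot(x, x)) * math.sqrt(dot(w, w)))

print(abs(logit - cosine) < 1e-9)  # the two quantities coincide
```

Without the feature normalization, the logit would also scale with `||x||`, so the loss could shrink by inflating feature magnitudes instead of improving angles; normalizing removes that degree of freedom.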