Closed xingwangsfu closed 7 years ago
I am writing a paper describing how to train with normalized features. You will see it on arXiv in a month.
The code will also be released after I upload the paper to arXiv.
Good to know. Thanks.
Hi @xingwangsfu , the paper is uploaded https://arxiv.org/abs/1704.06369 and the code is released at https://github.com/happynear/NormFace . Hope it helps you.
Hi,
I'm currently trying to train center-loss face recognition with an L2 normalization layer; more specifically, adding an L2 normalization layer after fc5, before feeding the features into the last FC layer. However, after adding this L2 normalization layer, the softmax loss decreases very slowly compared to training without it.
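For reference, the layer in question just rescales each feature vector to unit length. A minimal NumPy sketch of the forward pass (the `eps` constant is my own addition to avoid division by zero; it is not part of any particular framework's layer):

```python
import numpy as np

def l2_normalize(x, eps=1e-8):
    """L2-normalize each row of x (one feature vector per sample)."""
    norm = np.sqrt(np.sum(x * x, axis=1, keepdims=True)) + eps
    return x / norm

# Example: every output row has (approximately) unit L2 norm.
feats = np.array([[3.0, 4.0], [0.5, 0.5]])
normed = l2_normalize(feats)
print(np.linalg.norm(normed, axis=1))  # close to [1. 1.]
```

Because every normalized feature then lies on the unit hypersphere, the logits fed to softmax are bounded, which is likely why the loss behaves so differently from the unnormalized case.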
As others suggested in this issue, the initialization of the layers after the L2 normalization should be chosen carefully, so I tried uniform, Gaussian, and Xavier initialization. Only uniform makes the softmax loss decrease a little faster, but it is still very slow.
Do you have any ideas on how to train center-loss face recognition with an L2 normalization layer? I assume you also used this layer when training center loss on MS-Celeb-1M, since I found it in your provided prototxt file, although it is commented out.
Any suggestion is appreciated.