Joker316701882 / Additive-Margin-Softmax

This is the implementation of the paper "Additive Margin Softmax for Face Verification".
492 stars · 149 forks

Does batch normalization increase the accuracy? #14

Closed twmht closed 5 years ago

twmht commented 6 years ago

Hi,

I've tried batch normalization with resnet20 in Caffe, but I found that the test accuracy drops.

Did you compare the accuracy with and without batch normalization?

Joker316701882 commented 6 years ago

@twmht Sorry, not yet.

twmht commented 6 years ago

As you suggested, I am trying Adam now. I will report any findings here.

Joker316701882 commented 5 years ago

@twmht Hey, I updated this repo yesterday. Some bugs are fixed, and my earlier conclusion may have been wrong. I found that SGD with momentum works better than Adam for face recognition. With resface20 (with BN), LFW accuracy reaches 99.5%, so I think batch norm actually improves performance compared to the original paper. Furthermore, batch norm is a standard component in deeper models, so I suggest just using it.
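For reference, batch normalization in a conv block normalizes each channel's activations over the batch (and spatial positions), then applies a learned scale and shift. A minimal NumPy sketch of the forward pass (illustrative only; not the repo's actual TensorFlow/Caffe code, and the tensor shape is an assumption):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: (N, H, W, C) activations; normalize per channel C
    # over the batch and spatial dimensions.
    mean = x.mean(axis=(0, 1, 2), keepdims=True)
    var = x.var(axis=(0, 1, 2), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # gamma/beta are the learned per-channel scale and shift.
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(8, 4, 4, 16))
y = batch_norm(x, gamma=np.ones(16), beta=np.zeros(16))
# With gamma=1, beta=0, each channel of y has mean ~0 and variance ~1.
```

Normalizing activations this way keeps each conv block's input distribution stable during training, which is the usual explanation for why BN helps deeper models converge.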

twmht commented 5 years ago

@Joker316701882

Hi, so you just append batch norm after each convolution in resnet20, and you don't append batch norm after the fully connected layer (which is used as the embedding)?
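The question distinguishes two placements: BN after every convolution only, versus additionally applying BN to the embedding produced by the final fully connected layer. A small NumPy sketch of the two embedding variants (all shapes and names here are illustrative assumptions, not the repo's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def fc(x, w):
    # Fully connected layer producing the face embedding.
    return x @ w

def bn1d(x, eps=1e-5):
    # Batch norm over the batch dimension for FC outputs.
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

feats = rng.normal(size=(8, 64))   # features from the conv stack (BN inside)
w = rng.normal(size=(64, 512))     # embedding weights (512-d, illustrative)

emb_plain = fc(feats, w)           # variant asked about: no BN on the embedding
emb_bn = bn1d(fc(feats, w))        # alternative: BN after the embedding FC
```

Since AM-Softmax L2-normalizes the embedding before computing logits, normalizing the embedding again with BN is a separate design choice from using BN inside the conv blocks.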