Joker316701882 / Additive-Margin-Softmax

This is an implementation of the paper "Additive Margin Softmax for Face Verification".

acc on lfw cannot be improved over 0.9 #20

Closed Carbord closed 5 years ago

Carbord commented 5 years ago

Hello, I have tried several combinations of parameters while running train.py, such as epoch_size = 200, max_epoch_num = 500, and keep_probability = 0.9, with weight_decay and batch_size set to 5e-4 and 256 as suggested. But the accuracy stays around 0.9 on the LFW dataset, so I am hoping for some advice on the training parameters. Also, I wonder whether this is because I have not used any pre-trained model. I would be grateful for a reply soon.
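For reference, besides the optimizer settings listed above, the additive-margin softmax loss itself has two hyperparameters: the scale s and the margin m. Below is a minimal NumPy sketch of the loss as described in the paper; the values s = 30 and m = 0.35 are the paper's defaults, and the function name and signature are illustrative, not taken from this repository's code.

```python
import numpy as np

def am_softmax_loss(embeddings, weights, labels, s=30.0, m=0.35):
    """Additive-margin softmax loss (sketch, not the repository's implementation).

    embeddings: (batch, dim) raw feature vectors
    weights:    (dim, n_classes) class weight matrix
    labels:     (batch,) integer class ids
    s, m:       scale and additive margin (paper defaults: 30 / 0.35)
    """
    # L2-normalize features and class weights so the logits are cosine similarities
    x = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = x @ w                                   # (batch, n_classes), values in [-1, 1]

    # Subtract the margin m only from the target-class cosine, then scale by s
    logits = s * cos
    rows = np.arange(len(labels))
    logits[rows, labels] = s * (cos[rows, labels] - m)

    # Standard cross-entropy on the margined, scaled logits
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[rows, labels].mean()
```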

Joker316701882 commented 5 years ago

@Carbord It shouldn't stay that low if you follow all the instructions in README.md. Perhaps you didn't align LFW, or you didn't train for long enough.

wxl3d commented 5 years ago

Hi, I want to know which algorithm is used to align LFW. I just used MTCNN to crop the training samples and LFW without any other operation, but I can't get high accuracy with the additive margin softmax loss. I'd appreciate your advice. Many thanks!

Carbord commented 5 years ago

> Hi, I want to know which algorithm is used to align LFW. I just used MTCNN to crop the training samples and LFW without any other operation, but I can't get high accuracy with the additive margin softmax loss. I'd appreciate your advice. Many thanks!

I just followed all the instructions in README.md, so I used align_lfw.py to align my dataset.
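The distinction the thread turns on is that cropping with MTCNN alone keeps each face in its original pose, while align_lfw.py additionally warps the face onto a canonical landmark layout, so that training and evaluation crops match. The sketch below shows that alignment step using the `mtcnn` and `scikit-image` packages; it is an assumption-laden stand-in, not the repository's actual align_lfw.py, which may use a different landmark template and output size.

```python
# pip install mtcnn opencv-python numpy scikit-image   (assumed environment)
import cv2
import numpy as np
from mtcnn import MTCNN
from skimage.transform import SimilarityTransform

# Reference 5-point template for a 96x112 crop (the commonly used SphereFace template);
# the repository's align_lfw.py may target a different size or template.
REF_LANDMARKS = np.array([
    [30.2946, 51.6963],   # left eye
    [65.5318, 51.5014],   # right eye
    [48.0252, 71.7366],   # nose tip
    [33.5493, 92.3655],   # left mouth corner
    [62.7299, 92.2041],   # right mouth corner
], dtype=np.float32)

detector = MTCNN()

def align_face(bgr_image, out_size=(96, 112)):
    """Detect one face with MTCNN and warp it onto the reference landmarks."""
    faces = detector.detect_faces(cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB))
    if not faces:
        return None
    kp = faces[0]['keypoints']
    src = np.array([kp['left_eye'], kp['right_eye'], kp['nose'],
                    kp['mouth_left'], kp['mouth_right']], dtype=np.float32)

    # Estimate a similarity transform (rotation + scale + translation) mapping
    # the detected landmarks onto the canonical template, then warp the image.
    tform = SimilarityTransform()
    tform.estimate(src, REF_LANDMARKS)
    return cv2.warpAffine(bgr_image, tform.params[:2], out_size)
```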