Open tonybaigang opened 7 years ago
Obviously, it is very hard to train a triplet loss network. Until now, apart from the original Google FaceNet, I have not seen any case of very successful triplet loss training.
So I think the performance is reasonable, although I have not trained it myself.
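For readers new to the discussion, the loss being talked about can be sketched as follows. This is a minimal NumPy illustration of the FaceNet-style triplet loss, not the repository's actual implementation; the function name and the margin value 0.2 are illustrative assumptions.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, alpha=0.2):
    # Squared Euclidean distance anchor->positive and anchor->negative.
    pos_dist = np.sum(np.square(anchor - positive), axis=-1)
    neg_dist = np.sum(np.square(anchor - negative), axis=-1)
    # Loss is zero once the negative is farther than the positive
    # by at least the margin alpha; otherwise it is positive.
    return np.maximum(pos_dist - neg_dist + alpha, 0.0)

# Toy L2-normalized embeddings (FaceNet normalizes its embeddings).
a = np.array([1.0, 0.0])
p = np.array([1.0, 0.0])   # same as the anchor: pos_dist = 0
n = np.array([0.0, 1.0])   # orthogonal to the anchor: neg_dist = 2
print(triplet_loss(a, p, n))  # prints 0.0: the margin is already satisfied
```

The training difficulty mentioned above comes largely from triplet selection: with randomly chosen triplets most losses are zero, as in the example, so the model gets no gradient unless hard or semi-hard triplets are mined.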
Hi, I am a neophyte. I don't understand why batch_number exceeds epoch_size.
Yes, I get 89% accuracy.
@tonybaigang Please look at the wiki; David mentions this there.
I am trying the training and following the "Triplet loss training" wiki, but I do not know how to produce the data at ~/datasets/casia/casia_maxpy_mtcnnalign_182_160. I aligned the CASIA data to 160 but got this error: "Input to reshape is a tensor with 4096 values, but the requested shape requires a multiple of 384." I hope someone can help me get the training running.
I tried to train the model myself. I downloaded CASIA-WebFace, which had already been cleaned, and then followed the instructions in the "Triplet loss training" wiki to train the model. But the accuracy on LFW seems to be 88%; is that right?