Closed PES2g closed 5 years ago
Hi, the end-to-end loss performs poorly in my experiment (compared with softmax or AM-softmax). I suspect this loss is not a good option when the dataset is small
(i.e., the number of speakers is low), so I just commented out the related code.
I also saw code for the GE2E loss in the repository, which has already been commented out. Did you try the GE2E loss in your experiments? If so, how did it perform?
End-to-end losses require a large training set (more than 10k distinct identities at least, I think), which isn't easy to obtain. So no, the GE2E loss won't work well with a small dataset.
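For anyone reading along: the GE2E loss being discussed is from Wan et al., "Generalized End-to-End Loss for Speaker Verification". A rough NumPy sketch of its softmax variant is below; note that `w` and `b` are learnable parameters in the paper but are fixed here for illustration, and the leave-one-out centroid trick follows the paper's definition.

```python
import numpy as np

def ge2e_softmax_loss(emb, w=10.0, b=-5.0):
    """Softmax-variant GE2E loss (sketch).

    emb: (n_speakers, n_utts_per_speaker, dim) batch of embeddings.
    w, b: scale/offset the paper learns jointly; fixed here for illustration.
    """
    N, M, _ = emb.shape
    emb = emb / np.linalg.norm(emb, axis=-1, keepdims=True)
    centroids = emb.mean(axis=1)
    centroids = centroids / np.linalg.norm(centroids, axis=-1, keepdims=True)
    # scaled cosine similarity of every utterance to every speaker centroid
    sim = w * np.einsum('nmd,kd->nmk', emb, centroids) + b  # (N, M, N)
    # for the own-speaker column, use the leave-one-out centroid
    # (exclude the utterance itself when computing its speaker's centroid)
    for j in range(N):
        for i in range(M):
            c = (emb[j].sum(axis=0) - emb[j, i]) / (M - 1)
            c = c / np.linalg.norm(c)
            sim[j, i, j] = w * emb[j, i] @ c + b
    # cross-entropy: pull each utterance toward its own speaker's centroid
    pos = sim[np.arange(N)[:, None], np.arange(M)[None, :],
              np.arange(N)[:, None]]                        # (N, M)
    loss = np.log(np.exp(sim).sum(axis=-1)) - pos
    return loss.mean()
```

Because every batch must contain several speakers with several utterances each, and the loss only discriminates among the speakers present in the batch, it tends to need many distinct identities to produce embeddings that generalize, which matches the observation above.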