Open batrlatom opened 5 years ago

Hello. I am unable to get hard triplet mining with balanced batches to work. I think we had a discussion about it here, but as far as I can tell the embeddings always collapse into a single point. I have tried many combinations of "margin", "num_classes_per_batch", and "num_images_per_class", but nothing seems to work. Could you please take a look at the code to see if there is some obvious problem? Note that with the batch_all strategy it works well. Thanks, Tom

I don't see anything wrong. If the batch all loss works and the batch hard triplet loss does not, that might indicate your dataset is a bit noisy, so the hardest triplets are mislabeled. You can also train first with batch all, then finetune at the end with batch hard.
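For reference, the batch-hard strategy under discussion can be sketched in a few lines. This is a minimal NumPy illustration, not the repository's actual code; the function name, margin default, and use of squared Euclidean distance are assumptions. It also shows why a collapsed embedding is easy to diagnose: when every embedding is identical, all distances are zero and the loss sits exactly at the margin.

```python
import numpy as np

def batch_hard_triplet_loss(embeddings, labels, margin=0.3):
    """Batch-hard triplet loss on one balanced batch (NumPy sketch).

    For each anchor, take the hardest positive (farthest same-label
    example) and the hardest negative (closest other-label example),
    then apply the hinge: max(d_pos - d_neg + margin, 0).
    """
    # Pairwise squared Euclidean distances via the dot-product identity.
    dots = embeddings @ embeddings.T
    sq = np.diag(dots)
    dist = np.maximum(sq[:, None] - 2.0 * dots + sq[None, :], 0.0)

    same = labels[:, None] == labels[None, :]
    not_self = ~np.eye(len(labels), dtype=bool)

    # Hardest positive: largest distance among same-label pairs (excluding self).
    hardest_pos = np.where(same & not_self, dist, -np.inf).max(axis=1)
    # Hardest negative: smallest distance among different-label pairs.
    hardest_neg = np.where(~same, dist, np.inf).min(axis=1)

    return np.maximum(hardest_pos - hardest_neg + margin, 0.0).mean()
```

With well-separated clusters the loss is zero; with fully collapsed embeddings every per-anchor term is exactly `margin`, which matches the symptom described above.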