bnu-wangxun / Deep_Metric

Deep Metric Learning

For the losses in your code #23

Open · themis0888 opened this issue 5 years ago

themis0888 commented 5 years ago

Thanks for sharing your great work! I have read some of the earlier questions and your answers about the losses, but I'd like to ask again to make sure.
In your code I see losses of a similar form, such as `neg_loss = 2.0 / self.alpha * torch.log(1 + torch.sum(torch.exp(self.alpha * (neg_pair - base))))` in the SemiHard loss and the DistWeighted loss. As far as I know, according to the original papers those losses are linear in the distance (in your code, the similarity), assuming they indeed come from FaceNet and "Sampling Matters in Deep Embedding Learning". Are these instead your own 'Weight' loss, which assumes the data follows a mixture-of-Gaussians distribution? Thanks :D
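For reference, this is how I read that line as a formula (writing `base` as a margin $\lambda$; this is just my reading of the code, not necessarily your notation):

$$\mathcal{L}_{\text{neg}} \;=\; \frac{2}{\alpha}\,\log\!\Big(1 + \sum_{j}\exp\big(\alpha\,(s_j - \lambda)\big)\Big)$$

where the $s_j$ are the similarities of the negative pairs of an anchor.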

bnu-wangxun commented 5 years ago

No such assumption is needed. The Weight loss is based on the idea of hard mining: focus on the harder samples, whether they come from positive or negative pairs.
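For anyone reading this later, here is a minimal sketch of how the log-sum-exp form acts as soft hard mining. The function name and the values `alpha=40.0` and `base=0.5` are illustrative choices, not necessarily the defaults used in this repository:

```python
import torch

def soft_hard_mining_neg_loss(neg_sim, alpha=40.0, base=0.5):
    """Log-sum-exp over negative-pair similarities.

    exp() amplifies pairs whose similarity exceeds `base` (hard
    negatives), so they dominate the sum, while easy negatives
    contribute almost nothing. No Gaussian assumption is involved;
    this is simply a smooth approximation of the max over pairs.
    """
    return 2.0 / alpha * torch.log(1 + torch.sum(torch.exp(alpha * (neg_sim - base))))

# Hypothetical usage: similarities of one anchor to its negative pairs.
neg_sim = torch.tensor([0.2, 0.45, 0.7])  # 0.7 is the hard negative
print(soft_hard_mining_neg_loss(neg_sim))
```

With a large `alpha`, the loss is driven almost entirely by the hardest negative, while a small `alpha` spreads the weight more evenly across all pairs.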