Open ghost opened 4 years ago
When I use the N-pairs loss, the loss keeps increasing. What could be the reason?
@nmll Hard to say without more information. Perhaps your learning rate is too high?
Thanks. My learning rate is decaying, and I don't use a margin or a miner. I compute the cosine distance between the objects and then apply the N-pairs loss, but the loss increases from 1.3 to 2.0. I don't know why.
@nmll What are you passing into the N-pairs loss?
@KevinMusgrave I pass `(anchors, labels, (anchor_index, pos_index, neg_index))` into the N-pairs loss, as the function requires.
@nmll Here are some other things you can try:

1. Pass in just `anchors` and `labels`:

   ```python
   loss_func(anchors, labels)
   ```

2. See if the same thing happens with another loss function:

   ```python
   from pytorch_metric_learning.losses import ContrastiveLoss
   from pytorch_metric_learning.reducers import MeanReducer

   loss_func = ContrastiveLoss(reducer=MeanReducer())
   ```

3. Start with a lower learning rate.
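As context for why the loss value behaves the way it does: the N-pairs loss is essentially a softmax cross-entropy over anchor–positive similarities, so it grows when an anchor becomes more similar to other classes' positives than to its own. This is a minimal NumPy sketch of that objective (not the library's implementation; `n_pairs_loss` is a hypothetical name used only for illustration), which can be handy for sanity-checking what your embeddings produce:

```python
import numpy as np

def n_pairs_loss(anchors, positives):
    """Simplified N-pairs objective: softmax cross-entropy where, for each
    anchor i, the "correct class" is its own positive (row i of positives).
    anchors, positives: (N, D) arrays; pair i is (anchors[i], positives[i])."""
    sims = anchors @ positives.T                      # (N, N) similarity matrix
    sims = sims - sims.max(axis=1, keepdims=True)     # numerical stability
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    # loss is the mean negative log-probability of the true (diagonal) pairs
    return -np.mean(np.diag(log_probs))
```

If the diagonal (true-pair) similarities shrink relative to the off-diagonal ones during training, this value rises, which is one concrete way to reproduce the "loss increasing" symptom.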
Thanks a lot!
In the N-pairs loss paper there is a "hard negative class mining" strategy, but it is not implemented in this repository. Do you plan to add this mining strategy? I'm just curious.
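For reference, the idea in the paper is to build each batch from classes that are hard to tell apart, rather than from random classes. A minimal NumPy sketch of that selection step, assuming you maintain a mean embedding per class (the function name `hard_negative_classes` is hypothetical, not part of this library):

```python
import numpy as np

def hard_negative_classes(class_means, num_neg):
    """For each class, return the indices of the `num_neg` most similar
    other classes (by cosine similarity of class-mean embeddings).
    class_means: (C, D) array of per-class mean embeddings."""
    normed = class_means / np.linalg.norm(class_means, axis=1, keepdims=True)
    sims = normed @ normed.T                 # (C, C) cosine similarities
    np.fill_diagonal(sims, -np.inf)          # a class is not its own negative
    # sort descending by similarity; the hardest negatives come first
    return np.argsort(-sims, axis=1)[:, :num_neg]
```

A batch would then be formed by taking one class plus its hardest negative classes and sampling an (anchor, positive) pair from each, as the paper describes.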