Closed xutting123 closed 3 years ago
Hi, I have read your paper and code and found it to be interesting work!
I have a question. In your code,
loss2 = mu * criterion(logits, labels) # (main.py line311).
I know this is used to calculate the con_loss (Eq. 3 in your paper), but why is it implemented as a cross-entropy loss with labels set to a zeros tensor?
Hi @xutting123 ,
The contrastive loss can be computed with the cross-entropy loss. In the formula, setting the labels to 0 means x[class] (i.e., x[0]) corresponds to the positive-pair term, so minimizing cross-entropy maximizes the positive-pair similarity relative to the negatives.
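Here is a minimal sketch of the equivalence (not the repo's actual code; the embedding names, temperature value, and shapes are illustrative assumptions). If the similarity logits are arranged so that column 0 holds the positive pair and the remaining columns hold negatives, then the InfoNCE-style contrastive loss is exactly cross-entropy with all-zero labels:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
batch, n_neg, dim = 4, 8, 16

# Hypothetical anchor / positive / negative embeddings (illustrative only).
anchor = F.normalize(torch.randn(batch, dim), dim=1)
positive = F.normalize(torch.randn(batch, dim), dim=1)
negatives = F.normalize(torch.randn(batch, n_neg, dim), dim=1)

tau = 0.07  # assumed temperature

# Build logits: column 0 is the positive-pair similarity, rest are negatives.
pos_sim = (anchor * positive).sum(dim=1, keepdim=True)          # (batch, 1)
neg_sim = torch.bmm(negatives, anchor.unsqueeze(2)).squeeze(2)  # (batch, n_neg)
logits = torch.cat([pos_sim, neg_sim], dim=1) / tau

# Contrastive loss written explicitly: -log( exp(pos) / sum_j exp(logit_j) )
infonce = -torch.log(torch.softmax(logits, dim=1)[:, 0]).mean()

# Same value via cross-entropy with an all-zero labels tensor,
# because class index 0 is where the positive pair sits.
labels = torch.zeros(batch, dtype=torch.long)
ce = F.cross_entropy(logits, labels)

assert torch.allclose(infonce, ce, atol=1e-6)
print("contrastive loss:", infonce.item(), "== cross-entropy:", ce.item())
```

So `criterion(logits, labels)` with a zeros tensor is just a convenient way to let PyTorch's `CrossEntropyLoss` compute the softmax-over-pairs contrastive objective.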