HobbitLong / SupContrast

PyTorch implementation of "Supervised Contrastive Learning" (and SimCLR incidentally)
BSD 2-Clause "Simplified" License

Why use logits without exp() and log() - torch.log(exp_logits) #101

Open Zhen-Zohn-WANG opened 2 years ago

Zhen-Zohn-WANG commented 2 years ago
    # compute log_prob
    exp_logits = torch.exp(logits) * logits_mask
    log_prob = logits - torch.log(exp_logits.sum(1, keepdim=True))

In the loss code, the logits are used without exp() and log() — why can log(exp_logits.sum(...)) be subtracted from them directly? This looks different from Eq. (2) of the SupContrast paper.

Thank you for your help.

thomascong121 commented 2 years ago

Because log(exp(logits)) = logits: taking the log of the exp gives back the original logits, so the numerator's log and exp cancel and only the raw logits remain.

QishengL commented 2 years ago

I still don't understand the calculation of the loss. Can anyone explain a little more?

xfreppihs commented 1 year ago

> I still don't understand the calculation of the loss. Can anyone explain a little more?

log(a/b) = log(a) - log(b). The term in Eq. (2) is log(exp(z_i) / sum_j exp(z_j)); applying this identity gives z_i - log(sum_j exp(z_j)), which is exactly what `logits - torch.log(exp_logits.sum(1, keepdim=True))` computes.
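A quick numeric check of that identity, sketched in plain Python (no torch needed; the logit values below are made up for illustration): the "direct" log-softmax and the rearranged form used in the repo agree term by term.

```python
import math

# hypothetical logits for one anchor against four candidates
logits = [2.0, 0.5, -1.0, 1.5]
denom = sum(math.exp(z) for z in logits)

# direct form from Eq. (2): log of the softmax probability
direct = [math.log(math.exp(z) / denom) for z in logits]

# rearranged form used in the code: z_i - log(sum_j exp(z_j))
rearranged = [z - math.log(denom) for z in logits]

# the two forms are numerically identical
for d, r in zip(direct, rearranged):
    assert abs(d - r) < 1e-12
```

The rearranged form is preferred in practice because it avoids dividing by the (possibly large) denominator before taking the log.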