ViLab-UCSD / MemSAC_ECCV2022

PyTorch code for MemSAC. To appear in ECCV 2022.
https://tarun005.github.io/MemSAC/
MIT License

loss function #2

Closed onkarkris closed 1 year ago

onkarkris commented 1 year ago

Hi, thanks for your interesting work. I have a query about the loss function in contrastive.py (lines 65-66). The denominator of the contrastiveMatrix formula on line 66 will always be one due to the softmax operation on line 65. Is that intentional, given the Supervised Contrastive loss implementation in https://github.com/HobbitLong/SupContrast? Why is softmax applied here?

```python
65 - expScores = torch.softmax(confident_sim_matrix / self.tau, dim=0)
66 - contrastiveMatrix = (expScores * mask_sim).sum(0) / (expScores.sum(0))
```
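For reference, here is a minimal sketch (with hypothetical toy tensors standing in for `confident_sim_matrix` and `mask_sim`) confirming the observation: after `torch.softmax` over `dim=0`, every column sums to 1, so the denominator `expScores.sum(0)` is a vector of ones.

```python
import torch

# Hypothetical toy stand-ins; shapes and values are illustrative only.
tau = 0.07
confident_sim_matrix = torch.randn(5, 3)        # (num_keys, num_queries)
mask_sim = torch.randint(0, 2, (5, 3)).float()  # binary same-class mask

expScores = torch.softmax(confident_sim_matrix / tau, dim=0)
print(expScores.sum(0))  # tensor([1., 1., 1.]) up to float rounding

# Hence dividing by expScores.sum(0) leaves the result unchanged:
contrastiveMatrix = (expScores * mask_sim).sum(0) / expScores.sum(0)
print(torch.allclose(contrastiveMatrix, (expScores * mask_sim).sum(0)))  # True
```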

tarun005 commented 1 year ago

We observed more gradient stability when we divide by the sum, although, as you rightly noted, the sum is always 1, so the ratio does not change. Does that answer your question?
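As a sanity check (toy tensors, not the actual MemSAC training setup), the two forms agree in the forward pass, and the autograd gradients coincide up to floating-point noise, since the gradient of a softmax column's sum with respect to the logits is analytically zero:

```python
import torch

torch.manual_seed(0)
tau = 0.07

def contrastive(sim, mask, divide=True):
    # Mirrors lines 65-66; `divide` toggles the (redundant) denominator.
    p = torch.softmax(sim / tau, dim=0)
    num = (p * mask).sum(0)
    return num / p.sum(0) if divide else num

sim_a = torch.randn(5, 3, requires_grad=True)
sim_b = sim_a.detach().clone().requires_grad_()
mask = torch.randint(0, 2, (5, 3)).float()

contrastive(sim_a, mask, divide=True).sum().backward()
contrastive(sim_b, mask, divide=False).sum().backward()

print(torch.allclose(sim_a.grad, sim_b.grad))  # expected: True
```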