Closed: hyBlue closed this issue 4 years ago.
I'm an idiot
@hyBlue it seems you found the answer yourself. Could you explain what NCESoftmaxLoss is trying to do here? I have the same doubt you initially had (and I'm feeling like an idiot too).
Explanation: `label = torch.zeros([bsz]).cuda().long()` sets the target class index to 0 for every sample. `nn.CrossEntropyLoss` expects class indices rather than one-hot vectors, and the positive sample's score sits at index 0 of each row of scores, so a label of 0 is exactly the one-hot target [1, 0, 0, 0, ...] you had in mind.
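In case it helps the next person, here is a tiny self-contained check (my own illustrative snippet, not code from the repo) that an index target of 0 and the one-hot target [1, 0, 0, 0, ...] describe the same loss:

```python
import torch
import torch.nn.functional as F

# Scores for one sample: the positive pair's score at index 0, then 4 negatives
scores = torch.tensor([[2.0, 0.1, -1.0, 0.3, 0.5]])

# cross_entropy takes the *index* of the correct class as the target
loss_index = F.cross_entropy(scores, torch.tensor([0]))

# Equivalent one-hot view: negative log of the softmax probability at index 0
loss_onehot = -F.log_softmax(scores, dim=1)[0, 0]

print(torch.allclose(loss_index, loss_onehot))  # True
```

So the all-zeros tensor is not saying "everything is a negative"; it points each row's target at column 0, where the positive lives.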
It's funny how easy it is to get confused even with something you think you are fairly familiar with.
@vinsis Great, you're right! Thanks a lot! (A new idiot)
triple idiot.
https://github.com/HobbitLong/CMC/blob/58d06e9a82f7fea2e4af0a251726e9c6bf67c7c9/NCE/NCECriterion.py#L35-L46
Hi, I have a question about using softmax instead of the NCE loss. In that function, every label is set to zero, including the one for the positive sample's critic value, which sits at index 0. I'd like to know why. My take is that the label should be [1, 0, 0, 0, ...]. Shouldn't it?
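For reference, the linked criterion boils down to something like the sketch below. The `[bsz, 1 + K]` score layout and the device handling are my reconstruction from this thread, so check the linked file for the exact code:

```python
import torch
import torch.nn as nn

class NCESoftmaxLoss(nn.Module):
    """Softmax-based NCE: cross-entropy over 1 + K scores per sample,
    with the positive pair's score in column 0 of each row."""
    def __init__(self):
        super().__init__()
        self.criterion = nn.CrossEntropyLoss()

    def forward(self, x):
        # x: [bsz, 1 + K] critic scores; column 0 holds the positive
        bsz = x.shape[0]
        # Targets are class *indices*: index 0 means "column 0 is correct"
        label = torch.zeros(bsz, dtype=torch.long, device=x.device)
        return self.criterion(x, label)

# Illustrative usage: 8 samples, each scored against 1 positive + 4096 negatives
scores = torch.randn(8, 1 + 4096)
loss = NCESoftmaxLoss()(scores)
```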