Hi, I have a question about `contrast_logits = 2 + contrast_1`. What is the meaning of the 2?

```python
contrast_1 = -contrast_1 * torch.zeros(bsz, bsz).fill_diagonal_(1).type_as(outputs) + (
    (1 - contrast_1).log()) * torch.ones(bsz, bsz).fill_diagonal_(0).type_as(outputs)
```

This puts the (negative) similarity on the diagonal for `<pi, pi>`, and `(1 - sim).log()` off the diagonal for `<pi, pj>` or `<pj, pi>`. But `2 + contrast_1` cannot make `contrast_logits` positive: the `<pi, pi>` similarity is in [0, 1], while `(1 - sim).log()` ranges over (-inf, 0).
Hi, the "2" here follows the BYOL loss (Eq. 2). If you use the SimSiam loss (Eq. 1) instead, there is no "2". The loss functions are simply set up to be consistent with previous work; we don't intend to make `contrast_logits` positive.
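For anyone else reading this thread, here is a minimal sketch of the construction being discussed: negative similarity on the diagonal, `log(1 - sim)` off the diagonal, then the BYOL-style "+2" offset. The function name and the assumption that `p1`, `p2` are L2-normalized projection batches are illustrative, not the repo's exact code.

```python
import torch


def contrastive_logits(p1, p2):
    """Sketch: -sim for <pi, pi>, log(1 - sim) for <pi, pj>, plus the BYOL offset.

    Assumes p1, p2 are L2-normalized projections of shape (bsz, dim)
    from two augmented views, so diagonal similarities stay below 1.
    """
    bsz = p1.size(0)
    sim = p1 @ p2.t()  # cosine similarities, each entry in [-1, 1]

    # Mask selecting only the diagonal (<pi, pi> pairs)
    diag = torch.zeros(bsz, bsz).fill_diagonal_(1).type_as(sim)
    # Mask selecting only the off-diagonal (<pi, pj> pairs)
    off = torch.ones(bsz, bsz).fill_diagonal_(0).type_as(sim)

    # -sim on the diagonal, log(1 - sim) off the diagonal
    contrast_1 = -sim * diag + (1 - sim).log() * off

    # The "+2" follows the BYOL loss (Eq. 2); it does not force the
    # result positive, since log(1 - sim) is unbounded below.
    return 2 + contrast_1
```

Note that `(1 - sim).log()` is evaluated on the diagonal too before being masked out, so this sketch relies on diagonal similarities being strictly below 1 (true for distinct augmented views, since `log(0) * 0` would give NaN).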
https://github.com/liyi01827/noisy-contrastive/blob/03f48ca3028f4a6fc54b6faaf05ad811b4055381/main.py#L216