wwma opened this issue 2 years ago
Hello @yyou1996, thanks for your great work! I have a question about GraphCL in transferLearning_MoleculeNet_PPI/chem. When I run the code, the loss at each epoch is a negative number. I wonder if I did something wrong, or whether that is expected. Thanks!
Hi @wwma,

According to my understanding, loss = -torch.log(pos_sim / neg_sim), so a negative loss means pos_sim / neg_sim > 1. This can happen when batch_size is small, the augmentation is weak, or the temperature T is set small (e.g. <= 0.1). It looks fine to me. A simple way to turn the loss positive is to increase batch_size or raise the T factor in the CL loss.
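
For reference, here is a minimal sketch of a contrastive loss of this form, assuming the convention described above where the positive pair is excluded from the denominator (the function name `nt_xent_loss` and the toy data are illustrative, not the repo's actual code):

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, T=0.1):
    """Contrastive loss of the form -log(pos_sim / neg_sim).

    z1, z2: (batch_size, dim) embeddings of two augmented views.
    Because the positive similarity is excluded from the denominator,
    pos_sim / neg_sim can exceed 1, so the loss can go negative.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)

    sim = torch.exp(torch.mm(z1, z2.t()) / T)  # (B, B) exp-similarity matrix
    pos_sim = sim.diag()                       # positives: matching pairs
    neg_sim = sim.sum(dim=1) - pos_sim         # negatives: all other pairs

    return -torch.log(pos_sim / neg_sim).mean()

# A small batch plus a small T tends to drive the loss negative,
# as described in the reply above.
z1 = torch.randn(8, 64)
z2 = z1 + torch.randn(8, 64)  # correlate the views so positives score higher
print(nt_xent_loss(z1, z2, T=0.1))  # typically negative
print(nt_xent_loss(z1, z2, T=1.0))  # a larger T pushes it positive
```

With batch size 8 and T=0.1, the single positive term exp(pos/T) easily dominates the sum over only seven negatives, so the ratio exceeds 1 and the loss goes negative; raising T or adding more negatives (a larger batch) shrinks the ratio back below 1.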