ynu-yangpeng / GLMC

[CVPR2023] Global and Local Mixture Consistency Cumulative Learning for Long-tailed Visual Recognitions

Why is the loss value negative? #2

Open ug-kim opened 1 year ago

ug-kim commented 1 year ago

Hello, I have some questions about your work.

I tried training on CIFAR10-LT and CIFAR100-LT, but the loss values are negative.

Why is the loss negative?

Thank you for sharing your awesome work!

ynu-yangpeng commented 1 year ago

We use contrastive learning, and the similarity between two images can be negative. In addition, we multiply the contrastive loss by 10, so the total loss can drop below zero. If you need more information, you can debug our code.
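A minimal sketch of this effect, assuming a cosine-similarity consistency term weighted by 10 (the tensors, batch size, and 128-dim embeddings below are hypothetical, not taken from the GLMC code):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical batch: classifier logits/labels and two embedding views.
logits = torch.randn(8, 100)              # e.g. CIFAR100-LT logits
labels = torch.randint(0, 100, (8,))
z1 = torch.randn(8, 128)                  # embedding of augmented view 1
z2 = z1 + 0.1 * torch.randn_like(z1)      # view 2, close to view 1

ce = F.cross_entropy(logits, labels)      # always >= 0

# Cosine similarity lies in [-1, 1]; maximizing agreement between the two
# views means minimizing its negative, so this term can itself be negative.
consistency = -F.cosine_similarity(z1, z2, dim=1).mean()

total = ce + 10.0 * consistency           # the x10 weight can push the sum below zero
print(ce.item(), consistency.item(), total.item())
```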

ug-kim commented 1 year ago

I understand your answer. Thank you for the fast reply; I will try debugging your code.

ug-kim commented 1 year ago

I think you could try CosineEmbeddingLoss if you want a positive loss value.

https://pytorch.org/docs/stable/generated/torch.nn.CosineEmbeddingLoss.html
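A rough sketch of what that would look like (the embeddings below are hypothetical): with target = 1, CosineEmbeddingLoss computes 1 - cos(x1, x2), which stays in [0, 2] and therefore never goes negative.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

criterion = nn.CosineEmbeddingLoss()

z1 = torch.randn(8, 128)                 # hypothetical embeddings of view 1
z2 = z1 + 0.1 * torch.randn_like(z1)     # view 2, close to view 1
target = torch.ones(8)                   # 1 means "these pairs should be similar"

# With target = 1 the loss is 1 - cos(z1, z2), bounded to [0, 2].
loss = criterion(z1, z2, target)
print(loss.item())                       # small, but never negative
```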