kaidic / LDAM-DRW

[NeurIPS 2019] Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss
https://arxiv.org/pdf/1906.07413.pdf
MIT License

Focal loss would lead to nan? #17

Closed liming-ai closed 3 years ago

liming-ai commented 3 years ago

Hi @kaidic

Thanks for your fantastic work. When I tried to reproduce the focal loss result with gamma=0.5, my own focal loss implementation produced NaN losses during training, while the focal loss in this repo trains fine.

I compared the two focal loss implementations carefully: their forward passes are identical, but the model parameters diverge after the backward pass. I am quite confused; could you please give me some advice?

Thanks for your contribution again!

liming-ai commented 3 years ago

The per-sample loss returned by F.cross_entropy can come out slightly negative, which leads to a NaN loss. We can add a small value or apply F.relu() to deal with this issue. The problem showed up on my machine with PyTorch==1.8.0; hope this is helpful.
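A sketch of the fix described above, assuming a repo-style focal loss built on per-sample cross-entropy (the function name and signature here are illustrative, not the repo's exact code). The NaN arises because a negative cross-entropy value makes the base `1 - exp(-ce)` negative, and a negative number raised to a fractional gamma such as 0.5 is NaN; clamping the cross-entropy at zero removes that case.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, target, gamma=0.5):
    # Per-sample cross-entropy. In theory this is non-negative, but tiny
    # negative values were observed in practice (reported on PyTorch 1.8.0).
    ce = F.cross_entropy(logits, target, reduction='none')

    # Clamp at zero: a negative ce makes (1 - exp(-ce)) negative, and a
    # negative base raised to a fractional gamma (e.g. 0.5) yields NaN.
    ce = F.relu(ce)

    p = torch.exp(-ce)           # model's probability for the true class
    return ((1 - p) ** gamma * ce).mean()
```

Without the clamp, even a vanishingly small negative value poisons the loss, e.g. `(1 - torch.exp(torch.tensor(1e-6))) ** 0.5` is NaN, and one NaN sample makes the whole batch mean NaN.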

[screenshot attached]