Alibaba-MIIL / ASL

Official PyTorch implementation of the "Asymmetric Loss For Multi-Label Classification" (ICCV, 2021) paper
MIT License

Very high value of loss #60

Closed talhaanwarch closed 3 years ago

talhaanwarch commented 3 years ago

I am getting very high loss values. My model has a sigmoid in its last layer, so I removed the sigmoid from here: https://github.com/Alibaba-MIIL/ASL/blob/64890f31877892ab0bdf1ae86eaf98e60001dd50/src/loss_functions/losses.py#L81

The loss is around 100+ with each of the following settings:

loss_function = AsymmetricLoss(gamma_neg=0, gamma_pos=0, clip=0)
loss_function = AsymmetricLoss(gamma_neg=2, gamma_pos=2, clip=0)
loss_function = AsymmetricLoss(gamma_neg=2, gamma_pos=1, clip=0)

With torch BCELoss, the loss ranges from 0.4 to 0.8.
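For context, the linked line applies `torch.sigmoid` inside the loss, so `AsymmetricLoss` expects raw logits. A minimal sketch (illustrative values, not from the repo) of the double-sigmoid pitfall that arises when a model ends in a sigmoid and the loss's internal sigmoid is also left in place:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# AsymmetricLoss applies sigmoid internally (losses.py#L81), so it
# expects raw logits. If sigmoid-activated outputs are passed through
# that internal sigmoid again, probabilities get squashed toward 0.5.
logit = 4.0
p_once = sigmoid(logit)    # intended probability, close to 1
p_twice = sigmoid(p_once)  # double-sigmoid output, pulled toward 0.5
```

Removing one of the two sigmoids, as done above, avoids this; the remaining question is why the magnitudes still differ.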

mrT23 commented 3 years ago

You need to provide dramatically more details.

Give example code with BCELoss and with the ASL loss, and check the dimensions and input types. 99.99% it's a bug in your implementation.

thunanguyen commented 3 years ago

> I am getting very high loss values. My model has a sigmoid in its last layer, so I removed the sigmoid from here: https://github.com/Alibaba-MIIL/ASL/blob/64890f31877892ab0bdf1ae86eaf98e60001dd50/src/loss_functions/losses.py#L81
>
> The loss is around 100+ with each of the following settings:
>
> loss_function = AsymmetricLoss(gamma_neg=0, gamma_pos=0, clip=0)
> loss_function = AsymmetricLoss(gamma_neg=2, gamma_pos=2, clip=0)
> loss_function = AsymmetricLoss(gamma_neg=2, gamma_pos=1, clip=0)
>
> With torch BCELoss, the loss ranges from 0.4 to 0.8.

Do you use a large batch size? If yes, then a high ASL loss is normal, because it returns the sum rather than the mean, unlike PyTorch's default losses such as BCE. If not, then it is likely a bug in your implementation.
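A minimal NumPy sketch (random data, illustrative shapes) showing how the sum reduction alone can explain a 100+ loss where BCELoss, with its default mean reduction, reports well under 1:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, n_classes = 64, 20  # hypothetical multi-label setup

logits = rng.normal(size=(batch, n_classes))
targets = (rng.random((batch, n_classes)) < 0.1).astype(float)

probs = 1.0 / (1.0 + np.exp(-logits))
eps = 1e-8
# Element-wise binary cross-entropy: identical in both losses when
# gamma_neg = gamma_pos = 0 and clip = 0.
bce = -(targets * np.log(probs + eps)
        + (1 - targets) * np.log(1 - probs + eps))

mean_loss = bce.mean()  # BCELoss default reduction
sum_loss = bce.sum()    # ASL-style reduction over the whole batch
# sum_loss == mean_loss * batch * n_classes, so once
# batch * n_classes is in the thousands, a "100+" loss is expected.
```

The two values differ only by the factor `batch * n_classes`, so the gradient direction is the same; only the effective learning rate changes.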