Closed: talhaanwarch closed this issue 3 years ago.
You need to provide dramatically more details.
Give example code with BCELoss and with ASL loss, and check the dimensions and input types. 99.99% it's a bug in your implementation.
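For reference, a minimal sketch of such a comparison (assuming AsymmetricLoss is importable from this repo's src/loss_functions/losses.py, and with illustrative shapes of a batch of 32 with 10 labels) could look like this:

```python
import torch
from src.loss_functions.losses import AsymmetricLoss  # path from this repo; adjust to your setup

batch_size, num_classes = 32, 10
logits = torch.randn(batch_size, num_classes)                     # raw model outputs
targets = torch.randint(0, 2, (batch_size, num_classes)).float()  # multi-hot labels
probs = torch.sigmoid(logits)                                     # what a sigmoid last layer would produce

print(logits.shape, logits.dtype, targets.shape, targets.dtype)   # check dimensions and input types

bce = torch.nn.BCELoss()                                # averages over all elements by default
asl = AsymmetricLoss(gamma_neg=0, gamma_pos=0, clip=0)  # unmodified loss; applies sigmoid internally

print("BCELoss (mean):", bce(probs, targets).item())
print("ASL:", asl(logits, targets).item())
```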
I am getting very high loss values. I have a sigmoid in my last layer, so I removed the sigmoid from here: https://github.com/Alibaba-MIIL/ASL/blob/64890f31877892ab0bdf1ae86eaf98e60001dd50/src/loss_functions/losses.py#L81
The loss is around 100+ when I use:
loss_function = AsymmetricLoss(gamma_neg=0, gamma_pos=0, clip=0)
loss_function = AsymmetricLoss(gamma_neg=2, gamma_pos=2, clip=0)
loss_function = AsymmetricLoss(gamma_neg=2, gamma_pos=1, clip=0)
With torch BCELoss, it ranges from 0.4 to 0.8.
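As a point of comparison, a minimal sketch of using the loss without editing losses.py: keep the internal sigmoid and instead drop the sigmoid from the model head, passing raw logits to AsymmetricLoss (the linear head below is illustrative only):

```python
import torch
import torch.nn as nn
from src.loss_functions.losses import AsymmetricLoss

model = nn.Linear(128, 10)          # illustrative head without a final sigmoid
criterion = AsymmetricLoss(gamma_neg=2, gamma_pos=1, clip=0)

features = torch.randn(32, 128)
targets = torch.randint(0, 2, (32, 10)).float()

logits = model(features)            # raw scores; the loss applies sigmoid internally
loss = criterion(logits, targets)   # note: this is a sum over all 32 * 10 label entries
```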
Do you use a large batch size? If so, a high ASL loss is normal, because it returns the sum rather than the mean, unlike PyTorch's default losses such as BCE. If not, it is likely a bug in your implementation.
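As a rough sanity check of that scale argument, dividing the summed ASL value by the number of label entries should bring it into the same range as a mean-reduced BCE (sketch below; shapes are illustrative):

```python
import torch
from src.loss_functions.losses import AsymmetricLoss

criterion = AsymmetricLoss(gamma_neg=0, gamma_pos=0, clip=0)

def asl_mean(logits, targets):
    # AsymmetricLoss sums over every batch element and label,
    # so divide by targets.numel() for a BCE-comparable per-element mean.
    return criterion(logits, targets) / targets.numel()

logits = torch.randn(32, 10)
targets = torch.randint(0, 2, (32, 10)).float()
print(asl_mean(logits, targets).item())
```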