Alibaba-MIIL / ASL

Official PyTorch implementation of the ICCV 2021 paper "Asymmetric Loss For Multi-Label Classification"
MIT License

add eps to probs before applying log to avoid infs #6

Closed michalwols closed 4 years ago

michalwols commented 4 years ago

Without it, large logits lead to infs and NaNs due to torch.log(0).

before:

asl = AsymmetricLoss()
asl(torch.Tensor([[-100]]), torch.Tensor([[0]]))
>> tensor(nan)

after (passing eps=0 reproduces the old behavior, while the default eps gives a finite result):

asl = AsymmetricLoss(eps=0)
asl(torch.Tensor([[-100]]), torch.Tensor([[0]]))
>> tensor(nan)

asl = AsymmetricLoss()
asl(torch.Tensor([[-100]]), torch.Tensor([[0]]))
>> tensor(-0.)
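As a minimal sketch of the idea behind the fix (not the repository's actual loss code; the function name and eps default here are illustrative), the probabilities are clamped away from zero before the log. At a logit of -100, sigmoid underflows to exactly 0 in float32, so log(0) = -inf and the zero target multiplies it into nan:

```python
import torch

def stable_bce_terms(logits, targets, eps=1e-8):
    # Sigmoid probability of the positive class
    probs = torch.sigmoid(logits)
    # Clamp to eps so log never sees exactly 0; otherwise log(0) = -inf,
    # and 0 * -inf produces nan once multiplied by a zero target
    los_pos = targets * torch.log(probs.clamp(min=eps))
    los_neg = (1 - targets) * torch.log((1 - probs).clamp(min=eps))
    return -(los_pos + los_neg)

# sigmoid(-100) underflows to 0.0 in float32; without the clamp this is nan
out = stable_bce_terms(torch.tensor([[-100.0]]), torch.tensor([[0.0]]))
```

With the clamp, the positive term becomes 0 * log(eps) = 0 instead of 0 * (-inf) = nan, matching the tensor(-0.) result shown above.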
mrT23 commented 4 years ago

Thanks @michalwols, we didn't encounter these instabilities in our training, but it is indeed an important protection.