BloodAxe / pytorch-toolbelt

PyTorch extensions for fast R&D prototyping and Kaggle farming

SoftCrossEntropyLoss error #45

Closed: somebodyus closed this issue 4 years ago

somebodyus commented 4 years ago

When I use SoftCrossEntropyLoss, I get the following error:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation

Could anyone help me? By the way, what paper proposed SoftCrossEntropyLoss?
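
Roughly what I am doing (a minimal sketch only; the import path, `smooth_factor` value, and tensor shapes here are illustrative and may not match my actual code):

```python
import torch
from pytorch_toolbelt.losses import SoftCrossEntropyLoss

# Dummy classification-style inputs (shapes are made up for illustration)
logits = torch.randn(4, 10, requires_grad=True)   # (batch, num_classes)
targets = torch.randint(0, 10, (4,))              # integer class indices

criterion = SoftCrossEntropyLoss(smooth_factor=0.1)
loss = criterion(logits, targets)
loss.backward()  # RuntimeError raised here: "... modified by an inplace operation"
```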

BloodAxe commented 4 years ago

Hi! Thanks for raising this issue. Indeed, the built-in PyTorch function had some in-place operations, which were causing this error. It should be fixed in master as of https://github.com/BloodAxe/pytorch-toolbelt/commit/68945c9421be55de7211f59c53a0965c1fde0443
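
To illustrate the general class of problem (a generic sketch, not the exact code from that commit): autograd raises this error when an in-place op overwrites a tensor whose original values are still needed for the backward pass; switching to the out-of-place variant avoids it.

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 10, requires_grad=True)
log_probs = F.log_softmax(logits, dim=-1)

# An in-place op would overwrite log_probs, which autograd saved for the
# log_softmax backward pass, producing exactly this RuntimeError on backward():
# log_probs.masked_fill_(log_probs < -10, -10.0)

# The out-of-place variant leaves the saved tensor intact, so backward() works:
safe = log_probs.masked_fill(log_probs < -10, -10.0)
safe.sum().backward()
print(logits.grad.shape)
```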

somebodyus commented 4 years ago

Thank you for the update. What does "soft" mean here? Could you give a technical reference for this loss?
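
In this context, "soft" most likely refers to soft (label-smoothed) targets rather than hard one-hot labels: the one-hot target distribution is mixed with a uniform distribution before computing cross entropy. Below is a minimal sketch of that idea as a hypothetical standalone helper; the toolbelt's actual implementation and its `smooth_factor` semantics may differ.

```python
import torch
import torch.nn.functional as F

def smoothed_cross_entropy(logits, target, smooth_factor=0.1):
    """Cross entropy against 'soft' targets: (1 - smooth_factor) * one_hot
    mixed with smooth_factor * uniform distribution over classes."""
    num_classes = logits.size(-1)
    log_probs = F.log_softmax(logits, dim=-1)
    # Every class gets smooth_factor / K mass ...
    soft_target = torch.full_like(log_probs, smooth_factor / num_classes)
    # ... and the true class gets the remaining (1 - smooth_factor) on top.
    soft_target.scatter_(
        -1, target.unsqueeze(-1), 1.0 - smooth_factor + smooth_factor / num_classes
    )
    return -(soft_target * log_probs).sum(dim=-1).mean()

logits = torch.randn(4, 10)
target = torch.randint(0, 10, (4,))
print(smoothed_cross_entropy(logits, target))
```

The commonly cited reference for label smoothing in this form is Szegedy et al., "Rethinking the Inception Architecture for Computer Vision" (CVPR 2016).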