Closed: YanqingWu closed this issue 4 years ago
@YanqingWu Without a full case description this issue is not actionable.
Would you please share code that illustrates the problem you are experiencing?
Mon, Feb 24, 2020, 2:56 PM, Yanqing Wu notifications@github.com:
Loss.SoftCrossEntropyLoss() does not work.
@BloodAxe @Konstantin Maksimov
import torch
import pytorch_toolbelt.losses as loss

sfce = loss.SoftCrossEntropyLoss()
outputs = torch.rand(10, 2).float()
targets = torch.randint(0, 2, (10, 1)).view(-1).long()
targets2 = torch.randint(0, 2, (10, 1)).long()
print('test 1: ', sfce(outputs, targets))
print('test 2: ', sfce(outputs, targets2))
For both tests I get the same error, so it should not be a dimension mismatch or some other common problem. Maybe I have the wrong torch version? print(torch.__version__) gives: 1.3.1
Traceback (most recent call last):
File "/usr/local/anaconda3/lib/python3.7/site-packages/IPython/core/interactiveshell.py", line 3326, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "
Thanks for the reproduction code. For now you can use the quick workaround loss.SoftCrossEntropyLoss(ignore_index=-100).
The correct default value will be fixed in the 0.3.1 release.
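A minimal sketch of that workaround applied to the reproduction snippet above; only the ignore_index argument comes from the comment, the tensors are copied from the original example:

import torch
import pytorch_toolbelt.losses as loss

# Workaround suggested above: pass ignore_index explicitly instead of
# relying on the default, which is broken before the 0.3.1 release.
sfce = loss.SoftCrossEntropyLoss(ignore_index=-100)

outputs = torch.rand(10, 2).float()                     # (batch, num_classes) logits
targets = torch.randint(0, 2, (10, 1)).view(-1).long()  # (batch,) class indices
print('workaround: ', sfce(outputs, targets))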