hunkim / PyTorchZeroToAll

Simple PyTorch Tutorials Zero to ALL!
http://bit.ly/PyTorchZeroAll

Question about slide 09 Softmax Classifier #14

Closed: alphadl closed this issue 6 years ago

alphadl commented 6 years ago

As you mentioned on line 28 of "09_01_softmax_loss.py":

```
27  # Input is class, not one-hot
28  Y = Variable(torch.LongTensor([0]), requires_grad=False)
```

However, whichever class I choose, not just class 0 (the first dimension), Loss1 and Loss2 always stay the same.

For example, if I choose class 1 (the second dimension), the losses are still 0.41 and 1.84:

```
Y = Variable(torch.LongTensor([1]), requires_grad=False)
```

```
PyTorch Loss1 = 0.41703000664711
PyTorch Loss2 = 1.840616226196289
```
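
For reference, the target here is a class index because nn.CrossEntropyLoss expects integer class labels rather than one-hot vectors, which is what the "# Input is class, not one-hot" comment in the script refers to. Below is a minimal sketch of that target format; the logits are assumed to match the tutorial's, since they reproduce the 0.417 loss quoted above:

```python
import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()

# Logits for one sample over 3 classes (assumed to match the tutorial's values).
logits = torch.tensor([[2.0, 1.0, 0.1]])

# The target is a class index of shape (batch_size,), not a one-hot vector.
print(loss_fn(logits, torch.tensor([0])))  # tensor(0.4170) for class 0
print(loss_fn(logits, torch.tensor([1])))  # tensor(1.4170) for class 1
```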

alphadl commented 6 years ago

Sorry, I ran this code again and found that it works correctly; different classes do give different losses.
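
For anyone who runs into the same confusion: here is a quick way to verify the behavior with plain tensors (the Variable wrapper is no longer needed in current PyTorch). The two sets of logits are assumed to be the tutorial's, since they reproduce the 0.417 and 1.84 losses quoted above; changing the target class changes both losses:

```python
import torch
import torch.nn as nn

loss = nn.CrossEntropyLoss()

# Assumed logits; they reproduce the 0.417 / 1.84 losses quoted above.
Y_pred1 = torch.tensor([[2.0, 1.0, 0.1]])
Y_pred2 = torch.tensor([[0.5, 2.0, 0.3]])

for cls in range(3):
    Y = torch.tensor([cls])  # target given as a class index
    print(f"class {cls}: "
          f"Loss1 = {loss(Y_pred1, Y).item():.3f}, "
          f"Loss2 = {loss(Y_pred2, Y).item():.3f}")

# Expected output (approximately):
# class 0: Loss1 = 0.417, Loss2 = 1.841
# class 1: Loss1 = 1.417, Loss2 = 0.341
# class 2: Loss1 = 2.317, Loss2 = 2.041
```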