Open zachary-jablons-okcupid opened 2 years ago
Actually, as I was reimplementing it, I came across this line in the PyTorch docs for CrossEntropyLoss:
The input is expected to contain raw, unnormalized scores for each class.
I'm guessing this is my answer: this behavior is equivalent to using from_logits=True in categorical_crossentropy in TensorFlow. Is that where the softmax is effectively occurring?
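For anyone else double-checking this, here is a minimal NumPy sketch (not the repo's code; the function names are my own) of the equivalence in question: cross entropy computed directly on raw logits matches applying softmax explicitly and then taking the negative log of the target probability.

```python
import numpy as np

def log_softmax(z):
    # numerically stable log-softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    return z - np.log(np.exp(z).sum(axis=-1, keepdims=True))

def cross_entropy_from_logits(logits, targets):
    # what nn.CrossEntropyLoss does internally: log-softmax + NLL,
    # i.e. the same behavior as from_logits=True in TensorFlow
    logp = log_softmax(logits)
    return -logp[np.arange(len(targets)), targets].mean()

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.2, 0.3]])
targets = np.array([0, 1])

# explicit softmax first, then negative log of the target probability
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
explicit = -np.log(probs[np.arange(len(targets)), targets]).mean()

assert np.allclose(cross_entropy_from_logits(logits, targets), explicit)
```

So passing raw logits straight into the loss is fine; the softmax is effectively happening inside it.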
Yes, you are correct. The wrapper class does not apply softmax to the output.
Hi, I am wondering the same thing. Just to double check: where did you end up adding the softmax in the code?
Hey Geoff,
I know this is 5-year-old research code, but I'm a bit confused about something. In the accompanying paper, it seems like the output of temperature scaling is meant to go through a softmax before being used.
However, as far as I can tell, this implementation never uses Softmax as part of the temperature scaling operation. I'd expect to see it either in the forward step or when the output is passed into the cross entropy loss here, but instead the cross entropy seems to be given the scaled logits without any softmax applied.
I might just be missing something obvious here of course, but I want to make sure my understanding of how temperature scaling is supposed to work is correct.
Thanks in advance for helping me clarify anything I'm missing here.
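For concreteness, here is roughly the flow I understood from the paper, as a hedged NumPy sketch (none of these names come from the repo, and a grid search stands in for the LBFGS step used there): the temperature is fit by minimizing cross entropy on logits / T, where the softmax happens implicitly inside the loss, and an explicit softmax is only applied at prediction time.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def nll(logits, targets):
    # cross entropy on raw logits; the softmax is applied here,
    # inside the loss, so the scaled logits need no explicit softmax
    p = softmax(logits)
    return -np.log(p[np.arange(len(targets)), targets]).mean()

def fit_temperature(val_logits, val_targets, grid=np.linspace(0.5, 5.0, 200)):
    # choose T minimizing validation NLL of logits / T
    # (a grid-search stand-in for the optimizer step in the repo)
    losses = [nll(val_logits / T, val_targets) for T in grid]
    return grid[int(np.argmin(losses))]

def calibrated_probs(logits, T):
    # the only place an explicit softmax is needed: prediction time
    return softmax(logits / T)
```

On overconfident validation logits (e.g. confidently wrong predictions), the fitted T comes out greater than 1, and calibrated_probs rows still sum to 1, so scaling the logits before the loss and deferring softmax to inference both make sense.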