Closed zhaohui-yang closed 5 years ago
Line 86: log_softmax; Line 94: CrossEntropyLoss
In the MNIST example, you combine CrossEntropyLoss with log_softmax. Why not use NLLLoss + log_softmax instead?
Yes, you are right. It should be either cross_entropy alone (on raw logits) or NLLLoss + log_softmax, not both.