shengyehchen opened this issue 6 years ago
torch.nn.CrossEntropyLoss() combines LogSoftmax and NLLLoss in one single class. Thus, the forward() function of some classification models should end with "return out" instead of "return F.log_softmax(out)".
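A quick pure-Python sketch of the relationship (a minimal illustration without torch; the helper names here are my own, not PyTorch API): cross-entropy on raw logits equals NLL applied to log-softmax, and since log-softmax is idempotent, an extra `F.log_softmax` in forward() is redundant computation rather than a numerical error.

```python
import math

def log_softmax(x):
    # numerically stable log-softmax over a list of scores
    m = max(x)
    log_sum = m + math.log(sum(math.exp(v - m) for v in x))
    return [v - log_sum for v in x]

def nll_loss(log_probs, target):
    # negative log-likelihood of the target class
    return -log_probs[target]

def cross_entropy(logits, target):
    # CrossEntropyLoss == NLLLoss(LogSoftmax(logits))
    return nll_loss(log_softmax(logits), target)

logits = [2.0, 1.0, 0.1]
target = 0

a = cross_entropy(logits, target)               # raw logits (the correct pattern)
b = nll_loss(log_softmax(logits), target)       # explicit two-step equivalent
c = cross_entropy(log_softmax(logits), target)  # double log-softmax: same value, wasted work
print(a, b, c)
```

So the model still trains with the extra `log_softmax`, which is why the bug is easy to miss, but returning raw logits is the correct pattern when the loss is `nn.CrossEntropyLoss`.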