Closed farzadsaffari closed 1 year ago
hello, thanks for your interest! The documentation for torch.nn.CrossEntropyLoss states that this criterion is equivalent to the combination of LogSoftmax and NLLLoss, so we rely on that function in the code instead of adding an explicit Softmax layer.
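To illustrate the equivalence the documentation describes, here is a minimal pure-Python sketch (no torch dependency; the function names `log_softmax`, `nll_loss`, and `cross_entropy` are just illustrative, not the library's API) showing that NLLLoss applied to LogSoftmax outputs matches a direct cross-entropy computation on raw logits:

```python
import math

def log_softmax(logits):
    # Numerically stable log-softmax: subtract the max before exponentiating.
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - log_sum for x in logits]

def nll_loss(log_probs, target):
    # Negative log-likelihood of the target class.
    return -log_probs[target]

def cross_entropy(logits, target):
    # Direct cross-entropy on raw logits: -log(softmax(logits)[target]).
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(x - m) for x in logits))
    return -(logits[target] - log_sum)

logits = [2.0, 1.0, 0.1]
target = 0
via_logsoftmax = nll_loss(log_softmax(logits), target)
direct = cross_entropy(logits, target)
print(abs(via_logsoftmax - direct) < 1e-12)  # True: the two formulations agree
```

This is why the model's last layer can output raw logits during training; if you want probabilities at inference time, you apply a softmax to the logits yourself.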
Ok yes, that works as well. Thanks.
Hi
Thanks for sharing your implementation. Really appreciate it. Even though it is stated that a Softmax activation function has been used in the classification part, in your implementation you have not applied this function in the last layer of your model. Could you clarify whether there is a reason for this, or am I just missing something? Thanks.