eeyhsong / EEG-Conformer

EEG Transformer 2.0. i. Convolutional Transformer for EEG Decoding. ii. Novel visualization - Class Activation Topography.
GNU General Public License v3.0

Missing SoftMax function #10

Closed farzadsaffari closed 1 year ago

farzadsaffari commented 1 year ago

Hi

Thanks for sharing your implementation. I really appreciate it. Although the paper states that a SoftMax activation function is used in the classification part, your implementation does not apply this function in the last layer of the model. Could you clarify whether there is a reason for this, or am I just missing something? Thanks.

eeyhsong commented 1 year ago

Hello, thanks for your interest! The documentation for torch.nn.CrossEntropyLoss states that this criterion is equivalent to applying LogSoftmax followed by NLLLoss. So we use that loss function in the code, and no explicit SoftMax layer is needed in the model.
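The equivalence the maintainer refers to can be checked numerically. This is a minimal sketch (not code from the repository) comparing CrossEntropyLoss on raw logits against LogSoftmax followed by NLLLoss; the tensor shapes here are arbitrary examples:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 3)            # raw model outputs, no SoftMax applied
targets = torch.tensor([0, 2, 1, 2])  # class indices

# CrossEntropyLoss applied directly to the logits
ce = nn.CrossEntropyLoss()(logits, targets)

# Equivalent: LogSoftmax over the class dimension, then NLLLoss
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), targets)

print(torch.allclose(ce, nll))  # True
```

Because the loss already incorporates the LogSoftmax, adding a SoftMax layer to the model would only be needed at inference time, and even then only if calibrated probabilities (rather than the argmax class) are required.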

farzadsaffari commented 1 year ago

OK, yes, that works as well. Thanks.