In lines 136 and 143 of the file '3dcnn.py', the 'softmax' activation is used in the convolutional layers instead of the commonly used 'relu' or 'elu'. Is it intended as an alternative to batch normalization, or is there another specific reason for using 'softmax' after convolutional layers?
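To illustrate what I mean, here is a minimal numpy sketch (not the repo's code; the 4-channel feature vector is made up) contrasting the two activations. Softmax normalizes the channel activations at each position to sum to 1, which discards magnitude information, whereas relu keeps magnitudes and only zeroes out negatives:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# hypothetical feature-map values: 4 channels at one spatial position
features = np.array([2.0, -1.0, 0.5, 0.1])

relu_out = np.maximum(features, 0.0)  # keeps magnitudes, zeroes negatives
softmax_out = softmax(features)       # forces channel outputs to sum to 1

print(relu_out)
print(softmax_out, softmax_out.sum())
```

So softmax produces a probability-like distribution over channels rather than standardizing activation statistics the way batch normalization does, which is why the choice in those two layers surprised me.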