kcct-fujimotolab / 3DCNN

3D convolutional neural network for video classification

Why is 'softmax' activation used in convolutional layers? #6

Closed sheelabhadra closed 5 years ago

sheelabhadra commented 6 years ago

In lines 136 and 143 of the file '3dcnn.py', 'softmax' activation is used in the convolutional layers instead of the commonly used 'relu' or 'elu'. Is it an alternative to batch normalization, or is there another specific reason for using 'softmax' after the convolutional layers?
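
For context, a minimal Keras sketch of the pattern being asked about is shown below. The filter counts, kernel sizes, and input shape are illustrative assumptions, not the actual values from '3dcnn.py'; the point is only to contrast a 'softmax'-activated Conv3D layer with the conventional 'relu' choice.

```python
# Illustrative sketch only -- layer sizes and input shape are assumed,
# not copied from 3dcnn.py.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv3D, MaxPooling3D, Flatten, Dense

model = Sequential([
    # With activation='softmax', each spatial/temporal position's feature
    # vector is normalized across the channel axis (the 32 feature maps
    # sum to 1 at every voxel), rather than passed through an unbounded
    # nonlinearity.
    Conv3D(32, (3, 3, 3), activation='softmax', padding='same',
           input_shape=(16, 32, 32, 3)),
    MaxPooling3D(pool_size=(2, 2, 2)),
    # The more common choice the question refers to would be:
    # Conv3D(32, (3, 3, 3), activation='relu', padding='same'),
    Flatten(),
    # Softmax is standard at the output layer for classification.
    Dense(10, activation='softmax'),
])
model.summary()
```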