Currently, the activation function is part of FullyConnectedLayer. However, in many neural network models such as AlexNet and GoogLeNet, the ReLU activation function is applied to the outputs of both convolutional layers and fully connected layers.
We could add the activation-function feature to those other layers as well. Nonetheless, to reduce duplication and stay more general, it is better to implement activation functions as separate layers, as Caffe and Apache SINGA do.
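To illustrate the idea, here is a minimal sketch of a standalone activation layer; the class name `ReLULayer` and the `forward`/`backward` method names are assumptions for illustration, not the actual API of Caffe or SINGA:

```python
class ReLULayer:
    """Hypothetical standalone activation layer.

    Because it is a separate layer, it can be placed after any
    layer (convolutional, fully connected, ...) without each of
    those layers reimplementing the activation logic.
    """

    def forward(self, inputs):
        # Remember where the input was positive for the backward pass.
        self.mask = [x > 0 for x in inputs]
        return [x if m else 0.0 for x, m in zip(inputs, self.mask)]

    def backward(self, grads):
        # The gradient flows through only where the input was positive.
        return [g if m else 0.0 for g, m in zip(grads, self.mask)]
```

Keeping the activation as its own layer means the network is just a sequence of layers, and swapping ReLU for another activation is a one-line change in the model definition rather than an edit to every layer type.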