snuspl / dolphin


Separate activation function from fully connected layer #144

Closed: beomyeol closed this issue 8 years ago

beomyeol commented 8 years ago

Currently, the activation function is part of FullyConnectedLayer. However, in many neural network models, such as AlexNet and GoogLeNet, the ReLU activation function is applied to the outputs of both convolutional and fully connected layers. We could add an activation-function feature to the other layers as well, but to avoid duplication and stay more general, it would be better to implement activation functions as separate layers, as Caffe and Apache SINGA do.
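
For illustration, here is a minimal sketch of what a standalone activation layer could look like. The `Layer` interface, its method names, and the `ReluLayer` class are hypothetical and do not reflect Dolphin's actual code; the point is only that the activation logic becomes a layer that can be stacked after any other layer.

```java
// Hypothetical minimal layer interface, for illustration only;
// Dolphin's real interface may differ.
interface Layer {
  float[] feedForward(float[] input);
  float[] backPropagate(float[] input, float[] gradOutput);
}

// ReLU as its own layer: it can follow a fully connected layer
// or a convolutional layer alike, so the code is written once.
final class ReluLayer implements Layer {
  @Override
  public float[] feedForward(final float[] input) {
    final float[] output = new float[input.length];
    for (int i = 0; i < input.length; i++) {
      output[i] = Math.max(0.0f, input[i]);
    }
    return output;
  }

  @Override
  public float[] backPropagate(final float[] input, final float[] gradOutput) {
    // The derivative of ReLU is 1 where the input was positive, 0 elsewhere.
    final float[] gradInput = new float[input.length];
    for (int i = 0; i < input.length; i++) {
      gradInput[i] = input[i] > 0.0f ? gradOutput[i] : 0.0f;
    }
    return gradInput;
  }
}
```

With a design like this, FullyConnectedLayer no longer needs to know about activation functions at all; a model is just a sequence of layers, e.g. fully connected, then ReLU, then convolutional, then ReLU, which matches how Caffe and Apache SINGA compose their networks.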