I found a detail in the paper: "• ReLU (Nair & Hinton, 2010) is applied to activations from squeeze and expand layers." But I can't find any use of ReLU in squeezenet.py. Is there a reason ReLU isn't used?
It actually is using ReLU; it's just not explicit. ReLU is the default activation function for tf.contrib.layers.conv2d.
https://www.tensorflow.org/api_docs/python/tf/contrib/layers/conv2d
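To make this concrete, here is a minimal TF 1.x sketch (the shapes and layer names are illustrative, not taken from squeezenet.py) showing that `activation_fn` defaults to `tf.nn.relu`, so ReLU is applied even when it never appears in the code:

```python
# Minimal sketch, assuming TensorFlow 1.x (where tf.contrib exists).
import tensorflow as tf

inputs = tf.placeholder(tf.float32, [None, 55, 55, 96])  # illustrative shape

# These two calls are equivalent: activation_fn defaults to tf.nn.relu,
# so ReLU is applied to the conv output without being written explicitly.
squeeze = tf.contrib.layers.conv2d(inputs, num_outputs=16, kernel_size=1)
squeeze_explicit = tf.contrib.layers.conv2d(
    inputs, num_outputs=16, kernel_size=1, activation_fn=tf.nn.relu)

# To get a linear convolution (no activation), you would have to opt out:
linear = tf.contrib.layers.conv2d(
    inputs, num_outputs=16, kernel_size=1, activation_fn=None)
```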
I got it, thanks!