vonclites / squeezenet

Tensorflow implementation of SqueezeNet.
MIT License
129 stars 63 forks

No Relu Layer in squeezenet.py ? #4

Closed Hatsunespica closed 6 years ago

Hatsunespica commented 6 years ago

I found a detail in the paper: "• ReLU (Nair & Hinton, 2010) is applied to activations from squeeze and expand layers." But I can't find any use of ReLU in squeezenet.py. Is there a reason ReLU is omitted?

vonclites commented 6 years ago

It actually is using ReLU; it's just not explicit. ReLU is the default activation function for tf.contrib.layers.conv2d.

https://www.tensorflow.org/api_docs/python/tf/contrib/layers/conv2d
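To illustrate the point about a default activation, here is a minimal NumPy sketch (not TF's actual code; `conv2d_like` is a hypothetical stand-in reduced to a 1x1 convolution) showing how a layer function whose `activation_fn` parameter defaults to ReLU rectifies its output even when the caller never mentions ReLU:

```python
import numpy as np

def relu(x):
    # Rectified linear unit: max(x, 0) elementwise
    return np.maximum(x, 0.0)

def conv2d_like(x, weight, activation_fn=relu):
    """Toy stand-in for a conv layer whose activation defaults to ReLU,
    mirroring the convention described for tf.contrib.layers.conv2d.
    Simplified to a 1x1 convolution over channels (matrix multiply)."""
    y = x @ weight
    # Callers who omit activation_fn still get rectified outputs;
    # passing activation_fn=None yields the raw linear response.
    return activation_fn(y) if activation_fn is not None else y

x = np.array([[1.0, -2.0]])
w = np.array([[1.0], [1.0]])
# conv2d_like(x, w) applies ReLU implicitly, just like the TF default.
```

So calling the layer without any `activation_fn` argument, as squeezenet.py does, is equivalent to passing ReLU explicitly.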

Hatsunespica commented 6 years ago

I got it, thanks!