Closed ArdaEfeOkay closed 6 years ago
Perhaps it should be more explicit, especially since Google seems to change the defaults in TF from time to time, but it actually does include those relus: all the contrib layers use tf.nn.relu as their activation by default.
tf.contrib.layers.fully_connected(
inputs,
num_outputs,
activation_fn=tf.nn.relu,
normalizer_fn=None,
normalizer_params=None,
weights_initializer=initializers.xavier_initializer(),
weights_regularizer=None,
biases_initializer=tf.zeros_initializer(),
biases_regularizer=None,
reuse=None,
variables_collections=None,
outputs_collections=None,
trainable=True,
scope=None
)
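To illustrate the point above, here is a minimal NumPy sketch (not the actual TF code; the function and names are hypothetical) of a dense layer whose activation defaults to relu, mirroring the activation_fn=tf.nn.relu default in the contrib signature:

```python
import numpy as np

def relu(x):
    # Elementwise max(x, 0), like tf.nn.relu.
    return np.maximum(x, 0.0)

def fully_connected(inputs, weights, biases, activation_fn=relu):
    """Toy dense layer: like tf.contrib.layers.fully_connected,
    the activation defaults to relu unless activation_fn=None."""
    outputs = inputs @ weights + biases
    return activation_fn(outputs) if activation_fn is not None else outputs

x = np.array([[1.0, -2.0]])
w = np.array([[1.0], [1.0]])
b = np.array([0.5])

# Pre-activation is 1*1 + (-2)*1 + 0.5 = -0.5.
print(fully_connected(x, w, b))                      # default relu clamps to [[0.]]
print(fully_connected(x, w, b, activation_fn=None))  # raw linear output [[-0.5]]
```

So even though no relu appears explicitly at the call site, one is applied unless activation_fn is overridden, which is why the relus are not "missing" from the implementation.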
Thanks!
While reading "Age and Gender Classification using Convolutional Neural Networks", I noticed that relu6 and relu7 are missing from your implementation. Is there any reason not to use them?