A slim TensorFlow wrapper that provides syntactic sugar for tensor variables. This library will be helpful for practical deep learning researchers, not beginners.
Have you looked into using `fused_batch_norm`? As I understand it, it is identical to `batch_normalization` but faster. However, I don't understand why there would be two operators for the same thing.
`tf.contrib.layers.batch_norm(fused=True)` appears to be using it.
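For context, here is a minimal sketch of the two call sites, assuming TensorFlow 1.x (where `tf.contrib` still exists); the NHWC input shape is a made-up example:

```python
import tensorflow as tf

# Hypothetical NHWC feature map; shape chosen only for illustration.
x = tf.placeholder(tf.float32, shape=[None, 32, 32, 64])

# Unfused path: batch norm composed from several small ops
# (moments, rsqrt, multiply, add).
y_unfused = tf.contrib.layers.batch_norm(x, fused=False, is_training=True)

# Fused path: dispatches to a single fused kernel
# (tf.nn.fused_batch_norm under the hood), usually faster on GPU.
y_fused = tf.contrib.layers.batch_norm(x, fused=True, is_training=True)
```

With `fused=True` the layer routes through `tf.nn.fused_batch_norm`, which collapses the normalize, scale, and shift steps into one kernel rather than several small ops; that avoidance of intermediate tensors is where the speedup comes from, and the two operators presumably coexist because the unfused composition predates the fused kernel.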