cbockman opened 6 years ago
Batch normalization gets applied here:
https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/layers/common_layers.py#L632
Per https://www.tensorflow.org/versions/master/api_docs/python/tf/layers/batch_normalization, `training=True` needs to be set while training (unless something really funky is happening upstream of `apply_norm`?).
This is a relatively easy fix; I'm happy to open a PR, assuming I'm not misunderstanding.
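To illustrate why the flag matters, here is a minimal pure-Python sketch of batch normalization (not the TensorFlow implementation): in training mode the layer normalizes with the current batch's statistics and updates the moving averages, while in inference mode it normalizes with the moving averages. If `training` is never set to `True`, the moving statistics stay at their initial values (mean 0, variance 1), so inference normalizes with the wrong statistics.

```python
import math

# Hypothetical minimal batch-norm forward pass, for illustration only.
def batch_norm(x, moving_mean, moving_var, training, momentum=0.9, eps=1e-3):
    if training:
        batch_mean = sum(x) / len(x)
        batch_var = sum((v - batch_mean) ** 2 for v in x) / len(x)
        # Update the moving statistics (returned so the caller can store them).
        moving_mean = momentum * moving_mean + (1 - momentum) * batch_mean
        moving_var = momentum * moving_var + (1 - momentum) * batch_var
        mean, var = batch_mean, batch_var
    else:
        # Inference: rely on the statistics accumulated during training.
        mean, var = moving_mean, moving_var
    y = [(v - mean) / math.sqrt(var + eps) for v in x]
    return y, moving_mean, moving_var

x = [10.0, 12.0, 14.0]

# "Training" with training=False (the default): moving stats never move.
_, mm_bug, mv_bug = batch_norm(x, 0.0, 1.0, training=False)

# Training with training=True: moving stats drift toward the data.
_, mm_ok, mv_ok = batch_norm(x, 0.0, 1.0, training=True)
```

With `training=False`, `mm_bug` and `mv_bug` remain at their initial 0.0 and 1.0; with `training=True` they move toward the batch mean (12.0) and variance.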
TensorFlow and tensor2tensor versions: t2t master
Would welcome a PR adding an `is_training` kwarg to `apply_norm` and updating the call sites around the codebase.
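A rough sketch of what that change might look like, assuming an `apply_norm` shaped roughly like the one in `common_layers.py` (the real signature and the non-batch norm branches are elided); `batch_normalization` below is a pure-Python stand-in for `tf.layers.batch_normalization`, which accepts the `training` argument the issue is about:

```python
# Stand-in for tf.layers.batch_normalization, for illustration only:
# it just reports which statistics would be used.
def batch_normalization(x, training=False, epsilon=1e-3):
    return ("batch_stats" if training else "moving_stats", x)

def apply_norm(x, norm_type, depth, epsilon, is_training=True):
    # Sketch of the suggested change: thread the new is_training kwarg
    # through to batch normalization. Other norm types (e.g. layer norm)
    # would ignore it; those branches are omitted here.
    if norm_type == "batch":
        return batch_normalization(x, training=is_training, epsilon=epsilon)
    return ("no_norm", x)
```

Callers on the training path would then pass `is_training=True` so batch statistics are used and the moving averages get updated.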