Open · etragas-fathom opened this issue 5 years ago
Hi, I also hit this problem. A related issue: when I set xla_jit_level=1, training becomes much slower, roughly 5x slower than the baseline, even though according to the official TensorFlow documentation setting xla_jit_level=1 should speed up training.
Background
I saw no speedup after running some experiments with the xla_compile flag set to True.
Digging deeper, it seems the flag does nothing.
The flag is defined here: https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/bin/t2t_trainer.py#L57
And is used uniquely here: https://github.com/tensorflow/tensor2tensor/blob/28adf2690c551ef0f570d41bef2019d9c502ec7e/tensor2tensor/bin/t2t_trainer.py#L196
That takes us to create_experiment_fn, which just calls create_experiment under the hood.
In create_experiment we see use_xla being passed to create_estimator here, but create_estimator does nothing with use_xla; it simply deletes it.
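A minimal sketch of the pattern described above. The names match the t2t source, but the body is a simplified illustration of the bug, not the actual implementation: a keyword argument that is accepted and then deleted without ever being read is a silent no-op.

```python
def create_estimator(run_config, hparams, use_xla=False, **kwargs):
    """Simplified illustration of the reported bug: use_xla is
    accepted but never read, so passing --xla_compile=True has no
    effect on the estimator that gets built."""
    del use_xla  # deleted without being used -- the flag is a no-op
    return {"run_config": run_config, "hparams": hparams}

# Both calls produce an identical result, XLA flag or not.
est_a = create_estimator("cfg", "hp", use_xla=True)
est_b = create_estimator("cfg", "hp", use_xla=False)
print(est_a == est_b)  # → True
```

Under TF 1.x, actually honoring the flag would typically mean setting graph_options.optimizer_options.global_jit_level on the session config passed to the estimator, which the code path above never does.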