Closed: anatoly-khomenko closed this issue 4 years ago
I have pushed some changes to add an mtf_model.export call. However, I believe I will need to update both the mesh and t5 packages to get everything working for you. I'll try to test this today and update the packages.
@adarob
Thanks for the great work!
I was using t5 version 0.2.0, and I tried to deploy an exported sample model to GCP ML Engine (single-core CPU), but with no luck. The error was:
Create Version failed. Model validation failed: SavedModel must contain exactly one metagraph with tag: serve For more information on how to export Tensorflow SavedModel, see https://www.tensorflow.org/api_docs/python/tf/saved_model.
I used saved_model_cli to inspect the exported model, and I guess the problem is the second 'serve'-tagged metagraph, the one for TPU.
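For reference, the inspection can be done like this (the export directory below is a placeholder for wherever your SavedModel was actually written):

```shell
# List every metagraph, its tags, and its signatures in the exported model.
# /path/to/export/1589323000 is a placeholder, not a real path.
saved_model_cli show --dir /path/to/export/1589323000 --all
```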
Another potential problem is that ML Engine seems to require the outer dimension to be unknown to enable batching, while the t5 export uses a fixed integer batch size.
Last but not least, the SavedModel works great on my local machine, but it broke after I tried to quantize it by following the linked instructions.
So, my questions are:
Your help will be highly appreciated. Regards.
Thanks for sharing these details. Unfortunately, as researchers, we don't have time to test all of these applications, so it's very helpful for users like yourself to both report and help us fix bugs like this.
Adding @toponado who may be able to help with the tpu tag issue. I wonder if there is a way to remove the tag after export?
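For illustration, removing the tag after export would amount to a filter like the one below. This is only a sketch with stand-in objects; in practice you would parse saved_model.pb into the SavedModel proto (tensorflow.core.protobuf.saved_model_pb2), apply the same filter to its meta_graphs field, and serialize the result back out.

```python
from types import SimpleNamespace

def keep_only_serve(meta_graphs):
    """Keep only metagraphs whose tag set is exactly {'serve'}."""
    return [mg for mg in meta_graphs if set(mg.tags) == {"serve"}]

# Stand-ins for the two metagraphs the export produced: the CPU graph
# tagged ['serve'] and the extra TPU graph tagged ['serve', 'tpu'].
cpu_graph = SimpleNamespace(tags=["serve"])
tpu_graph = SimpleNamespace(tags=["serve", "tpu"])

kept = keep_only_serve([cpu_graph, tpu_graph])
# Only the CPU metagraph survives, which is what ML Engine requires.
```

This is a post-hoc workaround; disabling the TPU export at the source (as discussed below in the thread) is the cleaner fix.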
I actually don't believe the dimensionality is an issue. The input placeholder is shaped [None], and the batch is automatically padded as long as it is no bigger than what you set as batch_size during export. We can come back and address this if it turns out to be a problem.
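The padding behavior described above can be sketched as follows. This is a toy stand-in for what the exported serving graph does internally; pad_batch and pad_value are made-up names, not part of the t5 or mesh APIs.

```python
def pad_batch(examples, batch_size, pad_value=""):
    """Pad a variable-size request batch up to the fixed export batch_size."""
    if len(examples) > batch_size:
        raise ValueError(
            f"Got {len(examples)} examples, but the model was exported "
            f"with batch_size={batch_size}"
        )
    # The [None]-shaped placeholder accepts any number of inputs; dummy
    # rows are then appended so downstream ops see a full fixed-size batch.
    return examples + [pad_value] * (batch_size - len(examples))

padded = pad_batch(["translate English to German: Hello"], batch_size=4)
# -> 4 entries: the real example plus 3 empty pad rows, which are
# discarded after inference.
```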
I just took a quick look, and it seems we only need to disable export_to_tpu in the TPUEstimator constructor: https://github.com/tensorflow/estimator/blob/08dd1e6ca94248691dfe00aadafc65e8b875e44f/tensorflow_estimator/python/estimator/tpu/tpu_estimator.py#L2665
However, this appears to be already happening here: https://github.com/tensorflow/mesh/blob/4c656e2f249bcacbda684c4ff0a860b86f372a2c/mesh_tensorflow/transformer/utils.py#L1082
I'm not sure why the TPU metagraph is still being exported...
Ah, it's because we call export_estimator_savedmodel (https://github.com/tensorflow/estimator/blob/08dd1e6ca94248691dfe00aadafc65e8b875e44f/tensorflow_estimator/python/estimator/tpu/tpu_estimator.py#L4219), which creates a new TPUEstimator with export_to_tpu set to True by default. This should be fairly easy to fix. I'll check it out tomorrow.
I have fine-tuned the 3B model in the Colab notebook provided in this repo as notebooks/t5-trivia.ipynb. After fine-tuning as recommended in the notebook, I would like to export my model as a SavedModel to be served on ML Cloud.
To do this, I placed the following code fragment as the last cell in the notebook (so the model is already created):
I get the following error when executing this code:
The complete stack trace is here: stack-trace-SavedModel.txt
I assume the error means that this model is only suitable for TPU inference and is not supposed to work on ML Cloud (where only CPU instances are available).
Is it feasible to serve this model on ML Cloud, and if so, what steps should I follow to accomplish that?
Thank you!