Closed MrMwalker2 closed 2 years ago
@pindinagesh
Thanks for your reply. The later one is exactly what I was trying and it generates the error mentioned. The workaround isn't exactly what I'm looking for.
Unfortunately, at this time we cannot provide more guidance than what was mentioned in https://github.com/tensorflow/hub/issues/845#issuecomment-1067822239. Feel free to re-open it if there are more insights or if you've found a workaround that works for you.
What happened?
Hi there,
I'm trying to retrain the pre-trained BERT model from TF Hub. I'm using the following preprocessor: https://tfhub.dev/tensorflow/bert_multi_cased_preprocess/3 along with the multilingual encoder https://tfhub.dev/tensorflow/bert_multi_cased_L-12_H-768_A-12/4
I'm running tf 2.5.0 and the latest version of tf hub.
When I run the code as shown on the model page under basic usage, I can successfully save the model as .h5 to use for serving.
However, when I try to increase the seq_length (see the relevant code below), I'm no longer able to save the model. How can I increase the seq_length and still save the model as .h5? I also tried calling model.save('model'), which saves it in the SavedModel (.pb) format, intending to load it again and re-save it as .h5, but I ran into another issue while loading the saved model.
Relevant code
Relevant log output
tensorflow_hub Version
0.12.0 (latest stable release)
TensorFlow Version
other (2.5.0)
Other libraries
No response
Python Version
3.x
OS
Linux