Closed sayakpaul closed 2 years ago
They do not.
Running the Colabs mentioned in https://blog.tensorflow.org/2020/12/making-bert-easier-with-preprocessing-models-from-tensorflow-hub.html on CPU, GPU, and TPU (for the advanced example) works. The issue might have been resolved. If you still see it, please let us know.
There seems to be a caching issue with the BERT models (both the preprocessors and the encoders) mentioned in this blog post: https://blog.tensorflow.org/2020/12/making-bert-easier-with-preprocessing-models-from-tensorflow-hub.html.
To reproduce the issue, run the code from any of the tutorials mentioned in the above blog post.
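For reference, a minimal sketch of the kind of loading code those tutorials run (the model handles and versions below follow the blog post, so treat the exact versions as assumptions):

```python
import tensorflow as tf
import tensorflow_hub as hub

# Load the preprocessing model and the matching BERT encoder from TF Hub.
preprocessor = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
encoder = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4",
    trainable=True)

# Tokenize raw strings and run them through the encoder.
sentences = tf.constant(["TF Hub makes BERT easy to use."])
encoder_inputs = preprocessor(sentences)   # input_word_ids, input_mask, input_type_ids
outputs = encoder(encoder_inputs)          # pooled_output, sequence_output, ...
print(outputs["pooled_output"].shape)      # (1, 768)
```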
The error is as follows:
The issue seems to go away after rerunning the code from scratch. Is there a better workaround?
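One thing that might help rule out a stale cache (an assumption on my part, not something confirmed in this thread) is to point TF Hub at a fresh cache directory via the TFHUB_CACHE_DIR environment variable before loading the models; the path below is just an example:

```python
import os

# Use a fresh cache directory so previously downloaded (possibly corrupted)
# model artifacts are not reused. The path is hypothetical.
os.environ["TFHUB_CACHE_DIR"] = "/tmp/fresh_tfhub_cache"

import tensorflow_hub as hub

preprocessor = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
```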