Open · fernandoblalves opened this issue 4 years ago
Ah .. was just thinking about posting on this exact same thing :-)
I'd like to load a locally saved TF Hub model, e.g., USE (https://tfhub.dev/google/universal-sentence-encoder/4). I searched the API for quite some time but couldn't find anything. Is this possible?
Something to the effect of:
```python
>>> import tensorflow as tf
>>> model = tf.saved_model.load(r"/path/to/use_4")
```
It seems like the saved_model class and related utilities are available only in Python and Java; maybe it is written as a utility in those languages. I couldn't find a reference to this in the C++ API. Considering that this library is a wrapper over C++, it is somewhat unlikely to have this functionality already. It is probably still possible to write such a utility in this library as well, keeping the same general class names, etc.
Here are two implementations in Scala and Java that go through the Java library:
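For reference, a minimal sketch of that general approach, loading a SavedModel from Scala via the TensorFlow Java library's `org.tensorflow.SavedModelBundle` (not the implementations referenced above). The model path, tag, and the input/output op names are placeholders; the real signature names can be inspected with `saved_model_cli show --dir /path/to/use_4 --all`.

```scala
// Sketch only: loads a SavedModel through the legacy TensorFlow Java API
// (org.tensorflow:tensorflow on the classpath) and runs one inference.
// "/path/to/use_4" and the op names below are placeholders.
import org.tensorflow.{SavedModelBundle, Tensor}

object LoadSavedModel {
  def main(args: Array[String]): Unit = {
    // Load the exported directory with the tag it was saved under ("serve").
    val bundle = SavedModelBundle.load("/path/to/use_4", "serve")

    // A 1-D string tensor with one sentence (byte[][] maps to DT_STRING).
    val input: Tensor[_] = Tensor.create(Array("Hello world".getBytes("UTF-8")))

    // Feed/fetch names must match the model's serving signature
    // (check them with `saved_model_cli show --dir /path/to/use_4 --all`).
    val output = bundle
      .session()
      .runner()
      .feed("serving_default_inputs", input)   // placeholder input op name
      .fetch("StatefulPartitionedCall")        // placeholder output op name
      .run()
      .get(0)

    println(s"Embedding tensor shape: ${output.shape().mkString("[", ", ", "]")}")

    output.close()
    input.close()
    bundle.close()
  }
}
```

Whether a given TF2 SavedModel such as USE v4 actually runs this way depends on the ops it uses and on the libtensorflow version available on the classpath.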
Waiting for @eaplatanios to respond if possible.
Is it possible to load models generated in "main" Python TensorFlow into tensorflow_scala? I can't find anywhere how to do it.
Thanks