Closed samsends closed 5 years ago
+1 Making hub modules consumable by Tensorflow Serving would be very helpful.
Any workarounds for this currently? I want to serve a hub module using Tensorflow Serving.
I have not tested this, but I suspect you could load the module into an empty graph and then export it with SavedModelBuilder.
We could build an automated tool. For now one has to do it manually; for example:
```python
import tensorflow as tf
import tensorflow_hub as hub

with tf.Graph().as_default():
    module = hub.Module("http://tfhub.dev/google/universal-sentence-encoder/2")
    text = tf.placeholder(tf.string, [None])
    embedding = module(text)
    init_op = tf.group([tf.global_variables_initializer(), tf.tables_initializer()])
    with tf.Session() as session:
        session.run(init_op)
        tf.saved_model.simple_save(
            session,
            "/tmp/serving_saved_model",
            inputs={"text": text},
            outputs={"embedding": embedding},
        )
```
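Once a SavedModel like the one above is loaded into TensorFlow Serving, clients typically call it over the REST predict API. A minimal sketch of building the request body using only the standard library (the model name and URL path are assumptions; the "text" input key and "serving_default" signature mirror the export above, so adjust them to your own deployment):

```python
import json

def build_predict_request(texts):
    """Build the JSON body for a TensorFlow Serving REST predict call.

    The "text" input key matches the signature exported above. The body
    would be POSTed to a URL like http://host:8501/v1/models/<name>:predict
    (host and model name depend on your deployment).
    """
    return json.dumps({
        "signature_name": "serving_default",
        "instances": [{"text": t} for t in texts],
    })

body = build_predict_request(["hello world"])
print(body)
```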
Each module differs slightly from others in input/output names; additionally, each serving use case might have different requirements (e.g. raw features in vs. serialized tf.Example protos). Having users create the graph they want to serve (e.g. as done above) seems more flexible than requiring users to guess what Servo config to use and/or modify the client side each time they change the module being served.
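To illustrate the serialized-tf.Example flavor mentioned above, here is a hedged sketch of the same export with a different serving input. The hub module call is replaced by a trivial string-length op so the sketch runs offline (a real exporter would compute `module(features["text"])` instead), and `tf.compat.v1` is used so it also runs under TF2; paths and key names are illustrative:

```python
import os
import tempfile

import tensorflow as tf

tf1 = tf.compat.v1  # this thread predates TF2, so use the v1 graph API explicitly

export_dir = os.path.join(tempfile.mkdtemp(), "example_proto_model")

with tf1.Graph().as_default():
    # Serving input: a batch of serialized tf.Example protos rather than raw strings.
    serialized = tf1.placeholder(tf.string, [None])
    features = tf1.parse_example(
        serialized, {"text": tf.io.FixedLenFeature([], tf.string)})
    # Stand-in for the hub module call, kept trivial so the example is
    # self-contained and needs no network access.
    embedding = tf1.strings.length(features["text"])
    with tf1.Session() as session:
        session.run(tf1.tables_initializer())
        tf1.saved_model.simple_save(
            session,
            export_dir,
            inputs={"examples": serialized},
            outputs={"embedding": embedding},
        )
print(export_dir)
```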
Closing as this is now obsolete.
In TF2, users should create reusable SavedModels with tf.saved_model.save().
@andresusanopinto quick question: does this mean the pre-trained models on TensorFlow Hub won't have the serving_default signature by default, and the user needs to re-export them? Thanks
A TF2 SavedModel can both have signatures for deployment to TensorFlow Serving and tf.functions for reuse in a Python TensorFlow program.
See https://www.tensorflow.org/hub/tf2_saved_model#advanced_topic_what_to_expect_from_the_savedmodel_after_loading and https://www.tensorflow.org/guide/saved_model
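As a concrete toy illustration of that combination, here is a sketch assuming nothing beyond stock TF2 (the model, names, and paths are invented for the example, not taken from TF Hub): the saved object keeps a reusable `@tf.function __call__` for Python callers and also carries an explicit serving signature for TensorFlow Serving.

```python
import os
import tempfile

import tensorflow as tf  # TF2


class Doubler(tf.Module):
    """Toy stand-in for a reusable SavedModel (not a real hub model)."""

    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return {"doubled": 2.0 * x}


model = Doubler()
export_dir = os.path.join(tempfile.mkdtemp(), "doubler")

# Attach an explicit serving signature so TF Serving can use the model as-is;
# the tf.function __call__ remains reusable from Python after loading.
tf.saved_model.save(model, export_dir,
                    signatures={"serving_default": model.__call__})

loaded = tf.saved_model.load(export_dir)
# Reuse path: call the restored tf.function directly.
reuse = loaded(tf.constant([1.0, 2.0]))["doubled"]
# Serving path: imported signatures accept keyword arguments only.
serve = loaded.signatures["serving_default"](x=tf.constant([1.0, 2.0]))["doubled"]
print(reuse.numpy(), serve.numpy())
```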
@arnoegw thanks for the pointer. I noticed some inconsistency among TF2 models: some have a serving_default signature while others do not. For example, https://tfhub.dev/tensorflow/resnet_50/classification/1 has a serving_default signature, while others expose no default signature at all. My understanding is that if serving_default is not present, the model cannot be served as-is in TF Serving. Should all TF2 models have this signature? Let me know if I should open a new issue to track this.
@chenliu0831, there's nothing wrong with those examples. This one has a @tf.function __call__ but no SignatureDefs: https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/feature_vector/4. This one has a @tf.function __call__ and a serving signature: https://tfhub.dev/tensorflow/resnet_50/classification/1
@arnoegw Ah, thanks for clarifying. I gave a bad example for the second case, since that one is a feature vector.
I spot-checked a few more models on TF Hub using the TF2 filter with classification variants, and it looks like their signature maps are empty as well:
On the detail pages of the above models, I think they all show up as "TF2.0 Saved Model" format.
It would be useful to export modules in a format that can be consumed by TensorFlow Serving. Servables would increase code reuse and further enable distributed workloads.
Example Usage (note passing "serve" instead of "train"):