tensorflow / hub

A library for transfer learning by reusing parts of TensorFlow models.
https://tensorflow.org/hub
Apache License 2.0

Enable 'serve' tag-set in create_module_spec function call #19

Closed samsends closed 5 years ago

samsends commented 6 years ago

It would be useful to export modules in a format that can be consumed by TensorFlow Serving. Servables would increase code reuse and further enable distributed workloads.

Example Usage (note passing "serve" instead of "train"):

hub.create_module_spec(
    module_fn,
    tags_and_args=[({"serve"}, {"is_training":False})],
    drop_collections=None)
navneetrao commented 6 years ago

+1 Making hub modules consumable by TensorFlow Serving would be very helpful.

warynice commented 6 years ago

Any workarounds for this currently? I want to serve a hub module using TensorFlow Serving.

samsends commented 6 years ago

I have not tested this, but I suspect that you could load the module into an empty graph and then export it with SavedModelBuilder. We could build an automated tool.
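
Roughly, with the TF1 SavedModelBuilder API, that sketch might look like the following (untested; the module handle and export path are just placeholders):

import tensorflow as tf
import tensorflow_hub as hub

with tf.Graph().as_default():
  module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder/2")
  text = tf.placeholder(tf.string, [None])
  embedding = module(text)

  with tf.Session() as session:
    session.run([tf.global_variables_initializer(), tf.tables_initializer()])

    # Export under the "serve" tag so TensorFlow Serving can load it.
    builder = tf.saved_model.builder.SavedModelBuilder("/tmp/exported_module")
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={"text": text}, outputs={"embedding": embedding})
    builder.add_meta_graph_and_variables(
        session,
        tags=[tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                signature,
        })
    builder.save()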

andresusanopinto commented 6 years ago

For now one has to do it manually, for example:

import tensorflow as tf
import tensorflow_hub as hub

with tf.Graph().as_default():
  # Instantiate the hub module in a fresh graph and wire up a string input.
  module = hub.Module("http://tfhub.dev/google/universal-sentence-encoder/2")
  text = tf.placeholder(tf.string, [None])
  embedding = module(text)

  # Initialize variables and lookup tables, then export a SavedModel that
  # TensorFlow Serving can load (simple_save attaches the "serve" tag).
  init_op = tf.group([tf.global_variables_initializer(), tf.tables_initializer()])
  with tf.Session() as session:
    session.run(init_op)
    tf.saved_model.simple_save(
        session,
        "/tmp/serving_saved_model",
        inputs={"text": text},
        outputs={"embedding": embedding})

Each module differs slightly from others in input/output names, and each serving use case might have different requirements (e.g. raw features in vs. serialized tf.Example protos). Having users create the graph they want to serve (e.g. as done above) seems more flexible than requiring them to guess which Servo config to use and/or modify the client side each time they change the module being served.
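
For instance, a serving graph that takes serialized tf.Example protos instead of raw strings could be built along these lines (untested sketch; the feature name "text" is an assumption):

import tensorflow as tf
import tensorflow_hub as hub

with tf.Graph().as_default():
  # Accept serialized tf.Example protos and parse out the raw text feature.
  serialized = tf.placeholder(tf.string, [None])
  features = tf.parse_example(
      serialized, {"text": tf.FixedLenFeature([], tf.string)})

  module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder/2")
  embedding = module(features["text"])

  init_op = tf.group([tf.global_variables_initializer(), tf.tables_initializer()])
  with tf.Session() as session:
    session.run(init_op)
    tf.saved_model.simple_save(
        session,
        "/tmp/serving_saved_model_examples",
        inputs={"examples": serialized},
        outputs={"embedding": embedding})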

andresusanopinto commented 5 years ago

Closing as this is now obsolete.

In TF2, users should create reusable SavedModels with tf.saved_model.save().
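
For example, a minimal TF2 sketch (assuming a TF2-format handle such as universal-sentence-encoder/4) could be:

import tensorflow as tf
import tensorflow_hub as hub

# Wrap a TF2 hub model in a Keras model and export it as a SavedModel.
model = tf.keras.Sequential([
    hub.KerasLayer("https://tfhub.dev/google/universal-sentence-encoder/4",
                   input_shape=[], dtype=tf.string),
])
tf.saved_model.save(model, "/tmp/tf2_saved_model")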

chenliu0831 commented 4 years ago

@andresusanopinto quick question: does this mean that pre-trained models on TensorFlow Hub won't have the serving_default signature by default, and that users need to re-export them? Thanks

arnoegw commented 4 years ago

A TF2 SavedModel can have both signatures for deployment to TensorFlow Serving and tf.functions for reuse in a Python TensorFlow program.

See https://www.tensorflow.org/hub/tf2_saved_model#advanced_topic_what_to_expect_from_the_savedmodel_after_loading and https://www.tensorflow.org/guide/saved_model
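
A minimal sketch of such a dual-purpose export (all names here are made up for illustration):

import tensorflow as tf

class Embedder(tf.Module):
  """Toy model that is both reusable in Python and servable via a signature."""

  def __init__(self):
    super().__init__()
    self.dense = tf.keras.layers.Dense(16)

  @tf.function(input_signature=[tf.TensorSpec([None, 8], tf.float32)])
  def __call__(self, x):
    return self.dense(x)

embedder = Embedder()
embedder(tf.zeros([1, 8]))  # Trace once so variables get created.

# After loading, obj(x) works for Python reuse, and the "serving_default"
# signature is what TensorFlow Serving picks up.
tf.saved_model.save(
    embedder, "/tmp/dual_purpose_model",
    signatures={"serving_default": embedder.__call__})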

chenliu0831 commented 4 years ago

@arnoegw thanks for the pointer. I noticed an inconsistency among some TF2 models: some have a serving_default signature while others do not. For example:

My understanding is that if serving_default is not present, the model cannot be served as-is in TF Serving. Should all TF2 models have this signature? Let me know if I should open a new issue to track this.
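
If a model really has no serving signature, my understanding is that it can be re-exported with one, roughly like this (untested; the handle, output key, and path are assumptions):

import tensorflow as tf
import tensorflow_hub as hub

# Reload the TF2 SavedModel and attach an explicit serving signature.
obj = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

@tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
def serve(text):
  return {"embedding": obj(text)}

tf.saved_model.save(obj, "/tmp/use_reexported",
                    signatures={"serving_default": serve})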

arnoegw commented 4 years ago

@chenliu0831, there's nothing wrong with those examples:

chenliu0831 commented 4 years ago

@arnoegw Ah, thanks for clarifying. I gave a bad example for the second case, since that's a feature vector.

I spot-checked a few more models on TF Hub with the TF2 filter, including classification variants, and it looks like their signature maps are empty as well:

On the detail pages of the above models, I think they all show up in the "TF2.0 Saved Model" format.