guillaumegenthial / tf_ner

Simple and Efficient Tensorflow implementations of NER models with tf.estimator and tf.data
Apache License 2.0
923 stars 275 forks

What should be the `serving_input_receiver_fn` if we want to use `estimator.export_savedmodel`? #19

Closed gamerrishad closed 6 years ago

gamerrishad commented 6 years ago

If we want to use export_savedmodel(serving_input_receiver_fn=?, export_dir_base=?) with a custom estimator (ignoring the checkpoints) for production, what should serving_input_receiver_fn be? Here's an example of my script:

UPDATE:


def serving_input_fn(hyperparameters=None):
    # One placeholder per input feature; the same dict is used both as
    # receiver_tensors (what the client sends) and features (what model_fn sees)
    feature_spec = {
        'foo': tf.placeholder(dtype=tf.string, shape=[None, None]),
        'bar': tf.placeholder(dtype=tf.int32, shape=[None]),
    }
    return tf.estimator.export.ServingInputReceiver(receiver_tensors=feature_spec,
                                                    features=feature_spec)

And add a few lines in model_fn to convert the feature dict to tensors. This solution saves the model using estimator.export_savedmodel(export_dir_base='./Result/', serving_input_receiver_fn=serving_input_fn) — note that export_savedmodel expects the function itself, not the result of calling it.