tensorflow / nmt

TensorFlow Neural Machine Translation Tutorial
Apache License 2.0

Create .pb/.pbtxt for serving #294

Closed: ptamas88 closed this issue 6 years ago

ptamas88 commented 6 years ago

Hello everyone! I have a fully trained model, and inference works with the following command:

python -m nmt.nmt --out_dir=nmt/tmp/nmt_model --inference_input_file=nmt/tmp/viet_test.txt --inference_output_file=nmt/tmp/viet_testout.txt

After this, I would like to hand the model to TensorFlow Serving, which I already tried with the MNIST example (that worked). These are the files I have after training:

(screenshot of the out_dir contents after training)

The following serving command requires a .pb or .pbtxt file named saved_model:

saved_model_cli show --dir .../Data/1

It should return the tags found in the frozen model, e.g. serve.

Can anyone help me figure out how to get a working .pb/.pbtxt file after training? I have seen previous issues mentioning this subject, but haven't really seen any solution yet. Thanks
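For reference, the directory passed to saved_model_cli is expected to look roughly like this (a generic sketch; the exact variables shard names depend on the export):

.../Data/1/
    saved_model.pb
    variables/
        variables.data-00000-of-00001
        variables.index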

ptamas88 commented 6 years ago

The code in the following issue finally solved my five-day struggle: https://github.com/tensorflow/serving/issues/712

Next step: a working client py file
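A minimal client sketch along the lines of the standard TensorFlow Serving gRPC examples might look like this (the host/port, the model name "nmt", the signature name, and the input key are all assumptions that have to match the exported signature and the server flags):

# sketch of a TF Serving gRPC client; names and ports below are assumptions
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

channel = grpc.insecure_channel("localhost:8500")          # assumed server address
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "nmt"                            # assumed --model_name on the server
request.model_spec.signature_name = "serving_default"      # assumed signature name
request.inputs["inputs"].CopyFrom(                         # "inputs" must match the signature key
    tf.make_tensor_proto(["hello world"], shape=[1]))

result = stub.Predict(request, 10.0)                       # 10 second timeout
print(result.outputs)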

mohammedayub44 commented 5 years ago

@ptamas88 I'm running into the same problem, not for TensorFlow Serving but for using the model in Keras. I need to generate the .pb or .pbtxt file from the ckpt files. Could you briefly explain the steps you took to generate the .pb or .pbtxt file? I got lost where it said you need to create a signature, etc. If you could share the code, that would be awesome too.

Appreciate any help !

-Mohammed Ayub

ptamas88 commented 5 years ago

Finally I found out that the weights had not been saved into the .pb file. You have to define the tensors you would like to include in the frozen model. I haven't tried it in Keras, but it seems easier that way since you can print out a summary of the model. On the other hand, I don't know how to define a signature in Keras. I made it with native TF code in a Keras classification project. I will copy the code here sometime over the weekend.

mohammedayub44 commented 5 years ago

@ptamas88 Thanks for your prompt reply. I want to try the easiest option first, which is to not run the model again (because of budget and time constraints). I saw this on SO about generating the .pb file from just the ckpt meta files; not sure if it will work. I will try it sometime today: https://stackoverflow.com/questions/45864363/tensorflow-how-to-convert-meta-data-and-index-model-files-into-one-graph-pb
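That kind of answer seems to boil down to roughly the following (a sketch; the checkpoint prefix and especially the output node name are assumptions that have to match the actual graph):

# sketch: freeze a checkpoint (.meta + .index + .data) into a single GraphDef .pb
import tensorflow as tf

with tf.Session() as sess:
    # import the graph structure from the .meta file and restore the weights
    saver = tf.train.import_meta_graph("translate.ckpt-372000.meta")
    saver.restore(sess, "translate.ckpt-372000")

    # "output_node" is a placeholder name; it must be the real output op of the graph
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, ["output_node"])

    with tf.gfile.GFile("frozen_model.pb", "wb") as f:   # assumed output file name
        f.write(frozen_graph_def.SerializeToString())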

Thanks for sharing your code, it will save me weeks of pain.

-Mohammed Ayub

mohammedayub44 commented 5 years ago

@ptamas88 Just as an update: with the SO link I was able to generate a .pb file. But it still doesn't solve my original problem of loading this into Keras.
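For what it's worth, the frozen .pb can at least be imported back into a plain TensorFlow graph, even if rebuilding a Keras Model object from it is another matter (a sketch; the file name and tensor name are assumptions):

# sketch: load a frozen GraphDef back into plain TensorFlow (not a Keras Model)
import tensorflow as tf

with tf.gfile.GFile("frozen_model.pb", "rb") as f:        # assumed file name
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")
    # tensors can then be looked up by name, e.g.:
    # output = graph.get_tensor_by_name("output_node:0")  # placeholder name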

ptamas88 commented 5 years ago

@mohammedayub44 Here is the code that I ran after a Keras fit cycle. Are you trying to load the frozen model back into Keras? I think that's not possible from a .pb file.

# serving imports
import tensorflow as tf
from keras import backend as K
from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model import tag_constants, signature_constants

# use a single session shared between Keras and the SavedModel export
sess = tf.Session()
K.set_session(sess)
# set the learning phase to "test" before the model is built, so dropout/batch-norm
# layers are exported in inference mode
K.set_learning_phase(0)

...  # build and fit the Keras model here (omitted)

# saving the model for serving

x = model.input
y = model.output

# signature mapping the serving input/output names to the model tensors
classification_signature = tf.saved_model.signature_def_utils.predict_signature_def(
    {"inputs": x}, {"prediction": y})

valid_classification_signature = tf.saved_model.signature_def_utils.is_valid_signature(
    classification_signature)
if not valid_classification_signature:
    raise ValueError("Error: Prediction signature not valid!")

model_version = 1  # export version directory name (001 is not a valid literal in Python 3)

builder = saved_model_builder.SavedModelBuilder("./" + str(model_version))
legacy_init_op = tf.group(tf.tables_initializer(), name="legacy_init_op")

builder.add_meta_graph_and_variables(
    sess,
    [tag_constants.SERVING],
    signature_def_map={
        signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: classification_signature},
    legacy_init_op=legacy_init_op
)

builder.save()
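Once builder.save() has written the export, it can be inspected and served along these lines (a sketch; paths and the model name are assumptions, and --model_base_path must be the absolute path to the directory containing the numbered version folders):

saved_model_cli show --dir ./1 --all
tensorflow_model_server --port=8500 --model_name=my_model --model_base_path=/absolute/path/to/export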

mohammedayub44 commented 5 years ago

@ptamas88 Thanks. And the 'model' object: did you recreate it, or reload it from the checkpoint file (translate.ckpt-372000)?

Also, is there a way to get the inputs and outputs from the meta graph file (.meta file) so I can pass them to the signature directly?
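One way (a sketch; the tensor names below are placeholders that have to be replaced with the real names found in your own graph) is to import the meta graph, print the node names, and then fetch the tensors by name:

# sketch: recover input/output tensors from a .meta file by name
import tensorflow as tf

sess = tf.Session()
saver = tf.train.import_meta_graph("translate.ckpt-372000.meta")
saver.restore(sess, "translate.ckpt-372000")

graph = tf.get_default_graph()
# list node names to locate the actual input/output ops
for node in graph.as_graph_def().node:
    print(node.name)

# "input_placeholder:0" and "output_op:0" are placeholders; use the names printed above
x = graph.get_tensor_by_name("input_placeholder:0")
y = graph.get_tensor_by_name("output_op:0")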