hahmad2008 closed this issue 4 years ago
I was able to get this working by running the following code:
!pip install tf-nightly
import tensorflow as tf
## TFLite Conversion
# Before conversion, fix the model input size
model = tf.saved_model.load("saved_model")
model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY].inputs[0].set_shape([1, 300, 300, 3])
tf.saved_model.save(model, "saved_model_updated", signatures=model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY])
# Convert
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir='saved_model_updated', signature_keys=['serving_default'])
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()
## TFLite Interpreter to check input shape
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
# Get input and output tensors.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
# Test the model on random input data.
input_shape = input_details[0]['shape']
print(input_shape)
[ 1 300 300 3]
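The snippet above stops at printing the input shape and never performs the "test the model on random input data" step its comment mentions. A minimal numpy-only sketch of building a matching dummy input (the interpreter calls are shown as comments, since they assume the converter/interpreter objects from the snippet above are still in memory; the uint8 dtype is an assumption typical of model-zoo detection models — check `input_details[0]['dtype']`):

```python
import numpy as np

# Shape reported by the interpreter above: [1, 300, 300, 3]
input_shape = [1, 300, 300, 3]

# Random image batch matching the fixed input signature.
# uint8 is assumed here; use whatever input_details[0]['dtype'] reports.
dummy_input = np.random.randint(0, 256, size=input_shape, dtype=np.uint8)
print(dummy_input.shape)

# With the interpreter from the snippet above, inference would be:
# interpreter.set_tensor(input_details[0]['index'], dummy_input)
# interpreter.invoke()
# detections = interpreter.get_tensor(output_details[0]['index'])
```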
Feel free to re-open this if you still face an issue.
@MeghnaNatraj Thank you so much :) I really appreciate it. It works now 👯
Try this:
model = tf.saved_model.load("dir/to/model")
concrete_func = model.signatures[
    tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
concrete_func.inputs[0].set_shape([None, 224, 224, 3])  # your input size
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
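As a quick sanity check on the written file: serialized TFLite models are FlatBuffers, which place the file identifier "TFL3" at byte offset 4 (the first 4 bytes are the FlatBuffers root-table offset). A small pure-Python sketch for validating a `model.tflite` before deploying it — this only checks the identifier, not that the model is otherwise well-formed:

```python
def looks_like_tflite(path):
    """Check the FlatBuffers file identifier ("TFL3" at byte offset 4)
    that serialized TFLite models carry."""
    with open(path, "rb") as f:
        header = f.read(8)
    return len(header) == 8 and header[4:8] == b"TFL3"

# Usage (after the conversion snippet above):
# print(looks_like_tflite("model.tflite"))
```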
@utkrist-karky Thanks, your approach worked for me.
@MeghnaNatraj When saving the updated SavedModel, the input shape is not retained for me:
model = tf.saved_model.load("saved_model")
model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY].inputs[0].set_shape([1, 300, 300, 3])
tf.saved_model.save(model, "saved_model_updated", signatures=model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY])
...
model = tf.saved_model.load("saved_model_updated")
model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY].inputs[0].shape
TensorShape([None, None, None, 3])
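The TensorShape([None, None, None, 3]) above is the symptom: dimensions left as None are still dynamic, so the converter has nothing fixed to work with. A tiny pure-Python helper (a sketch, assuming TF's convention of None for unknown dimensions) for spotting this before attempting conversion:

```python
def fully_defined(shape):
    """Return True if no dimension is unknown (None), i.e. the
    signature's input shape was actually pinned down."""
    return all(dim is not None for dim in shape)

print(fully_defined([None, None, None, 3]))  # shape not retained
print(fully_defined([1, 300, 300, 3]))       # fixed, ready to convert
```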
Update, I found the issue: in from_concrete_functions I was passing model to avoid a scope issue when instantiating the converter inside a function:
concrete_func.inputs[0].set_shape(SHAPE)
converter = tf2.lite.TFLiteConverter.from_concrete_functions([concrete_func], model)
The workaround is to create a backref to the model using
concrete_func._backref_to_saved_model = model
@utkrist-karky Your suggestion worked for me. Thanks.
Thanks, it works for me in TensorFlow 2.3. Other versions of TensorFlow, such as 2.8, failed to do that. However, it seems that using model = tf.saved_model.load() is necessary; the model cannot be loaded as a Keras model (e.g. via keras.models.Model()).
System information
Command used to run the converter, or code if you're using the Python API: this link shows the code I used to convert the saved model to TensorFlow Lite.
The output from the converter invocation
The original model I used is a saved model; it is the same as all the models in the model zoo.
I trained with the Object Detection API and ran it successfully with the original .pb saved model (using this backbone model).
Any recommendations here? Thanks.