tensorflow / tensorflow


Input size of converted lite model doesn't match the original model input size #42114

Closed hahmad2008 closed 4 years ago

hahmad2008 commented 4 years ago

System information

Command used to run the converter or code if you're using the Python API: this is the code I used to convert the saved model to TensorFlow Lite:

!pip install tf-nightly
import tensorflow as tf

model_dir = 'saved_model'
converter = tf.lite.TFLiteConverter.from_saved_model(model_dir, signature_keys=['serving_default'])

converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.experimental_new_converter = True
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]

tflite_model = converter.convert()

#open("saved_model/converted_model.tflite", "wb").write(tflite_model)
open('tflite_model.tflite','wb').write(tflite_model)

The output from the converter invocation

The original model I used is a SavedModel; it is the same as the models in the model zoo.

I trained a model with the Object Detection API and ran it successfully with the original .pb SavedModel (using this backbone model).
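
One quick check is to inspect the serving signature's declared input shape before converting (a minimal sketch; 'saved_model' is the same directory as above):

import tensorflow as tf

# Load the SavedModel and print the input shape of its serving signature.
model = tf.saved_model.load('saved_model')
signature = model.signatures['serving_default']
print(signature.inputs[0].shape)  # e.g. (None, None, None, 3) for a dynamic-shape detector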

Any recommendations here? Thanks.

MeghnaNatraj commented 4 years ago

I was able to get this working by running the following code:

!pip install tf-nightly
import tensorflow as tf

## TFLite Conversion
# Before conversion, fix the model input size
model = tf.saved_model.load("saved_model")
model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY].inputs[0].set_shape([1, 300, 300, 3])
tf.saved_model.save(model, "saved_model_updated", signatures=model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY])
# Convert
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir='saved_model_updated', signature_keys=['serving_default'])
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()

## TFLite Interpreter to check input shape
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

# Get input and output tensors.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Test the model on random input data.
input_shape = input_details[0]['shape']
print(input_shape)
[  1 300 300   3]
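
To verify the conversion end to end, here is a minimal sketch of running the interpreter on random input (assuming a float32 input; check input_details[0]['dtype'] if unsure):

import numpy as np

# Feed random data matching the fixed input shape and run one inference.
random_input = np.array(np.random.random_sample(input_shape), dtype=np.float32)
interpreter.set_tensor(input_details[0]['index'], random_input)
interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]['index'])
print(output_data.shape)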

Feel free to re-open this if you still face an issue.

hahmad2008 commented 4 years ago

@MeghnaNatraj Thank you so much :) I really appreciate it. It works now 👯

utkrist-karky commented 2 years ago

Try this:

model = tf.saved_model.load("dir/to/model")
concrete_func = model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
concrete_func.inputs[0].set_shape([None, 224, 224, 3])  # your input size
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
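
If re-exporting the SavedModel is not convenient, an alternative (not from this thread) is to resize the input of an already-converted model through the interpreter; this only works if the converted model actually accepts the new shape. A sketch with placeholder path and shape:

import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='model.tflite')
# Resize the first input to a concrete shape before allocating tensors.
input_index = interpreter.get_input_details()[0]['index']
interpreter.resize_tensor_input(input_index, [1, 224, 224, 3])
interpreter.allocate_tensors()
print(interpreter.get_input_details()[0]['shape'])  # [  1 224 224   3]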

creativesh commented 2 years ago

@utkrist-karky Thanks, your approach worked for me.

patrickjdarrow commented 1 year ago

@MeghnaNatraj When saving the updated SavedModel, the input shape is not retained for me:

model = tf.saved_model.load("saved_model")
model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY].inputs[0].set_shape([1, 300, 300, 3])
tf.saved_model.save(model, "saved_model_updated", signatures=model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY])
...
model = tf.saved_model.load("saved_model_updated")
model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY].inputs[0].shape

TensorShape([None, None, None, 3])

Update, I found the issue:

In from_concrete_functions, I was passing model to avoid a scope issue when instantiating the converter inside of a function.

concrete_func.inputs[0].set_shape(SHAPE)
converter = tf2.lite.TFLiteConverter.from_concrete_functions([concrete_func], model)

The workaround is to create a backref to the model using

concrete_func._backref_to_saved_model = model
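
Putting it together, a minimal sketch of the full workaround (path and SHAPE are placeholders):

import tensorflow as tf

SHAPE = [1, 300, 300, 3]  # placeholder input shape
model = tf.saved_model.load('saved_model')  # placeholder path
concrete_func = model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
concrete_func.inputs[0].set_shape(SHAPE)
# Keep the SavedModel reachable from the concrete function so the converter
# can still find it when this runs inside another function's scope.
concrete_func._backref_to_saved_model = model
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
tflite_model = converter.convert()
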
karthik87s commented 7 months ago

@utkrist-karky Your suggestion worked for me. Thanks.

Monaco12138 commented 5 months ago

> I was able to get this working by running the following code: [...]

Thanks, it works for me in TensorFlow 2.3. Other versions of TensorFlow, like 2.8, failed to do that. However, it seems that using model = tf.saved_model.load() is necessary; the model cannot be loaded as a Keras model.
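
For illustration, a minimal sketch of the distinction (the Keras call is only shown as the approach that may fail, since Object Detection API exports are not Keras models):

import tensorflow as tf

# Works: returns a trackable object that exposes .signatures for conversion.
model = tf.saved_model.load('saved_model')

# May fail for Object Detection API exports, which are not Keras models:
# model = tf.keras.models.load_model('saved_model')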