Closed: chadrockey closed this issue 2 months ago
Hi,
Could you please try the ExportArchive option, following the code sample below?
import keras
import tensorflow as tf

model = keras.applications.MobileNetV3Small(input_shape=(100, 100, 3))

# Export the model with a custom endpoint
export_path = 'tflite_saved_model'
export_archive = keras.export.ExportArchive()
export_archive.add_endpoint(
    name="serve",
    fn=model.call,
    input_signature=[tf.TensorSpec(shape=(None, 100, 100, 3), dtype=tf.float32)],
)
export_archive.write_out(export_path)

# Convert the saved model using TFLiteConverter
converter = tf.lite.TFLiteConverter.from_saved_model(export_path)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable TensorFlow ops.
]
tflite_model = converter.convert()

# Print the signatures from the converted model
interpreter = tf.lite.Interpreter(model_content=tflite_model)
signatures = interpreter.get_signature_list()
print(signatures)
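As a follow-up sketch (not part of the original snippet), the converted model can be exercised through the TFLite signature runner. The input key is read from the signature definition rather than hard-coded, since its exact name depends on how the endpoint was traced.

import numpy as np

# Sketch: run the converted model through the "serve" signature.
# Assumes the conversion above completed and `interpreter` is still in scope.
runner = interpreter.get_signature_runner("serve")
input_name = list(runner.get_input_details().keys())[0]  # key depends on the traced endpoint
dummy = np.random.rand(1, 100, 100, 3).astype(np.float32)
outputs = runner(**{input_name: dummy})
print({name: value.shape for name, value in outputs.items()})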
Hello, I was getting an error when calling export() on a DeepLabV3Plus model created with the from_preset() method, and this solution worked. Thank you. The output I was getting:
return fn(*args, **kwargs)
TypeError: Exception encountered when calling UpSampling2D.call().
unsupported operand type(s) for *: 'NoneType' and 'int'
Arguments received by UpSampling2D.call():
• inputs=tf.Tensor(shape=(None, None, None, 256), dtype=float32)
It seems model.export() might be broken for some keras_cv models.
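For anyone hitting the same UpSampling2D error, here is a minimal sketch of the workaround applied to a keras_cv segmentation model. The preset name and input size are assumptions for illustration; the point is that tracing the endpoint with fixed spatial dimensions avoids the NoneType multiplication inside UpSampling2D.

import keras
import keras_cv
import tensorflow as tf

# Sketch only: preset name and input size are illustrative assumptions.
model = keras_cv.models.DeepLabV3Plus.from_preset(
    "deeplab_v3_plus_resnet50_pascalvoc"  # hypothetical preset name
)
export_archive = keras.export.ExportArchive()
export_archive.add_endpoint(
    name="serve",
    fn=model.call,
    # Fixed spatial dimensions so UpSampling2D sees concrete sizes.
    input_signature=[tf.TensorSpec(shape=(None, 512, 512, 3), dtype=tf.float32)],
)
export_archive.write_out("deeplab_saved_model")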
@luca-mala thanks for testing, I am behind in verifying their suggestion.
Could we get more information on what the difference between export and export archive is? I'm glad there's potentially a workaround, but I would still expect export to work, right?
@chadrockey,
The advantage of ExportArchive over export() is that ExportArchive preserves training information, including training metadata, custom objects, and optimizer state, along with the model architecture, which is useful for resuming training or transferring models between environments.
ExportArchive is generally more comprehensive and is better for scenarios where you need to preserve the full state of the model, including training information. The export() method is more focused on creating deployment-ready versions of the model.
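For comparison, a minimal sketch of the two paths side by side. The directory names are placeholders, and track() is included as in the Keras documentation (the sample earlier in this thread omits it).

import keras
import tensorflow as tf

model = keras.applications.MobileNetV3Small(input_shape=(100, 100, 3))

# Path 1: one-line export to a TF SavedModel. This is the call the issue
# reports as failing for some keras_cv models.
model.export("saved_model_simple")

# Path 2: ExportArchive with an explicit endpoint and input signature,
# i.e. the workaround suggested above.
archive = keras.export.ExportArchive()
archive.track(model)  # registers the model's variables with the archive
archive.add_endpoint(
    name="serve",
    fn=model.call,
    input_signature=[tf.TensorSpec(shape=(None, 100, 100, 3), dtype=tf.float32)],
)
archive.write_out("saved_model_archive")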
This issue is stale because it has been open for 14 days with no activity. It will be closed if no further activity occurs. Thank you.
This issue was closed because it has been inactive for 28 days. Please reopen if you'd like to work on this further.
Current Behavior:
When trying to export a model that uses ImageClassifier to TensorFlow SavedModel format, it fails with either:
OR (maybe when I tried a fixed image size):
AttributeError: 'ImageClassifier' object has no attribute '_get_save_spec'
Expected Behavior:
I believe Keras 3 should be able to export to TensorFlow SavedModel format. Otherwise, I would expect the TFLite conversion to work without the intermediate save.
Steps To Reproduce:
https://colab.research.google.com/gist/chadrockey/b2e9fbe7cefc429967791b6c17be2341/imageclassiferexport.ipynb
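For reference, a minimal sketch of the failing pattern described above. The preset name and num_classes are assumptions for illustration; the linked notebook is the authoritative reproduction.

import keras_cv

# Sketch of the reported failure (preset name is illustrative only).
model = keras_cv.models.ImageClassifier.from_preset(
    "resnet50_imagenet", num_classes=2
)
model.export("image_classifier_saved_model")  # reported to raise the errors above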
Version:
keras 3.3.3, keras_cv 0.9.0