onnx / onnx-tensorflow

Tensorflow Backend for ONNX

Exported TensorFlow Lite model doesn't use ONNX model input and output names #1047

Open linmeimei0512 opened 1 year ago

linmeimei0512 commented 1 year ago

My environment is:

My final goal: convert an ONNX model to TFLite.

Question: My ONNX model's input name is [input] and its output names are [output1, output2]. (screenshot: 2022-10-27 17-03-40)

I convert the ONNX model to TensorFlow using the SavedModel format in TensorFlow 2.x:

import onnx
from onnx_tf.backend import prepare

# Load the ONNX model and export it as a TensorFlow SavedModel.
onnx_model = onnx.load(onnx_model_path)
onnx_tf_exporter = prepare(onnx_model)
onnx_tf_exporter.export_graph(tensorflow_model_output_path)

When I display the output model (saved_model.pb) in Netron, the input name does not match the ONNX one. (screenshot: 2022-10-27 17-02-48)
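
As far as I can tell, the renaming affects the raw graph tensors, while the SavedModel's serving signature should still carry the logical names taken from the ONNX graph. A minimal sketch to check this, assuming the export above and the serving_default key:

import tensorflow as tf

# Load the exported SavedModel and inspect its serving signature.
loaded = tf.saved_model.load(tensorflow_model_output_path)
sig = loaded.signatures['serving_default']

# The structured specs are keyed by the logical (ONNX-derived) names,
# even though Netron shows the renamed internal tensors.
print(sig.structured_input_signature)
print(sig.structured_outputs)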

Then I convert the SavedModel to TFLite:

import tensorflow as tf

# Convert the SavedModel to a TFLite flatbuffer with default optimizations.
converter = tf.lite.TFLiteConverter.from_saved_model(tensorflow_model_path)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open(tensorflow_lite_model_output_path, 'wb') as f:
    f.write(tflite_model)

The TFLite model's input name matches the SavedModel's [serving_default_input:0], and the output names are [StatefulPartitionedCall:0, StatefulPartitionedCall:1]. (screenshot: 2022-10-27 17-18-23)
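
The same names can also be read programmatically instead of via Netron; a small check, assuming the converted file from above:

import tensorflow as tf

# Inspect the raw tensor names the TFLite runtime exposes.
interpreter = tf.lite.Interpreter(model_path=tensorflow_lite_model_output_path)
interpreter.allocate_tensors()
print([d['name'] for d in interpreter.get_input_details()])   # e.g. ['serving_default_input:0']
print([d['name'] for d in interpreter.get_output_details()])  # e.g. ['StatefulPartitionedCall:0', ...]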

How can I make the TFLite input and output names equal to the ONNX ones?
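
One possible workaround, not confirmed by this thread: recent TF versions preserve SavedModel signatures in the TFLite flatbuffer, so the model can be driven by signature keys, which carry the ONNX-derived names, instead of the raw tensor names. A minimal sketch, assuming TF >= 2.5 and the serving_default signature:

import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path=tensorflow_lite_model_output_path)

# Signature keys mirror the SavedModel signature, which onnx-tf derives
# from the ONNX graph's input/output names.
print(interpreter.get_signature_list())

# Run inference addressing tensors by their ONNX names.
runner = interpreter.get_signature_runner('serving_default')
outputs = runner(input=np.zeros((1, 3, 224, 224), dtype=np.float32))  # shape is a hypothetical example
print(list(outputs.keys()))  # expected: ['output1', 'output2']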

PINTO0309 commented 1 year ago

Duplicate of https://github.com/onnx/onnx-tensorflow/issues/984