onnx / onnx-tensorflow

Tensorflow Backend for ONNX

Empty tensor dict #975

Closed: piseabhijeet closed this issue 3 years ago

piseabhijeet commented 3 years ago

Describe the bug

I am trying to convert a PyTorch model to ONNX and then to TensorFlow Serving format. The model gets saved as an ONNX file and as a .pb file, however I am not able to serve it. When I tried to debug the cause, I noticed that after loading the ONNX model and preparing it, `tensor_dict` returns an empty dictionary with no tensors.

To Reproduce

Create a multi-output PyTorch model (one whose forward pass returns multiple outputs as a dictionary), convert it to ONNX, then load and prepare it:

```python
import onnx
from onnx_tf.backend import prepare

# onnx_file, input_keys, output_keys and meta_file are defined earlier
# in the linked gist.
model = onnx.load(onnx_file)
tf_rep = prepare(model)
tf_dict = tf_rep.tensor_dict  # <-- comes back empty
input_tensor_names = {key: tf_dict[key] for key in input_keys}
output_tensor_names = {key: tf_dict[key] for key in output_keys}
tf_rep.export_graph(meta_file)
```

Source code taken from: https://gist.github.com/bfreskura/bd3602b1dd3e076b96e6c4da7e8f9a82#file-pytorch_to_tf_serving-py
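
For reference, the PyTorch-side export looks roughly like this. This is a minimal, hypothetical stand-in, not the real model from the gist; the class name, layer sizes, file name and output names are all placeholders:

```python
import torch

# Hypothetical stand-in for a multi-output model whose forward pass
# returns a dict of tensors, as described above.
class MultiOutputNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = torch.nn.Linear(16, 8)
        self.head_a = torch.nn.Linear(8, 4)
        self.head_b = torch.nn.Linear(8, 2)

    def forward(self, x):
        feats = self.backbone(x)
        return {"out_a": self.head_a(feats), "out_b": self.head_b(feats)}

model = MultiOutputNet().eval()
dummy_input = torch.randn(1, 16)

# Give the graph explicit input/output names; output_names is applied
# positionally to the flattened outputs, and these are the names one
# would later look up in tf_rep.tensor_dict.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["out_a", "out_b"],
    opset_version=11,
)
```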

A self-contained piece of code that can demonstrate the problem is required.

Please do not expect us to have PyTorch or Caffe2 installed.

If a model exported from PyTorch or Caffe2 is having trouble in ONNX-TF, use the next section to attach the model.

ONNX model file

If applicable, attach the onnx model file in question using Gist, DropBox or Google Drive.

Python, ONNX, ONNX-TF, Tensorflow version

This section can be obtained by running get_version.py from the util folder.

Additional context

Add any other context about the problem here.

chudegao commented 3 years ago

To generate the tensor dict, you need to pass an extra parameter when preparing the model: `tf_rep = prepare(model, gen_tensor_dict=True)`
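
Applied to the snippet above, the full flow would look roughly like this (a sketch only; `onnx_file` and `meta_file` are the same placeholders as in the original code):

```python
import onnx
from onnx_tf.backend import prepare

model = onnx.load(onnx_file)

# Ask onnx-tf to build the tensor dict while preparing the backend rep;
# without this flag, tensor_dict comes back empty.
tf_rep = prepare(model, gen_tensor_dict=True)

tf_dict = tf_rep.tensor_dict
print(list(tf_dict.keys()))  # should now list the graph's tensor names

# Export for serving; the on-disk format depends on the onnx-tf version.
tf_rep.export_graph(meta_file)
```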

piseabhijeet commented 3 years ago

Thank you for the prompt response @chudegao. It worked :)