onnx / onnx-tensorflow

Tensorflow Backend for ONNX

Error converting ONNX to TF file: Cannot take the length of shape with unknown rank. #1042

Open ferqui opened 1 year ago

ferqui commented 1 year ago

Describe the bug

Hi, I want to convert an ONNX model to TensorFlow, but whether I use the CLI or a script, the conversion throws the following error.

File "/usr/local/lib/python3.7/dist-packages/onnx_tf/backend_tf_module.py", line 99, in __call__
    output_ops = self.backend._onnx_node_to_tensorflow_op(onnx_node,
File "/usr/local/lib/python3.7/dist-packages/onnx_tf/backend.py", line 347, in _onnx_node_to_tensorflow_op
    return handler.handle(node, tensor_dict=tensor_dict, strict=strict)
File "/usr/local/lib/python3.7/dist-packages/onnx_tf/handlers/handler.py", line 59, in handle
    return ver_handle(node, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/onnx_tf/handlers/backend/conv.py", line 15, in version_11
    return cls.conv(node, kwargs["tensor_dict"])
File "/usr/local/lib/python3.7/dist-packages/onnx_tf/handlers/backend/conv_mixin.py", line 30, in conv
    x_rank = len(x.get_shape())

ValueError: Cannot take the length of shape with unknown rank.
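For context, the failing call `len(x.get_shape())` only succeeds when the tensor's rank is statically known; when no static shape information survives the export, TensorFlow represents the shape with unknown rank and `len()` raises. The minimal stand-in below (not TensorFlow's real `TensorShape` class, just a sketch of its behavior) illustrates the failure mode:

```python
class ShapeSketch:
    """Minimal stand-in for tf.TensorShape, to illustrate the error.

    Built from None, it models a tensor whose *rank itself* is unknown,
    not merely a shape with unknown dimension sizes.
    """

    def __init__(self, dims):
        # dims=None means the rank is unknown
        self.dims = None if dims is None else list(dims)

    def __len__(self):
        if self.dims is None:
            raise ValueError("Cannot take the length of shape with unknown rank.")
        return len(self.dims)


# A fully known shape works: len() returns the rank.
rank = len(ShapeSketch([1, 3, 224, 224]))  # -> 4

# An unknown-rank shape reproduces the conv_mixin.py error:
# len(ShapeSketch(None)) raises ValueError
```

This is why fixes that restore static shape information at export time (see the comments below) make the error go away.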

To Reproduce

To reproduce this error, I created a Google Colab notebook.

ONNX model file

The ONNX file can be downloaded directly from the Google Colab notebook.

Python, ONNX, ONNX-TF, Tensorflow version

Rechargeablezz commented 1 year ago

Python, ONNX, ONNX-TF, Tensorflow version

  • Python version: 3.7.13 (default, Apr 24 2022, 01:04:09) [GCC 7.5.0]
  • ONNX version: 1.12.0
  • ONNX-TF version: 1.10.0
  • Tensorflow version: 2.8.2

Hello, I am currently facing the same problem as you, while converting a model from PyTorch to ONNX and then to TFLite. I have tried the solutions suggested for similar situations in other issues, but the problem persists. Do you have any suggestions?

nekitmm commented 1 year ago

Having the same issue with a similar setup. Looking a bit more into this, I see that the output from prepare does not seem right: tf_rep.tensor_dict and tf_rep.graph are both empty. This is strange because tf_rep.inputs and tf_rep.outputs both list the correct names, tf_rep.onnx_op_list reports the right stats, and so on...

Totally stuck here so any help will be appreciated!

  • Python version: 3.8.5
  • ONNX version: 1.12.0
  • ONNX-TF version: 1.10.0
  • Tensorflow version: 2.10

nekitmm commented 1 year ago

Seems like I was able to solve this problem by explicitly setting opset_version and dynamic_axes when saving from torch to onnx:

```python
torch.onnx.export(
    model,
    image,
    onnx_file,
    input_names=["input0"],
    output_names=["output0", "output1"],
    export_params=True,
    opset_version=12,
    do_constant_folding=True,
    dynamic_axes={
        "input0": {0: "batch_size"},
        "output0": {0: "batch_size"},
        "output1": {0: "batch_size"},
    },
)
```

My torch version is 1.12
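If a model has several inputs and outputs, the dynamic_axes dict above can be built programmatically. The helper below is hypothetical (not part of torch; the names batch_dynamic_axes, axis, and label are my own), but the dict it produces has exactly the shape torch.onnx.export expects:

```python
def batch_dynamic_axes(input_names, output_names, axis=0, label="batch_size"):
    """Build a dynamic_axes dict marking the given axis of every
    listed input and output tensor as dynamic."""
    return {name: {axis: label} for name in list(input_names) + list(output_names)}


axes = batch_dynamic_axes(["input0"], ["output0", "output1"])
# axes == {"input0": {0: "batch_size"},
#          "output0": {0: "batch_size"},
#          "output1": {0: "batch_size"}}
```

The result can then be passed as dynamic_axes=axes in the export call above.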

PINTO0309 commented 1 year ago

I have confirmed that this model converts successfully with onnx2tf, a tool I am developing. Note that I am only interested in the transformation of the model, not in checking whether the inference results are degraded or normal. https://github.com/PINTO0309/onnx2tf https://github.com/PINTO0309/onnx2tf/releases/tag/1.1.18

bayleef1 commented 1 year ago

Having the same issue with a similar setup; any help will be appreciated! https://github.com/snakers4/silero-vad/blob/v3.1/files/silero_vad.onnx

  • Python version: 3.8.10
  • ONNX version: 1.12.0
  • ONNX-TF version: 1.10.0
  • Tensorflow version: 2.10