Open ferqui opened 1 year ago
Describe the bug
Hi, I want to convert an ONNX model to TensorFlow, but when I try to convert it, using either the CLI or a script, it throws the following error.
File "/usr/local/lib/python3.7/dist-packages/onnx_tf/backend_tf_module.py", line 99, in __call__
  output_ops = self.backend._onnx_node_to_tensorflow_op(onnx_node,
File "/usr/local/lib/python3.7/dist-packages/onnx_tf/backend.py", line 347, in _onnx_node_to_tensorflow_op
  return handler.handle(node, tensor_dict=tensor_dict, strict=strict)
File "/usr/local/lib/python3.7/dist-packages/onnx_tf/handlers/handler.py", line 59, in handle
  return ver_handle(node, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/onnx_tf/handlers/backend/conv.py", line 15, in version_11
  return cls.conv(node, kwargs["tensor_dict"])
File "/usr/local/lib/python3.7/dist-packages/onnx_tf/handlers/backend/conv_mixin.py", line 30, in conv
  x_rank = len(x.get_shape())
ValueError: Cannot take the length of shape with unknown rank.
To Reproduce
To reproduce this error, I created a Google Colab notebook.
ONNX model file
The ONNX file can be downloaded directly in the Google Colab notebook.
Python, ONNX, ONNX-TF, Tensorflow version
- Python version: 3.7.13 (default, Apr 24 2022, 01:04:09) [GCC 7.5.0]
- ONNX version: 1.12.0
- ONNX-TF version: 1.10.0
- Tensorflow version: 2.8.2
Hello, I am currently facing the same problem while converting a model from PyTorch to ONNX and then to TFLite. I have tried the solutions suggested for similar situations in other issues, but the problem still exists. Do you have any suggestions?
Having the same issue with a similar setup. Looking a bit more into this, I see that the output from prepare does not seem right: tf_rep.tensor_dict and tf_rep.graph are just empty. This is weird, because tf_rep.inputs and tf_rep.outputs both list the correct names, tf_rep.onnx_op_list gives the right stats, and so on...
Totally stuck here so any help will be appreciated!
- Python version: 3.8.5
- ONNX version: 1.12.0
- ONNX-TF version: 1.10.0
- Tensorflow version: 2.10
Seems like I was able to solve this problem by explicitly setting opset_version and dynamic_axes when saving from torch to onnx:
torch.onnx.export(
    model,
    image,
    onnx_file,
    input_names=["input0"],
    output_names=["output0", "output1"],
    export_params=True,
    opset_version=12,
    do_constant_folding=True,
    dynamic_axes={
        "input0": {0: "batch_size"},
        "output0": {0: "batch_size"},
        "output1": {0: "batch_size"},
    },
)
My torch version is 1.12
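For reference, this is roughly how the re-exported file can then be converted with onnx-tf — a sketch assuming the standard onnx.load / prepare / export_graph API; the file paths here are placeholders:

```python
import onnx
from onnx_tf.backend import prepare

# Load the ONNX file produced by the torch.onnx.export call above
# ("model.onnx" is a placeholder path).
onnx_model = onnx.load("model.onnx")

# prepare() builds the TensorFlow representation; this is the step
# that raises "Cannot take the length of shape with unknown rank"
# when an input tensor carries no rank information.
tf_rep = prepare(onnx_model)

# Write out a TensorFlow SavedModel directory.
tf_rep.export_graph("model_tf")
```

With opset_version and dynamic_axes set as above, the exported graph keeps a known rank for its inputs, so the Conv handler's len(x.get_shape()) no longer fails.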
I have confirmed that the model converts successfully with a tool I am creating, onnx2tf.
Note that I am only interested in the transformation of the model itself, not in checking whether the inference results of the model are degraded or normal.
https://github.com/PINTO0309/onnx2tf
https://github.com/PINTO0309/onnx2tf/releases/tag/1.1.18
{
"format_version": 1,
"operations": [
{
"op_name": "Reshape_448",
"param_target": "outputs",
"param_name": "onnx::Add_817",
"post_process_transpose_perm": [0,2,3,1]
}
]
}
onnx2tf -i vae_encoder.onnx -prf replace.json
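If you adapt the replace.json above for your own model, a quick stdlib check can catch missing keys before running onnx2tf. This is just a sketch based on the example file shown above; the actual op_name / param_name values are model-specific:

```python
import json

# Keys present in every operation entry of the example replace.json above.
REQUIRED_OP_KEYS = {"op_name", "param_target", "param_name"}

def validate_replace_json(text):
    """Return a list of problems found in an onnx2tf parameter-replacement file."""
    problems = []
    data = json.loads(text)
    if data.get("format_version") != 1:
        problems.append("format_version should be 1")
    for i, op in enumerate(data.get("operations", [])):
        missing = REQUIRED_OP_KEYS - op.keys()
        if missing:
            problems.append(f"operation {i} is missing keys: {sorted(missing)}")
    return problems

example = """
{
  "format_version": 1,
  "operations": [
    {
      "op_name": "Reshape_448",
      "param_target": "outputs",
      "param_name": "onnx::Add_817",
      "post_process_transpose_perm": [0, 2, 3, 1]
    }
  ]
}
"""
print(validate_replace_json(example))  # an empty list means no problems were found
```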
Having the same issue with a similar setup; any help will be appreciated! https://github.com/snakers4/silero-vad/blob/v3.1/files/silero_vad.onnx
- Python version: 3.8.10
- ONNX version: 1.12.0
- ONNX-TF version: 1.10.0
- Tensorflow version: 2.10