PINTO0309 / onnx2tf

Self-Created Tools to convert ONNX files (NCHW) to TensorFlow/TFLite/Keras format (NHWC). The purpose of this tool is to solve the massive Transpose extrapolation problem in onnx-tensorflow (onnx-tf). I don't need a Star, but give me a pull request.
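For context, a typical conversion is run from the command line; a minimal sketch using the encoder.onnx discussed in this issue (the output defaults to a saved_model directory):

onnx2tf -i encoder.onnx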
MIT License

ONNX to TFLite on Windows 11 #540

Closed. Dmiriy closed this issue 1 year ago.

Dmiriy commented 1 year ago

Issue Type

Feature Request, Others

OS

Windows

onnx2tf version number

1.18.14

onnx version number

1.14.1

onnxruntime version number

1.14.1

onnxsim (onnx_simplifier) version number

0.4.33

tensorflow version number

2.14.1

Download URL for ONNX

https://huggingface.co/alphacep/vosk-model-small-ru/blob/main/am/encoder.onnx

Parameter Replacement JSON

N/A

Description

1. Research
2. The conversion fails with the following error:

INFO: 5 / 3395
INFO: onnx_op_type: Conv onnx_op_name: /encoder_embed/conv/conv.0/Conv
INFO:  input_name.1: /encoder_embed/Unsqueeze_output_0 shape: ['N', 1, 'T', 80] dtype: float32
INFO:  input_name.2: encoder_embed.conv.0.weight shape: [8, 1, 3, 3] dtype: float32
INFO:  input_name.3: encoder_embed.conv.0.bias shape: [8] dtype: float32
INFO:  output_name.1: /encoder_embed/conv/conv.0/Conv_output_0 shape: ['N', 8, 'unk__0', 80] dtype: float32
ERROR: The trace log is below.
Traceback (most recent call last):
  File "C:\Users\pastukhovdy\python\lib\site-packages\onnx2tf\utils\common_functions.py", line 309, in print_wrapper_func      
    result = func(*args, **kwargs)
  File "C:\Users\pastukhovdy\python\lib\site-packages\onnx2tf\utils\common_functions.py", line 382, in inverted_operation_enable_disable_wrapper_func
    result = func(*args, **kwargs)
  File "C:\Users\pastukhovdy\python\lib\site-packages\onnx2tf\ops\Conv.py", line 453, in make_node
    conv_bias(
  File "C:\Users\pastukhovdy\python\lib\site-packages\onnx2tf\ops\Conv.py", line 302, in conv_bias
    tf.nn.convolution(
  File "C:\Users\pastukhovdy\python\lib\site-packages\tensorflow\python\util\traceback_utils.py", line 153, in error_handler   
    raise e.with_traceback(filtered_tb) from None
  File "C:\Users\pastukhovdy\python\lib\site-packages\keras\src\layers\core\tf_op_layer.py", line 119, in handle
    return TFOpLambda(op)(*args, **kwargs)
  File "C:\Users\pastukhovdy\python\lib\site-packages\keras\src\utils\traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
ValueError: Exception encountered when calling layer "tf.nn.convolution" (type TFOpLambda).

Negative dimension size caused by subtracting 3 from 1 for '{{node tf.nn.convolution/convolution}} = Conv2D[T=DT_FLOAT, data_format="NHWC", dilations=[1, 1, 1, 1], explicit_paddings=[], padding="VALID", strides=[1, 1, 1, 1], use_cudnn_on_gpu=true](Placeholder, tf.nn.convolution/convolution/filter)' with input shapes: [?,1,82,?], [3,3,1,8].

3. - 4. Try the model on Coral (TPU).
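The ValueError above reports an NHWC input of [?, 1, 82, ?]: with a height of 1 and a 3x3 kernel under VALID padding, the output height would be 1 - 3 + 1 = -1. The same TensorFlow error can be reproduced in isolation; a minimal sketch, where the concrete batch and channel sizes of 1 are assumptions for illustration:

import numpy as np
import tensorflow as tf

x = np.zeros((1, 1, 82, 1), dtype=np.float32)  # NHWC input whose height has collapsed to 1
w = np.zeros((3, 3, 1, 8), dtype=np.float32)   # 3x3 kernel, 1 input channel, 8 output channels
y = tf.nn.convolution(x, w, padding="VALID")   # raises: Negative dimension size caused by subtracting 3 from 1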

PINTO0309 commented 1 year ago

How can I get inference to run correctly in onnxruntime before converting to TFLite? When I forcibly change the inputs to a fixed shape and run inference, onnxruntime aborts:

onnxsim encoder.onnx encoder.onnx --overwrite-input-shape "x:1,100,80" "x_lens:1"

sit4onnx -if encoder.onnx -oep cpu

Traceback (most recent call last):
  File "/home/b920405/.local/bin/sit4onnx", line 8, in <module>
    sys.exit(main())
  File "/home/b920405/.local/lib/python3.10/site-packages/sit4onnx/onnx_inference_test.py", line 506, in main
    final_results = inference(
  File "/home/b920405/.local/lib/python3.10/site-packages/sit4onnx/onnx_inference_test.py", line 357, in inference
    results = onnx_session.run(
  File "/home/b920405/.local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 220, in run
    return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running Expand node. Name:'/Expand' Status Message: invalid expand shape
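
The same kind of pre-conversion check can be scripted directly against onnxruntime; a minimal sketch, assuming the input names and shapes from the onnxsim command above (x as float32 [1,100,80], x_lens as int64 [1]; the dtypes are assumptions):

import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("encoder.onnx", providers=["CPUExecutionProvider"])
x = np.random.rand(1, 100, 80).astype(np.float32)
x_lens = np.array([100], dtype=np.int64)
# Fails with the same 'invalid expand shape' error at the /Expand node if the fixed shapes are inconsistent with the model
outputs = sess.run(None, {"x": x, "x_lens": x_lens})
print([o.shape for o in outputs])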
Dmiriy commented 1 year ago

Thanks for the answer. I don't have a solution yet.