Closed Dmiriy closed 1 year ago
How can I get inference to work correctly in onnxruntime before converting to TFLite? When I crudely change the inputs to a fixed shape and run inference, onnxruntime aborts.
```shell
onnxsim encoder.onnx encoder.onnx --overwrite-input-shape "x:1,100,80" "x_lens:1"
sit4onnx -if encoder.onnx -oep cpu
```
```
Traceback (most recent call last):
  File "/home/b920405/.local/bin/sit4onnx", line 8, in <module>
    sys.exit(main())
  File "/home/b920405/.local/lib/python3.10/site-packages/sit4onnx/onnx_inference_test.py", line 506, in main
    final_results = inference(
  File "/home/b920405/.local/lib/python3.10/site-packages/sit4onnx/onnx_inference_test.py", line 357, in inference
    results = onnx_session.run(
  File "/home/b920405/.local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 220, in run
    return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running Expand node. Name:'/Expand' Status Message: invalid expand shape
```
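The `invalid expand shape` message means the target shape computed at runtime for the `Expand` node is not broadcast-compatible with its input. A plausible cause (an assumption, not confirmed from this model) is that after forcing `x:1,100,80` and `x_lens:1`, the test harness feeds an `x_lens` value that no longer agrees with the fixed length 100, so a shape derived from it clashes with a shape baked in by onnxsim. As a rough, model-independent illustration, ONNX `Expand` follows NumPy-style broadcasting rules:

```python
import numpy as np

# ONNX Expand uses multidirectional (NumPy-style) broadcasting:
# each target dimension must equal the input dimension, or the
# input dimension must be 1.
x = np.ones((1, 100, 80), dtype=np.float32)

# Compatible target shape: the leading 1 broadcasts to 2.
ok = np.broadcast_to(x, (2, 100, 80))
print(ok.shape)  # (2, 100, 80)

# Incompatible target shape: 100 cannot become 50. This ValueError is
# the NumPy analogue of onnxruntime's "invalid expand shape" failure.
try:
    np.broadcast_to(x, (1, 50, 80))
except ValueError as e:
    print("invalid expand:", e)
```

If this diagnosis is right, checking that the values fed for `x_lens` match the fixed sequence length (100 here) would be the first thing to try.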
Thanks for the answer. I don't have a solution yet.
Issue Type
Feature Request, Others
OS
Windows
onnx2tf version number
1.18.14
onnx version number
1.14.1
onnxruntime version number
1.14.1
onnxsim (onnx_simplifier) version number
0.4.33
tensorflow version number
2.14.1
Download URL for ONNX
https://huggingface.co/alphacep/vosk-model-small-ru/blob/main/am/encoder.onnx
Parameter Replacement JSON
Description
1. research
2.
3. -
4. try the model on Coral (TPU)