hustvl / YOLOP

You Only Look Once for Panoptic Driving Perception (MIR 2022)

ONNX simplification crash (in export_onnx.py) #110

Open sahamitul opened 2 years ago

sahamitul commented 2 years ago

python export_onnx.py --height 640 --width 640

gives this crash during the simplification stage:

```
... simplifying with onnx-simplifier 0.3.7...
Traceback (most recent call last):
  File "export_onnx.py", line 178, in <module>
    model_onnx, check = onnxsim.simplify(model_onnx, check_n=3)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 492, in simplify
    model = fixed_point(model, infer_shapes_and_optimize, constant_folding)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 393, in fixed_point
    x = func_b(x)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 486, in constant_folding
    custom_lib=custom_lib)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 232, in forward_for_node_outputs
    custom_lib=custom_lib)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 198, in forward
    ), sess_options=sess_options, providers=['CPUExecutionProvider'])
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 335, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 370, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node (Mul_948) Op (Mul) [ShapeInferenceError] Incompatible dimensions
```

Any ideas? Thanks!
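(A possible stop-gap rather than a fix: since only the simplification step crashes, the onnxsim call can be made non-fatal so that an unsimplified .onnx file is still produced. Below is a minimal sketch in the spirit of what the traceback shows, not the repo's actual code; the script name and the file name yolop-640-640.onnx are made up for illustration.)

```python
# simplify_or_keep.py -- hypothetical helper, not part of the YOLOP repo.
# Usage: python simplify_or_keep.py yolop-640-640.onnx
import sys

import onnx
import onnxsim  # onnx-simplifier 0.3.x API, matching the traceback above

src = sys.argv[1]
dst = src.replace(".onnx", "-sim.onnx")

model = onnx.load(src)
try:
    # Same call that fails at export_onnx.py line 178 in the traceback.
    model_sim, ok = onnxsim.simplify(model, check_n=3)
    if not ok:
        raise RuntimeError("simplified model failed onnx-simplifier's check")
    onnx.save(model_sim, dst)
    print("saved simplified model to", dst)
except Exception as exc:
    # Keep the unsimplified export usable instead of crashing the whole script.
    print("simplification failed (%s); keeping the unsimplified model at %s" % (exc, src))
```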

Yaoxingtian commented 2 years ago


Have you solved this? I'm running into the same problem.

Yaoxingtian commented 2 years ago

I set up a new environment following the official configuration and the export works without any problem.

lzm2275965881 commented 2 years ago

Running export_onnx.py, I hit this error:

```
2022-04-09 18:10:09.4948181 [E:onnxruntime:, sequential_executor.cc:346 onnxruntime::SequentialExecutor::Execute] Non-zero status code returned while running Mul node. Name:'Mul_1340' Status Message: D:\a_work\1\s\onnxruntime\core/providers/cpu/math/element_wise_ops.h:505 onnxruntime::BroadcastIterator::Append axis == 1 || axis == largest was false. Attempting to broadcast an axis by a dimension other than 1. 3 by 20
```

```
Traceback (most recent call last):
  File "D:/YOLOP-main/export_onnx.py", line 178, in <module>
    model_onnx, check = onnxsim.simplify(model_onnx, check_n=3)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 492, in simplify
    model = fixed_point(model, infer_shapes_and_optimize, constant_folding)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 393, in fixed_point
    x = func_b(x)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 486, in constant_folding
    custom_lib=custom_lib)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 232, in forward_for_node_outputs
    custom_lib=custom_lib)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 216, in forward
    outputs, inputs, run_options=run_options)))
  File "D:\anaconda3\envs\pytorch\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 192, in run
    return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Mul node. Name:'Mul_1340' Status Message: D:\a_work\1\s\onnxruntime\core/providers/cpu/math/element_wise_ops.h:505 onnxruntime::BroadcastIterator::Append axis == 1 || axis == largest was false. Attempting to broadcast an axis by a dimension other than 1. 3 by 20
```

I don't know how to solve this problem. Help!
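(For anyone reading the message itself: the Mul node does an element-wise multiply, and ONNX Runtime follows numpy-style broadcasting, where two sizes on the same axis are only compatible if they are equal or one of them is 1. "3 by 20" means neither condition holds. Below is a tiny numpy illustration of that rule, not the actual YOLOP graph; in an exported detector such a mismatch often comes from a tensor whose shape was fixed for a different input resolution than the one being fed, though I can't confirm that's the cause here.)

```python
import numpy as np

# Broadcasting works when, axis by axis (right-aligned), the sizes are
# equal or one of them is 1.
a = np.ones((1, 3, 20, 20))
b = np.ones((1, 3, 1, 1))
print((a * b).shape)  # (1, 3, 20, 20) -- fine, the 1s are stretched

# This is the situation the onnxruntime error describes: an axis of size 3
# meeting an axis of size 20, with neither equal to 1.
c = np.ones((1, 3, 20, 20))
d = np.ones((1, 3, 3, 20))
try:
    c * d
except ValueError as exc:
    print("broadcast failed:", exc)
```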

niuniandajiba commented 2 years ago

I ran into exactly this kind of error while running export_onnx.py to export the ONNX model in my environment:

```
sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node (Mul_558) Op (Mul) [ShapeInferenceError] Incompatible dimensions
```

Then I tried exporting in another environment and it magically worked.

Hope this helps. You can try running export_onnx.py in a different environment; it might work for you too.

sahamitul commented 2 years ago

Thank you @niuniandajiba, @Yaoxingtian! Let me try...

sahamitul commented 2 years ago

Can the repo owners clarify which versions they used for the OS, onnx, onnx-simplifier, onnxruntime, and anything else that is needed but not made clear in requirements.txt? @Riser6

sahamitul commented 2 years ago

As suggested by @niuniandajiba, I tried:

- onnx 1.9.0
- onnx-simplifier 0.3.6
- onnxruntime 1.7.0
- pytorch 1.7.1+cu102
- torchvision 0.8.2

with Ubuntu 18.04.6 LTS and Python 3.6.9 (Python 3.7.x should be fine too),

and the conversion seems to work.
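(In case it helps anyone compare their setup against this working combination, here is a small sanity-check snippet; it assumes the usual PyPI distribution names (onnx, onnx-simplifier, onnxruntime, torch, torchvision) and just prints the installed versions next to the ones reported above.)

```python
import pkg_resources

# Versions reported to work in this thread (Ubuntu 18.04.6, Python 3.6/3.7).
expected = {
    "onnx": "1.9.0",
    "onnx-simplifier": "0.3.6",
    "onnxruntime": "1.7.0",
    "torch": "1.7.1",       # 1.7.1+cu102 in the original report
    "torchvision": "0.8.2",
}

for name, wanted in expected.items():
    try:
        installed = pkg_resources.get_distribution(name).version
    except pkg_resources.DistributionNotFound:
        installed = "not installed"
    marker = "" if installed.startswith(wanted) else "  <-- differs"
    print("%-16s installed=%-12s thread-reported=%s%s" % (name, installed, wanted, marker))
```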

elitedevbot commented 4 months ago

Hello @sahamitul, I am currently working with the YOLOP model and I want to run it on a BeagleBone AI-64 (TDA4VM). Could you please tell me how to compile the model for the BeagleBone and run it there?