onnx batch inference failed! Non-zero status code returned while running Split node. Name:'Split_30' Status Message: Cannot split using values in 'split' attribute #534
I exported a DETR ONNX model with dynamic batch_size, height, and width using the script test_all.py.
When I tried to run batch inference with this ONNX model, the following error occurred:
2022-09-16 15:46:04.766418294 [E:onnxruntime:, sequential_executor.cc:346 Execute] Non-zero status code returned while running Split node. Name:'Split_30' Status Message: Cannot split using values in 'split' attribute. Axis=0 Input shape={4,3,800,1204} NumOutputs=1 Num entries in 'split' (must equal number of outputs) was 1 Sum of sizes in 'split' (must equal size of selected axis) was
Traceback (most recent call last):
File "/home/xxxx/workspace/project/detection/detr/run_onnx_batch.py", line 161, in
main(args)
File "/home/xxxx/workspace/project/detection/detr/run_onnx_batch.py", line 114, in main
outputs = session.run(None, inp)
File "/home/xxxx/.local/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 192, in run
return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Non-zero status code returned while running Split node. Name:'Split_30' Status Message: Cannot split using values in 'split' attribute. Axis=0 Input shape={4,3,800,1204} NumOutputs=1 Num entries in 'split' (must equal number of outputs) was 1 Sum of sizes in 'split' (must equal size of selected axis) was 1
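The error message says what went wrong: the Split node's `split` attribute was frozen at export time with a single size of 1, but at runtime the selected axis (axis 0, the batch dimension) has length 4, and ONNX requires the split sizes to sum to the axis length. A minimal sketch of that contract, mimicking ONNX Split semantics with NumPy (the helper name `onnx_like_split` is made up for illustration):

```python
import numpy as np

def onnx_like_split(x, split_sizes, axis=0):
    # ONNX Split requires sum(split_sizes) == x.shape[axis];
    # otherwise the runtime rejects the node, as in the error above.
    if sum(split_sizes) != x.shape[axis]:
        raise ValueError(
            f"Cannot split: sum of sizes {sum(split_sizes)} "
            f"!= length {x.shape[axis]} of axis {axis}"
        )
    # Convert section sizes to the cut indices np.split expects.
    idx = np.cumsum(split_sizes)[:-1]
    return np.split(x, idx, axis=axis)

# Traced at batch_size=1, the exporter baked in split_sizes=[1]:
x_batch1 = np.zeros((1, 3, 800, 1204))
print(len(onnx_like_split(x_batch1, [1])))  # 1 output, works

# With a batch of 4, the frozen sizes no longer cover the axis:
x_batch4 = np.zeros((4, 3, 800, 1204))
try:
    onnx_like_split(x_batch4, [1])
except ValueError as e:
    print(e)  # mirrors the Split_30 failure
```

So the model itself is shape-specialized: even though the input was exported with dynamic dimensions, the tracer recorded concrete split sizes inside the graph.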
How can I make this ONNX model run batch inference successfully?
Thanks a lot.