Open SuyueLiu opened 1 year ago
Have you converted the ONNX model to a format that supports batch inference? (where input_shape[0] is in string type)
Yes, I did.
Any updates?
Did you solve this error? Why is it being ignored?
When I run onnx_helper to check the trained ONNX model (generated by the default code), I get an error:

```
Traceback (most recent call last):
  File "onnx_helper.py", line 251, in <module>
    err = handler.check(args.track)
  File "onnx_helper.py", line 166, in check
    batch_result = self.check_batch(test_img)
  File "onnx_helper.py", line 200, in check_batch
    net_out = self.session.run(self.output_names, {self.input_name: blob})[0]
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running Gemm node. Name:'Gemm_185' Status Message: GEMM: Dimension mismatch, W: {512,512} K: 16384 N:512
```

The model input shape is [None, 3, 112, 112], and the onnx_helper blob shape is [32, 3, 112, 112].
Does anyone know how to solve this problem?
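The numbers in the error message suggest what is going wrong. A sketch of the arithmetic (the interpretation is an assumption based on the reported shapes, not something confirmed by the error itself):

```python
# The Gemm weight W is {512, 512}, so the layer expects K = 512 input
# features per sample, but the runtime received K = 16384.
batch_size = 32        # blob shape [32, 3, 112, 112] fed by onnx_helper
expected_k = 512       # in-features implied by the Gemm weight shape
received_k = 16384     # K reported in the error message

# 16384 == 32 * 512: the Reshape/Flatten before the fully connected
# layer appears to fold the batch axis into the feature axis. That is
# the typical symptom of a model exported with a hardcoded reshape
# target (fixed batch size) and then run with a larger batch.
assert received_k == batch_size * expected_k
print(f"{batch_size} samples of {expected_k} features flattened into one row")
```

If the model was exported from PyTorch, re-exporting with `dynamic_axes` in `torch.onnx.export` (e.g. marking axis 0 of the input as a named batch dimension) usually prevents the flatten from swallowing the batch axis; the exact argument names depend on your export script.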