microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

[ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Error while inferencing DLRM onnx model #6319

Open deepeshguru opened 3 years ago

deepeshguru commented 3 years ago

**Describe the bug**
While inferencing a DLRM ONNX model using onnxruntime, I get the error below:

```
InvalidArgument                           Traceback (most recent call last)
in <module>
      5     list_inputs = pickle.load(f)
      6 sess = rt.InferenceSession("/home/deepesh/mlperf/model2/dlrm_s_pytorch.onnx")
----> 7 outputs = sess.run(output_names=["pred"], input_feed={'dense_x': list_inputs[0], 'offsets': list_inputs[1], 'indices': list_inputs[2]})

~/.local/lib/python3.7/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py in run(self, output_names, input_feed, run_options)
    122         output_names = [output.name for output in self._outputs_meta]
    123         try:
--> 124             return self._sess.run(output_names, input_feed, run_options)
    125         except C.EPFail as err:
    126             if self._enable_fallback:

InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running Loop node. Name:'Loop_414' Status Message: Non-zero status code returned while running Gather node. Name:'Gather_420' Status Message: indices element out of data bounds, idx=11395 must be within the inclusive range [-11156,11155]
```

**Urgency**
None.

**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04.5 LTS
- ONNX Runtime installed from (source or binary): binary (`pip install onnxruntime`)
- ONNX Runtime version: 1.6.0
- Python version: 3.7

**To Reproduce**
- Describe steps/code to reproduce the behavior.

```python
import onnxruntime as rt
import pickle

with open("list_inputs.pkl", 'rb') as f:
    list_inputs = pickle.load(f)

sess = rt.InferenceSession("dlrm_s_pytorch.onnx")
outputs = sess.run(output_names=["pred"],
                   input_feed={'dense_x': list_inputs[0],
                               'offsets': list_inputs[1],
                               'indices': list_inputs[2]})
```

- Attach the ONNX model to the issue (where applicable) to expedite investigation.

Please find the model and input in the attachment: [DLRM.zip](https://github.com/microsoft/onnxruntime/files/5800073/DLRM.zip)
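For context on this class of error: ONNX `Gather` requires every index to lie in `[-s, s-1]`, where `s` is the size of the gathered axis (here the embedding-table row count, 11156 per the error message), so an index of 11395 fails validation. A minimal sketch of a pre-flight check for one table is below; `check_indices` is a hypothetical helper, not part of onnxruntime, and the table size is taken from the error text (a real DLRM has one table per categorical feature, each with its own size):

```python
import numpy as np

def check_indices(indices, table_size):
    """Return positions whose index falls outside the ONNX Gather
    bounds [-table_size, table_size - 1] for one embedding table."""
    indices = np.asarray(indices)
    bad = np.where((indices < -table_size) | (indices >= table_size))[0]
    return bad

# Example values, including the offending index from the error message.
indices = np.array([0, 42, 11395, 11155])
bad_positions = check_indices(indices, table_size=11156)
print(bad_positions)  # flags the position holding 11395
```

Running a check like this on each sparse-feature tensor before `sess.run()` distinguishes a malformed input file from a model exported with the wrong table sizes.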
faxu commented 3 years ago

@deepeshguru were you able to figure out the issue or are you still experiencing this error with the latest version of ORT?