onnx / tutorials

Tutorials for creating and using ONNX models
Apache License 2.0

Batch inference with onnx doesn't give detections image by image #180

Open ashnair1 opened 4 years ago

ashnair1 commented 4 years ago

Performing inference on a batch of images via the following command in OnnxCaffe2Import.ipynb works:

outputs = caffe2.python.onnx.backend.run_model(model, [img])

But the detections are all stacked together, so I can't tell which detections belong to which image. For example, if I pass in 3 images, I might receive 20 detections in total without knowing how many belong to the 1st image, how many to the 2nd, and so on.

How can this be addressed?
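One way to regroup a stacked output is to use a per-detection image index, if the model emits one. The sketch below is hypothetical: it assumes the stacked detection array carries the source image's batch index in its first column (whether your exported model provides such a column depends on the detection head; the column layout here is invented for illustration). If no index is available at all, a simpler workaround is to call `run_model` once per image so the outputs never get mixed.

```python
import numpy as np

# Hypothetical stacked output: one row per detection, laid out as
# (image_index, x1, y1, x2, y2, score). This layout is an assumption
# for illustration, not the documented format of any ONNX backend.
stacked = np.array([
    [0, 10, 10, 50, 50, 0.9],
    [0, 20, 20, 60, 60, 0.8],
    [1,  5,  5, 40, 40, 0.7],
    [2,  0,  0, 30, 30, 0.6],
])

def split_by_image(dets, n_images):
    """Group detection rows by the image index stored in column 0,
    returning one array of (x1, y1, x2, y2, score) rows per image."""
    return [dets[dets[:, 0] == i, 1:] for i in range(n_images)]

per_image = split_by_image(stacked, 3)
for i, d in enumerate(per_image):
    print(f"image {i}: {len(d)} detections")
```

Grouping by an index column keeps the batched forward pass intact; looping over single images is slower but needs no assumptions about the output layout.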