danielsc / azureml-workshop-2019

AzureML Workshop for the 2019 Euro Tour
MIT License

[Bug:ONNX-Runtime-Python-Helper] Scoring with ONNX model is very slow compared to original Scikit-Learn model #64

Open CESARDELATORRE opened 4 years ago

CESARDELATORRE commented 4 years ago

Scoring with the exported ONNX model is very slow compared to original Scikit-Learn model:

Making 294 predictions with the exported ONNX model takes about 1.3 seconds, while the original Scikit-Learn model needs only about 0.4 seconds. The exported ONNX model is therefore roughly 3x slower.

294 predictions from the test dataset:
Time for predictions with Scikit-Learn model: --- 0.48 seconds ---
Time for predictions with ONNX model: --- 1.27 seconds ---

Confirmed by Yunsong Bai: "Currently the ONNX inference helper feeds data into onnxruntime one record at a time; we used this mode because errors were found in a previous ORT version when feeding data in batch."

The ONNX inference helper should be fixed to feed data in batch mode by default, since per-record calls pay the session-invocation overhead once per row instead of once per dataset.
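A minimal sketch of the proposed change, assuming the helper wraps `onnxruntime`'s `session.run()`. The model path, input name, and `fake_run` stand-in below are illustrative only, so the sketch runs without an actual ONNX model; with a real session, `run` would be `lambda x: sess.run(None, {sess.get_inputs()[0].name: x})[0]`.

```python
import numpy as np

def score_per_record(run, X):
    # Current helper behaviour: one session.run() call per row.
    return np.concatenate([run(X[i:i + 1]) for i in range(len(X))])

def score_batch(run, X):
    # Proposed behaviour: a single session.run() call for the whole matrix.
    return run(X)

# Stand-in for onnxruntime so the sketch is self-contained. With the real
# library (names here are hypothetical, adapt to the helper's model):
#   import onnxruntime
#   sess = onnxruntime.InferenceSession("model.onnx")
#   run = lambda x: sess.run(None, {sess.get_inputs()[0].name: x})[0]
calls = {"n": 0}
def fake_run(x):
    calls["n"] += 1
    return x.sum(axis=1)

X = np.arange(12, dtype=np.float32).reshape(4, 3)
per_record = score_per_record(fake_run, X)  # 4 run() calls
calls["n"] = 0
batch = score_batch(fake_run, X)            # 1 run() call
assert np.array_equal(per_record, batch)    # same predictions, fewer calls
```

Both paths produce identical predictions; batching just amortizes the per-call overhead, which is where the ~3x gap above comes from.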

CESARDELATORRE commented 4 years ago

I filed a bug here: https://msdata.visualstudio.com/Vienna/_workitems/edit/587876