microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

The example for quantization doesn't work #9069

Open mRaskovic opened 3 years ago

mRaskovic commented 3 years ago

Describe the bug
The example from the readme file for model quantization doesn't work. Location: https://github.com/microsoft/onnxruntime/tree/master/onnxruntime/python/tools/quantization/E2E_example_model/image_classification/cpu

It crashes for the command: python run.py --input_model mobilenetv2-7.onnx --output_model mobilenetv2-7.quant.onnx --calibrate_dataset ./test_images/

The error I'm getting: (error shown in an attached screenshot, not transcribed)

However, the same command works properly for ResNet: python run.py --input_model .\resnet50-v1-13.onnx --output_model .\resnet50-v1-13.quant.onnx --calibrate_dataset ./test_images/

Urgency
Not urgent

System information

To Reproduce
Clone the repository and run the command from the readme file for the specified script.

Expected behavior
It should perform the network quantization and compare inference times between the FP32 and the quantized versions, the same way it works for ResNet.
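For context on what the expected quantization step computes, here is a minimal, self-contained sketch of linear (affine) int8 quantization in plain Python. This is an illustration of the general technique, not the onnxruntime.quantization API; all function names here are hypothetical.

```python
# Illustration of asymmetric linear int8 quantization: map FP32 values
# onto [-128, 127] via a scale and zero-point, then reconstruct them.
# This mirrors the core idea behind the quantized model run.py produces,
# but does not use any onnxruntime code.

def quantize_params(values, qmin=-128, qmax=127):
    """Compute scale and zero-point so the value range maps onto [qmin, qmax]."""
    lo, hi = min(values), max(values)
    lo, hi = min(lo, 0.0), max(hi, 0.0)  # the range must include 0.0 exactly
    scale = (hi - lo) / (qmax - qmin)
    zero_point = round(qmin - lo / scale)
    return scale, zero_point

def quantize(values, scale, zero_point, qmin=-128, qmax=127):
    """Round to the nearest quantized level and clamp to the int8 range."""
    return [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]

def dequantize(qvalues, scale, zero_point):
    """Recover approximate FP32 values from the int8 representation."""
    return [(q - zero_point) * scale for q in qvalues]

weights = [-1.5, -0.2, 0.0, 0.7, 2.1]          # toy FP32 tensor
scale, zp = quantize_params(weights)
q = quantize(weights, scale, zp)
restored = dequantize(q, scale, zp)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

The round-trip error is bounded by one quantization step (the scale), which is why int8 models trade a small accuracy loss for faster inference.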

yufenglee commented 3 years ago

This bug was fixed on master in #8788. Please try our nightly build or the older version 1.7.

stale[bot] commented 2 years ago

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.