microsoft / onnxruntime-inference-examples

Examples for using ONNX Runtime for machine learning inferencing.
MIT License

Remove Q & DQ since Onnx wrapper model has them already #405

Closed HectorSVC closed 3 months ago

HectorSVC commented 3 months ago

Updated according to ONNX Runtime PR https://github.com/microsoft/onnxruntime/pull/20107

The wrapper ONNX model generated from a native QNN context binary already contains Q/DQ nodes when the inputs/outputs are quantized data, so the example application no longer needs to quantize its inputs or dequantize its outputs.
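For illustration, the manual pre/post-processing this change removes looks roughly like the following sketch. The scale and zero-point values here are hypothetical placeholders (in practice they come from the Q/DQ nodes embedded in the wrapper model), and the helper names are ours, not the example's:

```python
import numpy as np

# Hypothetical quantization parameters for illustration only; real values
# are carried by the Q/DQ nodes inside the wrapper ONNX model.
SCALE = 0.0235
ZERO_POINT = 128

def quantize(x, scale=SCALE, zero_point=ZERO_POINT):
    """float32 -> uint8 quantization the example app previously performed."""
    return np.clip(np.round(x / scale) + zero_point, 0, 255).astype(np.uint8)

def dequantize(q, scale=SCALE, zero_point=ZERO_POINT):
    """uint8 -> float32 dequantization the example app previously performed."""
    return (q.astype(np.float32) - zero_point) * scale

# Before this change: the app quantized inputs and dequantized outputs itself.
x = np.array([-1.0, 0.0, 1.0], dtype=np.float32)
q = quantize(x)
x_roundtrip = dequantize(q)
print(np.allclose(x, x_roundtrip, atol=SCALE))  # round-trip within one step

# After this change: the Q/DQ nodes in the wrapper model handle both steps,
# so the app feeds float tensors straight to the session and reads float
# outputs back, with no manual quantize/dequantize code.
```

Since the wrapper model now performs these conversions internally, keeping them in the application would effectively quantize twice.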