microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

olive quantization added #22854

Closed samuel100 closed 2 days ago

samuel100 commented 6 days ago

Adds a blog post on how to use Olive to determine whether quantizing before fine-tuning yields better model quality.
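For context, an Olive run of this kind is driven by a workflow config. The sketch below is an illustrative assumption only: the model path is hypothetical, and the exact pass names and fields should be checked against the Olive documentation and the blog post itself.

```json
{
  "input_model": {
    "type": "HfModel",
    "model_path": "microsoft/phi-2"
  },
  "passes": {
    "quantize": {
      "type": "OnnxDynamicQuantization"
    }
  },
  "output_dir": "models/quantized"
}
```

A config like this would quantize the model first, so the quantized output can then be fine-tuned and compared against the fine-tune-then-quantize ordering for model quality.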