TrojanXu / yolov5-tensorrt

A tensorrt implementation of yolov5: https://github.com/ultralytics/yolov5
Apache License 2.0

Is int8 quantization supported? #42

Open tensorflowt opened 4 years ago

tensorflowt commented 4 years ago

May I ask whether the current project supports INT8 quantization? If so, how? Currently only FP16 and FP32 are supported, right?

TrojanXu commented 4 years ago

Correct, but I think you can do PTQ (post-training quantization) using TensorRT calibration. It is not implemented in this repo and there are no plans to add it in the short term.
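
For anyone who wants to try this themselves, below is a minimal sketch of a TensorRT INT8 entropy calibrator in Python. It is not part of this repo; the batch size, 640x640 input resolution, cache filename, and the preprocessing behind `calib_images` are all placeholder assumptions, not values taken from this project.

```python
import numpy as np
import pycuda.autoinit  # creates a CUDA context
import pycuda.driver as cuda
import tensorrt as trt


class YoloEntropyCalibrator(trt.IInt8EntropyCalibrator2):
    """Feeds preprocessed calibration batches to TensorRT for INT8 PTQ."""

    def __init__(self, calib_images, batch_size=8, cache_file="yolov5_int8.cache"):
        super().__init__()
        # calib_images: list of np.float32 arrays shaped (3, 640, 640),
        # already preprocessed the same way as at inference time (assumption).
        self.images = calib_images
        self.batch_size = batch_size
        self.cache_file = cache_file
        self.index = 0
        # One device buffer large enough for a full batch.
        self.device_input = cuda.mem_alloc(
            self.batch_size * 3 * 640 * 640 * np.float32().nbytes)

    def get_batch_size(self):
        return self.batch_size

    def get_batch(self, names):
        # Returning None tells TensorRT the calibration data is exhausted.
        if self.index + self.batch_size > len(self.images):
            return None
        batch = np.ascontiguousarray(
            np.stack(self.images[self.index:self.index + self.batch_size]))
        cuda.memcpy_htod(self.device_input, batch)
        self.index += self.batch_size
        return [int(self.device_input)]

    def read_calibration_cache(self):
        # Reuse a previous calibration run if the cache file exists.
        try:
            with open(self.cache_file, "rb") as f:
                return f.read()
        except FileNotFoundError:
            return None

    def write_calibration_cache(self, cache):
        with open(self.cache_file, "wb") as f:
            f.write(cache)
```

Hooking it into engine building would then look roughly like this, assuming you already have a builder config from the usual FP16/FP32 path:

```python
config.set_flag(trt.BuilderFlag.INT8)
config.int8_calibrator = YoloEntropyCalibrator(calib_images)
```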