DerryHub / BEVFormer_tensorrt

BEVFormer inference on TensorRT, including INT8 Quantization and Custom TensorRT Plugins (float/half/half2/int8).
Apache License 2.0

qat model #73

Open CodeWorldChanged opened 1 year ago

CodeWorldChanged commented 1 year ago

Can you provide a QAT (quantization-aware training) model? Thanks.

admyxs commented 7 months ago

> Can you provide a QAT (quantization-aware training) model? Thanks.

I have the same question.

I have trained a QAT model, but its NDS and mAP are very poor.

Did you train your model with QAT?
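For context on what QAT changes during training: it inserts fake-quantization ops that round activations and weights to the INT8 grid and immediately dequantize them, so the network learns under quantization error. The sketch below is a minimal plain-Python illustration of symmetric per-tensor INT8 fake quantization; it is not the repo's actual implementation (BEVFormer_tensorrt's quantization pipeline has its own calibration and plugin code), and all names here are hypothetical.

```python
def fake_quant_int8(values, amax):
    """Simulate the symmetric INT8 quantize-dequantize ('fake quant')
    op that QAT inserts so training sees quantization error.
    amax is the calibrated absolute-maximum of the tensor."""
    scale = amax / 127.0
    out = []
    for v in values:
        q = round(v / scale)            # snap to the integer grid
        q = max(-127, min(127, q))      # clamp to the INT8 range
        out.append(q * scale)           # dequantize back to float
    return out

# Toy weights and a per-tensor calibration range (hypothetical values)
weights = [0.5, -1.2, 0.03, 2.0]
amax = max(abs(w) for w in weights)
fq = fake_quant_int8(weights, amax)
```

Each fake-quantized value differs from the original by at most half a quantization step (`amax / 127 / 2`); if QAT accuracy (NDS/mAP) collapses, a common first suspect is a badly calibrated `amax` on a sensitive layer, which makes that step size far too large.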