Xilinx / Vitis-AI

Vitis AI is Xilinx’s development stack for AI inference on Xilinx hardware platforms, including both edge devices and Alveo cards.
https://www.xilinx.com/ai
Apache License 2.0

Yolov8 Vai Quantizer #1328

Open farah-rostom98 opened 1 year ago

farah-rostom98 commented 1 year ago

I used the static quantizer on the YOLOv8 architecture, and the resulting model has changes in the weight shapes that don't make sense. For example, a layer whose original shape is (96, 96, 1, 1) ends up with shape (3,).

Alexiazzf commented 1 year ago

Hi @farah-rostom98 ,

What is the kernel shape? And can you upload your code and log?

farah-rostom98 commented 1 year ago

I tried the VAI ONNX quantizer on the YOLOv8n-seg model.

1. Used `yolo export` to export the model to ONNX format with fp32 weights.
2. Prepared the calibration data reader.
3. Called `vai_q_onnx`.

The code: vai_quantizer.zip

Log (docker image: `xilinx/vitis-ai-pytorch-cpu:latest`):

```
RuntimeError: module compiled against API version 0x10 but this version of numpy is 0xf.
Check the section C-API incompatibility at the Troubleshooting ImportError section at
https://numpy.org/devdocs/user/troubleshooting-importerror.html#c-api-incompatibility
for indications on how to solve this problem.
images
Finding optimal threshold for each tensor using PowerOfTwoMethod.MinMSE algorithm ...
Calibrated and quantized model saved.
```