amd / RyzenAI-SW


YoloV10 on Ryzen #91

Closed · mht-sharma closed this 5 months ago

mht-sharma commented 5 months ago

Description

I was trying to run the YOLOv10 model on Ryzen AI. I quantized the ONNX model with the Vitis AI ONNX Quantizer, using a configuration based on the Vitis AI ONNX Quantizer documentation.
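For context, the documented quantization flow looks roughly like the sketch below. This is illustrative rather than my exact script: the calibration reader, file paths, and the 640x640 input shape are placeholders, and the call follows the `vai_q_onnx.quantize_static` API as described in the documentation.

```python
import numpy as np
import vai_q_onnx
from onnxruntime.quantization import CalibrationDataReader


class RandomCalibrationReader(CalibrationDataReader):
    """Placeholder calibration reader; a real run should feed representative images."""

    def __init__(self, input_name, num_samples=10):
        self.samples = iter(
            [{input_name: np.random.rand(1, 3, 640, 640).astype(np.float32)}
             for _ in range(num_samples)]
        )

    def get_next(self):
        return next(self.samples, None)


vai_q_onnx.quantize_static(
    "yolov10s.onnx",                     # float ONNX model (placeholder path)
    "yolov10s_quantized.onnx",           # quantized output (placeholder path)
    RandomCalibrationReader("images"),   # input tensor name is a placeholder
    quant_format=vai_q_onnx.QuantFormat.QDQ,
    calibrate_method=vai_q_onnx.PowerOfTwoMethod.MinMSE,
    activation_type=vai_q_onnx.QuantType.QUInt8,
    weight_type=vai_q_onnx.QuantType.QInt8,
    enable_dpu=True,
    extra_options={"ActivationSymmetric": True},
)
```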

To my surprise, only 2% of the operators ran on the DPU. Upon checking the vitisai_ep_report.json file, I found that the Conv operators were running on the CPU instead of the DPU.

[Screenshot: vitisai_ep_report.json showing Conv operators assigned to the CPU]
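The report was generated by running the quantized model through ONNX Runtime with the Vitis AI execution provider. A minimal sketch of that step follows; the model path, the location of vaip_config.json, and the 640x640 input shape are assumptions here, not exact values from my setup.

```python
import numpy as np
import onnxruntime as ort

# Create a session with the Vitis AI execution provider; vaip_config.json is the
# EP configuration file shipped with the Ryzen AI installation (path assumed).
session = ort.InferenceSession(
    "yolov10s_quantized.onnx",
    providers=["VitisAIExecutionProvider"],
    provider_options=[{"config_file": "vaip_config.json"}],
)

# Run one dummy inference; the operator-to-device assignment is then summarized
# in the generated vitisai_ep_report.json.
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)  # assumed input shape
outputs = session.run(None, {input_name: dummy})
```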

Models

You can find the quantized models and other artifacts here: Yolov10s model

Note: I only wanted to check whether the model runs on Ryzen AI, not its accuracy, so the model was not quantized with accuracy in mind.

Expected Behavior

I expected most of the operators, in particular Conv, to be offloaded to the DPU. Any help or insights into resolving this issue would be greatly appreciated.

@uday610

uday610 commented 5 months ago

Hi @mht-sharma, looking into this.

uday610 commented 5 months ago

Hi @mht-sharma, the YOLOv10 model is currently not supported.

uday610 commented 5 months ago

Closing it.