zhiqwang / yolort

yolort is a runtime stack for YOLOv5 on specialized accelerators such as TensorRT, LibTorch, ONNX Runtime, TVM and NCNN.
https://zhiqwang.com/yolort
GNU General Public License v3.0

Add qualcomm SNPE compatible ONNX #248

Open hansoullee20 opened 2 years ago

hansoullee20 commented 2 years ago

🚀 The feature

Hello again

Do you offer an ONNX model that can be simplified?

Motivation, pitch

I am trying to use YOLOv5 on Qualcomm SNPE (https://developer.qualcomm.com/sites/default/files/docs/snpe/overview.html) and would need the ONNX model to be compatible.

Alternatives

No response

Additional context

No response

zhiqwang commented 2 years ago

Hi @hansoullee20

Sorry, we do not currently offer an ONNX model that is compatible with Snapdragon Neural Processing Engine.

hansoullee20 commented 2 years ago

@zhiqwang Thank you for the reply. Is there still any way to simplify the ONNX model using onnxsim?

zhiqwang commented 2 years ago

Hi @hansoullee20 ,

We currently place both the pre-processing (the interpolation operator) and the post-processing (NMS) into the ONNX graph. I did a quick check of SNPE's documentation, and it seems that these ops are not well supported there.
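As a rough way to check whether an exported graph still contains ops that an SNPE converter might reject, one could list the operator types in the ONNX file and compare them against a denylist. The sketch below is illustrative only: the `PROBLEM_OPS` set is my guess at commonly problematic ops, not SNPE's official support matrix, and the example op list stands in for a real model.

```python
# Ops that often trip up mobile converters when baked into the graph
# (illustrative list, NOT SNPE's official support matrix).
PROBLEM_OPS = {"NonMaxSuppression", "Resize", "TopK", "NonZero"}

def find_problem_ops(op_types):
    """Return the op types from the graph that appear in the denylist."""
    return sorted(set(op_types) & PROBLEM_OPS)

# With a real exported model, the op types can be collected via the
# `onnx` package:
#   import onnx
#   model = onnx.load("yolov5s.onnx")
#   op_types = [node.op_type for node in model.graph.node]
op_types = ["Conv", "Sigmoid", "Concat", "Resize", "NonMaxSuppression"]
print(find_problem_ops(op_types))  # ['NonMaxSuppression', 'Resize']
```

If this returns an empty list, the graph at least avoids the obvious offenders; anything it flags would need to be stripped from the export or handled outside the model.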

I guess you can use the export.py in ultralytics/yolov5 with

python export.py --weights path/to/your/model.pt --include onnx --simplify --train

to get an SNPE-compatible ONNX model.
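If you already have an exported ONNX file, onnxsim can also be run on it as a separate step instead of passing --simplify at export time. The paths below are placeholders for your own files:

```shell
# Requires the onnx-simplifier package: pip install onnxsim
python -m onnxsim path/to/your/model.onnx path/to/your/model-sim.onnx
```

Either way, it is worth re-checking the simplified graph with SNPE's own converter afterwards, since simplification does not remove unsupported ops, it only folds redundant ones.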

hansoullee20 commented 2 years ago

@zhiqwang Thank you for your kind response.