microsoft / onnxruntime-inference-examples

Examples for using ONNX Runtime for machine learning inferencing.
MIT License

On SNPE example on ARM64 Windows device, snpe-onnx-to-dlc is for x64 only #197

Open thnkim opened 1 year ago

thnkim commented 1 year ago

I'm using ARM64 Windows (Dev Kit 2023).

According to https://github.com/microsoft/onnxruntime-inference-examples/tree/main/c_sharp/Snpe_EP/vgg16_image_classification, the instructions say to install the 'SNPE SDK for Linux' on WSL2. However, snpe-onnx-to-dlc is an x64-only binary, so it cannot be executed on ARM64 WSL2 and the process fails at the step below.

Follow steps 3 through 5 in [Qualcomm's tutorial](https://developer.qualcomm.com/sites/default/files/docs/snpe/tutorial_onnx.html)
These will generate a vgg16.dlc file in $SNPE_ROOT/models/VGG/dlc/

It worked on a Linux x64 system. Is there a way to make WSL run snpe-onnx-to-dlc, e.g. via x64 emulation?

Thank you.
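One quick way to confirm the architecture mismatch described above is to inspect the converter binary with `file` inside WSL2. The exact path below (`bin/x86_64-linux-clang/`) is an assumption based on the usual SNPE SDK layout; adjust it to wherever your SDK is unpacked.

```shell
# Inside WSL2 on the ARM64 device: check what architecture the
# converter binary was built for. $SNPE_ROOT and the bin/ subdirectory
# layout are assumptions; adjust to your SDK install location.
file "$SNPE_ROOT/bin/x86_64-linux-clang/snpe-onnx-to-dlc"
# If this reports an x86-64 ELF executable, it cannot run on an ARM64
# WSL2 kernel without x64 emulation.

# Check the architecture WSL2 itself is running on:
uname -m    # "aarch64" on an ARM64 Windows device, "x86_64" on x64
```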

sang981113 commented 1 year ago

Hello, I am trying the same example code on a Surface Pro 9, which has the same processor and GPU as the Dev Kit 2023, but I failed to run the image classification example. I'm curious whether you managed to get this example working on the Dev Kit 2023. Could you please tell me whether you succeeded?

jywu-msft commented 1 year ago

yes, i think you need to do the conversion on an x64 host, as the conversion tools aren't available for Windows on ARM.
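For anyone hitting this later, the workaround can be sketched as: run the model conversion on an x64 Linux host (or x64 VM), then copy the resulting `.dlc` to the ARM64 device. The directory layout below follows Qualcomm's VGG tutorial, and the `envsetup.sh` invocation and `snpe-onnx-to-dlc` flags are to the best of my knowledge the documented ones, but verify them against your SDK version with `--help` before relying on this.

```shell
# On an x64 Linux host with the SNPE SDK unpacked at $SNPE_ROOT.
# Set up the SNPE environment for the ONNX converter
# ($ONNX_DIR is your onnx Python package directory; an assumption here):
source "$SNPE_ROOT/bin/envsetup.sh" -o "$ONNX_DIR"

# Convert the ONNX model to a DLC file (paths follow the tutorial layout):
snpe-onnx-to-dlc \
  --input_network "$SNPE_ROOT/models/VGG/onnx/vgg16.onnx" \
  --output_path   "$SNPE_ROOT/models/VGG/dlc/vgg16.dlc"

# Then transfer vgg16.dlc to the Windows/ARM64 machine, e.g. with scp,
# and point the SNPE EP example at it.
```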