PUTvision / utterances

:crystal_ball: A lightweight comments widget built on GitHub issues
https://utteranc.es
MIT License

article/raspberry-onnxruntime-openvino/ #2

Open utterances-bot opened 8 months ago

utterances-bot commented 8 months ago

ONNXRuntime inference works well on Raspberry Pi 4 with Intel NCS2: step by step setup with OpenVINO Execution Provider - PUT Vision Lab

https://putvision.github.io/article/raspberry-onnxruntime-openvino/

DanielMontoyaS commented 8 months ago

Hi. Could you help me with something? Which host machine did you use to create the .whl file? I would like to build the wheel for Python 3.10, but I have had problems following the steps on the onnxruntime page.
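For context, onnxruntime's `build.sh` can produce a wheel with the OpenVINO Execution Provider enabled via the `--use_openvino` flag. A minimal sketch of the build steps is below; the OpenVINO install path and the `MYRIAD_FP16` device tag (used for the NCS2 in older onnxruntime releases) are assumptions here, not a confirmed recipe from the article. The wheel is built against whichever Python interpreter is active, so a Python 3.10 environment must be activated first.

```shell
# Hedged sketch: build an onnxruntime wheel with the OpenVINO EP.
# The OpenVINO path and device tag below are example values.

# Make the OpenVINO toolkit visible to the build (example install path).
source /opt/intel/openvino_2022/setupvars.sh

# Activate a Python 3.10 environment so the wheel targets that interpreter.
python3.10 -m venv ort-build-env
source ort-build-env/bin/activate

git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime

# MYRIAD_FP16 targets the NCS2 VPU in older onnxruntime versions;
# check your onnxruntime release's build docs for the current tag.
./build.sh --config Release \
           --build_wheel \
           --use_openvino MYRIAD_FP16 \
           --parallel

# The resulting wheel is placed under build/Linux/Release/dist/
```

Note that building directly on the Raspberry Pi is slow; many people build on an x86 host and only do so for an x86 target, since producing an ARM wheel from x86 requires a separate cross-compilation setup.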