utterances-bot opened 8 months ago
https://putvision.github.io/article/raspberry-onnxruntime-openvino/
Hi. Could you help me with something? Which host machine did you use to create the .whl file? I would like to build one for Python 3.10, but I have run into problems following the steps on the onnxruntime page.
ONNXRuntime inference works well on Raspberry Pi 4 with Intel NCS2: step by step setup with OpenVINO Execution Provider - PUT Vision Lab