RGdevz opened 1 month ago
Do you mean Intel's OpenVINO? Unfortunately, there is no web support there.
Yes, I meant non-web inference, like in Node.js.
I think the current onnxruntime-node only supports the CPU, DirectML, and CUDA execution providers (https://onnxruntime.ai/docs/get-started/with-javascript/node.html).
Feature request
Is it possible to add OpenVINO as a backend?
Motivation
OpenVINO is faster than ONNX Runtime's default CPU execution provider on Intel hardware.
Your contribution