dkurt / openvino_pytorch_layers

How to export PyTorch models with unsupported layers to ONNX and then to Intel OpenVINO
https://github.com/openvinotoolkit/openvino
Apache License 2.0

Question about Standalone Operation #38

Closed AyanKumarBhunia closed 1 year ago

AyanKumarBhunia commented 2 years ago

Once I convert the PyTorch model to ONNX, can I run the converted model (which involves grid_sample) standalone, without any dependency on this openvino_pytorch_layers package?

dkurt commented 2 years ago

Hi! Currently, this repo provides instructions that require using a Python package:

from openvino_extensions import get_extensions_path
from openvino.inference_engine import IECore

ie = IECore()
# Register the custom-layer extensions library for the CPU plugin.
ie.add_extension(get_extensions_path(), 'CPU')

The openvino-extensions package can be installed with pip: pip install openvino-extensions
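
For completeness, here is a slightly fuller sketch of how the extension fits into an Inference Engine workflow; the IR file names model.xml / model.bin are placeholders for your own converted model, not files shipped with this repo:

from openvino_extensions import get_extensions_path
from openvino.inference_engine import IECore
import numpy as np

ie = IECore()
ie.add_extension(get_extensions_path(), 'CPU')  # register custom layers before reading the model

# Placeholder IR files produced by the Model Optimizer.
net = ie.read_network('model.xml', 'model.bin')
exec_net = ie.load_network(net, 'CPU')

# Feed random data of the expected input shape and run inference.
input_name = next(iter(net.input_info))
shape = net.input_info[input_name].input_data.shape
dummy = np.random.rand(*shape).astype(np.float32)
outputs = exec_net.infer({input_name: dummy})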

If you want to use it in C++, you need to compile the extensions into a .so library or embed the extensions code in your application at runtime.

Regarding grid_sample specifically, there may be a way to convert models with this operation using existing OpenVINO ops, but I have not tried it.
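
As a side note, a minimal sketch that I have not verified against this repo's workflow: recent PyTorch versions (1.12+) can export F.grid_sample directly to the ONNX GridSample op at opset 16, which might make the custom extension unnecessary for this particular layer:

import torch
import torch.nn.functional as F

class GridSampleModel(torch.nn.Module):
    def forward(self, x, grid):
        return F.grid_sample(x, grid, align_corners=False)

x = torch.randn(1, 3, 8, 8)     # NCHW input
grid = torch.randn(1, 8, 8, 2)  # sampling grid with values in [-1, 1]
# Opset 16 is the first ONNX opset that defines GridSample.
torch.onnx.export(GridSampleModel(), (x, grid), 'grid_sample.onnx', opset_version=16)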

dkurt commented 2 years ago

@AyanKumarBhunia, which layer are you interested in?