microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

[Build] How to build onnxruntime with openvino statically? #18950

Open xeeetu opened 11 months ago

xeeetu commented 11 months ago

Describe the issue

I managed to build onnxruntime with DirectML statically using the command:

build.bat --config Release --parallel --skip_tests --enable_msvc_static_runtime --use_dml --cmake_generator "Visual Studio 17 2022"

I looked at the onnx_test_runner project settings and configured my own project accordingly. I then decided to go further and build onnxruntime with OpenVINO by adding --use_openvino AUTO:GPU,CPU. The library built successfully, but calling OrtSessionOptions.AppendExecutionProvider_OpenVINO(options) throws an exception:

Unhandled exception at 0x000000007FFA314CCF19 in "project name": Microsoft C++ exception: Ort::Exception at memory location 0x0000000094FEEFE8C0

When configuring this project, the best I could think of was to add every library with "openvino" in its name to Linker -> Input (onnxruntime_providers_openvino.lib, custom_op_openvino_wrapper_library.lib). I don't know how to do this correctly, as I haven't found a test project similar to onnx_test_runner where OpenVINO is linked statically. I would be glad to have any answer.
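As a first debugging step, the unhandled Ort::Exception can be caught so that its message (which usually names the failing provider or missing dependency) is printed instead of crashing the process. Below is a minimal sketch assuming the C++ API from onnxruntime_cxx_api.h; the device string "AUTO:GPU,CPU" is shown only to mirror the build flag above and may need to be a value the struct-based options actually accept.

```cpp
// Sketch only: requires the onnxruntime headers and the static libraries
// produced by the build above; not compilable standalone.
#include <iostream>
#include <onnxruntime_cxx_api.h>

int main() {
    Ort::SessionOptions session_options;

    OrtOpenVINOProviderOptions ov_options{};
    ov_options.device_type = "AUTO:GPU,CPU";  // illustrative, mirrors --use_openvino AUTO:GPU,CPU

    try {
        // This is the call that throws the Ort::Exception in the report above.
        session_options.AppendExecutionProvider_OpenVINO(ov_options);
    } catch (const Ort::Exception& e) {
        // e.what() typically explains why registration failed
        // (e.g. provider not registered, OpenVINO runtime not found).
        std::cerr << "OpenVINO EP registration failed: " << e.what() << std::endl;
        return 1;
    }
    std::cout << "OpenVINO EP registered" << std::endl;
    return 0;
}
```

The exact text of e.what() is usually far more actionable than the raw access-violation-style message from the debugger.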

Urgency

As soon as possible

Target platform

Windows 10

Build script

build.bat --config Release --parallel --skip_tests --enable_msvc_static_runtime --use_dml --cmake_generator "Visual Studio 17 2022" --use_openvino AUTO:GPU,CPU

Error / output

Unhandled exception at 0x000000007FFA314CCF19 in "project name": Microsoft C++ exception: Ort::Exception at memory location 0x0000000094FEEFE8C0

Visual Studio Version

2022

GCC / Compiler Version

No response

xeeetu commented 11 months ago

But when I print the list of available providers, OpenVINO is in the list:

```cpp
std::vector<std::string> providers = Ort::GetAvailableProviders();
std::cout << "Providers onnxruntime:" << std::endl;
for (const auto& provider : providers) {
    std::cout << provider << std::endl;
}
```

Output:

```
Providers onnxruntime:
OpenVINOExecutionProvider
DmlExecutionProvider
CPUExecutionProvider
```

ilya-lavrenov commented 8 months ago

Have you built OpenVINO statically?