microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

Microsoft.ML.OnnxRuntime.OpenVino Encountered unknown exception in Initialize #18152

Open taotaow opened 10 months ago

taotaow commented 10 months ago

Describe the issue

When using Microsoft.ML.OnnxRuntime.OpenVino to run UNet inference, an unknown exception is encountered in Initialize().

To reproduce

1. Use https://github.com/cassiebreviu/StableDiffusion
2. Use the Stable Diffusion 1.5 ONNX model: https://huggingface.co/runwayml/stable-diffusion-v1-5/tree/onnx
3. git clone microsoft/onnxruntime and build the NuGet packages, producing Microsoft.ML.OnnxRuntime.OpenVino and Microsoft.ML.OnnxRuntime.Managed:

build.bat --config RelWithDebInfo --use_openvino GPU_FP32 --build_shared_lib --build_nuget

4. Register the OpenVINO execution provider:

sessionOptions.AppendExecutionProvider_OpenVINO("GPU_FP32");
sessionOptions.AppendExecutionProvider_CPU(0);

5. When constructing the UNet InferenceSession, an unknown exception is encountered:

2023-10-30 14:10:36.8313117 [I:onnxruntime:, inference_session.cc:328 onnxruntime::InferenceSession::ConstructorCommon::::operator ()] Flush-to-zero and denormal-as-zero are off
2023-10-30 14:10:36.9234057 [I:onnxruntime:, inference_session.cc:336 onnxruntime::InferenceSession::ConstructorCommon] Creating and using per session threadpools since use_per_sessionthreads is true
2023-10-30 14:10:36.9294581 [I:onnxruntime:, inference_session.cc:354 onnxruntime::InferenceSession::ConstructorCommon] Dynamic block base set to 0
2023-10-30 14:10:44.6156866 [I:onnxruntime:, inference_session.cc:1400 onnxruntime::InferenceSession::Initialize] Initializing session.
2023-10-30 14:10:46.6811375 [E:onnxruntime:, inference_session.cc:1790 onnxruntime::InferenceSession::Initialize] Encountered unknown exception in Initialize()

6. The problem may be in the code below:

// apply any transformations to the main graph and any subgraphs
ORT_RETURN_IF_ERROR_SESSIONID_(TransformGraph(graph, saving_ort_format));
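For reference, the registration in step 4 can be sketched in C# roughly as follows. This is a minimal sketch, not the reporter's exact code: the model path is a placeholder, and raising the log severity to verbose is an added suggestion, since it often surfaces the underlying error that is otherwise reported only as "unknown exception":

```csharp
using System;
using Microsoft.ML.OnnxRuntime;

var sessionOptions = new SessionOptions();
// Verbose logging helps pinpoint which graph transform throws (assumption: default settings otherwise).
sessionOptions.LogSeverityLevel = OrtLoggingLevel.ORT_LOGGING_LEVEL_VERBOSE;

// Register OpenVINO first so it takes priority; CPU remains the fallback EP.
sessionOptions.AppendExecutionProvider_OpenVINO("GPU_FP32");
sessionOptions.AppendExecutionProvider_CPU(0);

try
{
    // "unet/model.onnx" is a placeholder path for the Stable Diffusion UNet model.
    using var session = new InferenceSession("unet/model.onnx", sessionOptions);
}
catch (OnnxRuntimeException ex)
{
    // The message plus the verbose log should narrow down where Initialize() fails.
    Console.WriteLine(ex.Message);
}
```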

Urgency

No working Microsoft.ML.OnnxRuntime.OpenVino NuGet DLLs are available, so Stable Diffusion inference can't be done from C#.

Platform

Windows

OS Version

Win11

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.16.1 or 1.17

ONNX Runtime API

C#

Architecture

X64

Execution Provider

OpenVINO

Execution Provider Library Version

No response

kimi0230 commented 10 months ago

Hi @taotaow you can try my PR: https://github.com/microsoft/onnxruntime/pull/18075

Modify this file tools/nuget/generate_nuspec_for_native_nuget.py

(Screenshot: 2023-10-30, 6:15 PM)

taotaow commented 10 months ago

Thank you. I'm using OpenVINO 2023.1 too, and I had already modified that file; otherwise I couldn't generate Microsoft.ML.OnnxRuntime.OpenVino.dll. I added logs in inference_session.cc, and the exception occurs in ORT_RETURN_IF_ERROR_SESSIONID_(TransformGraph(graph, saving_ort_format));

kimi0230 commented 10 months ago

Sorry, I didn't notice you were using Stable Diffusion as well. I encountered the same issue as you did. ONNX Runtime Version: 1.16.0 and 1.17.

I had tried Object detection with YOLOv3 in C# using OpenVINO Execution Provider before, and it was working.

lifang-zhang commented 9 months ago

I encountered the same issue, [E:onnxruntime:, inference_session.cc:1803 Initialize] Encountered unknown exception in Initialize(), with the model candy.onnx using onnxruntime v1.16.3 and OpenVINO 2023.

OpenVINO 2022 encountered the same issue with the following error:

[OpenVINO-EP]  Exception while Loading Network for graph: OpenVINOExecutionProvider_OpenVINO-EP-subgraph_2_1[ UNEXPECTED ] CPU plug-in doesn't support Parameter operation with dynamic rank. Operation name: SpatialZeroPadding_21
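The error above points at an input with dynamic rank, which the OpenVINO CPU plug-in rejects. One possible workaround (an assumption on my part, not something verified in this thread) is to pin the model's free dimensions before session creation so the plug-in only sees static shapes. A sketch in C#, where the dimension names and sizes are hypothetical and must match the symbolic dims actually present in the model:

```csharp
using Microsoft.ML.OnnxRuntime;

var options = new SessionOptions();

// Hypothetical symbolic dimension names ("batch", "height", "width"); check the
// model's input metadata for the real names before using overrides like these.
options.AddFreeDimensionOverrideByName("batch", 1);
options.AddFreeDimensionOverrideByName("height", 512);
options.AddFreeDimensionOverrideByName("width", 512);

options.AppendExecutionProvider_OpenVINO("GPU_FP32");

// "model.onnx" is a placeholder path.
using var session = new InferenceSession("model.onnx", options);
```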

decadance-dance commented 5 months ago

I've faced the same error when using an ONNX + OpenVINO backend in Triton.