microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

Expose DirectML to C# #4941

Closed jbrownkramer closed 4 years ago

jbrownkramer commented 4 years ago

Is your feature request related to a problem? Please describe.
I need to be able to run inference on an integrated Intel GPU from C#.

System information
Microsoft.ML.OnnxRuntime.DirectML v1.4.0 NuGet package.

Describe the solution you'd like
There seems to be no way to select the DirectML provider from the C# API, even when using the DirectML build. I would like that to be exposed.

Describe alternatives you've considered
I have considered using the OpenVINO provider, which apparently added C# support recently. However, there is no NuGet package for it, and I have not yet made it through the installation and build process to see whether it works.

dashesy commented 4 years ago

issue #4876 related

hariharans29 commented 4 years ago

> issue #4876 related

@dashesy - Not sure these are completely related. The issue here is that there is a supported NuGet for DML (Microsoft.ML.OnnxRuntime.DirectML), but the C# API lacks support for creating sessions that can leverage the binary shipped in that NuGet. #4876 seems to be a request for a separate NuGet for each EP supported in ORT.

Alvaromah commented 4 years ago

Trying to use the DML provider in C# with `options.AppendExecutionProvider_Dml()` gives the following error:

`System.EntryPointNotFoundException: Unable to find an entry point named 'OrtSessionOptionsAppendExecutionProvider_Dml' in DLL 'onnxruntime'.`

What am I missing?

hariharans29 commented 4 years ago

You need to build onnxruntime with DML enabled (`--use_dml`). Please refer to the steps in Build.md.

Alvaromah commented 4 years ago

@hariharans29 - Thanks for your quick response. I did build onnxruntime with the `--use_dml` option, but I'm still getting the same error. Steps taken:

  1. build.bat --use_dml
  2. Open and build /onnxruntime/csharp/OnnxRuntime.CSharp.sln
  3. Open and execute sample Microsoft.ML.OnnxRuntime.InferenceSample with AppendExecutionProvider_Dml() option
  4. Result: `System.EntryPointNotFoundException`

Alvaromah commented 4 years ago

After some research, it seems to be a problem with the name of the exported function: onnxruntime.dll exports `OrtSessionOptionsAppendExecutionProvider_DML()`, but the C# code imports `OrtSessionOptionsAppendExecutionProvider_Dml` (the casing differs).

After fixing the typo, I get the following error:

`Exception(1) tid(4990) 80004005 Unspecified error [onnxruntime::CreateExecutionProviderFactory_DML(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&d3d12_device)))]`
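For illustration, the case mismatch described above shows up in the managed declaration: the `EntryPoint` must match the native export exactly, character for character. The sketch below is an assumption modeled on the shape of the ORT C API (it returns a status pointer and takes a session-options pointer plus a device id), not a copy of the actual internal binding:

```csharp
using System;
using System.Runtime.InteropServices;

static class NativeMethodsSketch
{
    // EntryPoint must be "OrtSessionOptionsAppendExecutionProvider_DML"
    // (all-caps "DML") to match the native export; "_Dml" fails with
    // EntryPointNotFoundException. Signature is an illustrative assumption.
    [DllImport("onnxruntime", EntryPoint = "OrtSessionOptionsAppendExecutionProvider_DML")]
    public static extern IntPtr OrtSessionOptionsAppendExecutionProvider_DML(
        IntPtr sessionOptions, int deviceId);
}
```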

hariharans29 commented 4 years ago

Oops, you're right, thank you. I'll fix the typo.

@fdwr - Any idea why the DML EP doesn't get initialized properly?

Alvaromah commented 4 years ago

It's working now. It was probably a problem with my device adapter. Thanks!

hariharans29 commented 4 years ago

Great, thanks. Did you pass in a device id of 0?

hariharans29 commented 4 years ago

Actually - I get the same exception when I try on my machine. Did you do anything to solve the problem with the device adapter?

Alvaromah commented 4 years ago

I'm currently using the onnxruntime.dll from package Microsoft.ML.OnnxRuntime.DirectML and it works for me. I'm passing device id 0.
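For reference, a minimal sketch of the working setup described here, assuming the Microsoft.ML.OnnxRuntime.DirectML package and a hypothetical model path:

```csharp
using Microsoft.ML.OnnxRuntime;

// Select the DirectML execution provider explicitly with device id 0
// (the default adapter), then create the session from those options.
var options = new SessionOptions();
options.AppendExecutionProvider_DML(0);
using var session = new InferenceSession("model.onnx", options);  // hypothetical path
```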

HydeNaut commented 4 years ago

I've tried using the Microsoft.ML.OnnxRuntime.DirectML 1.4.0 NuGet and I don't see the DML option available. Should it be there at the moment, or will it be in the next update?

hariharans29 commented 4 years ago

It was added after 1.4.0 was released, so it will be in the upcoming release. Alternatively, you can build from source following the steps listed in the discussion above.

elephantpanda commented 1 year ago

Hi, I'm also using the onnxruntime.dll from the Microsoft.ML.OnnxRuntime.DirectML package

and have set:

```csharp
SessionOptions so = new SessionOptions();
so.AppendExecutionProvider_DML();
session = new InferenceSession(filePath, so);
```

but I'm getting:

`onnxruntime.dll!00007FFF23355183: (caller: 00007FFF233545CF) Exception(1) tid(2fc4) 80004002 No such interface supported`

Any ideas? I have intel UHD graphics, DirectX 12 supported and 64bit OS.

fdwr commented 1 year ago

@pauldog Can you confirm in the debugger (e.g. Visual Studio's native modules window) that you have the redist version of DirectML.dll being loaded in the same path as your executable and/or onnxruntime.dll (and not the older windows\system32\directml.dll version)?
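Outside a debugger, one way to check which copy was resolved is to enumerate the process's loaded native modules from managed code. This is a diagnostic sketch (not something from this thread), assuming a Windows process that has already loaded DirectML:

```csharp
using System;
using System.Diagnostics;

// Print the full path of every loaded DirectML.dll, so you can see whether
// the app-local redist copy or windows\system32\DirectML.dll was picked up.
foreach (ProcessModule m in Process.GetCurrentProcess().Modules)
{
    if (string.Equals(m.ModuleName, "DirectML.dll", StringComparison.OrdinalIgnoreCase))
        Console.WriteLine(m.FileName);
}
```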

elephantpanda commented 1 year ago

@fdwr Hello, I solved this problem. I was using Unity and I just had to put the DirectML.dll in the assets folder and set it to "load at start". Now DirectML works in the editor. Also when I build it, I just put the DirectML.dll in the build folder. So it all works nicely.

I just got a Shadow PC which has the equivalent of a NVidia 1080 GTX, so now I can run it in the cloud and it's running very fast. So that's nice. 🙂

Unity has its own "Barracuda" package to run ONNX models, but unfortunately it can't run many ONNX files.