microsoft / onnxruntime

ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

[WinML] [C++/WinRT] Clarify how to share Ort::Env environments with WinRT/WinML instances #4971

Open wbudd opened 4 years ago

wbudd commented 4 years ago

This is actually a cross-post of an issue I posted over at WinML, but I thought it might be worth asking here too.

According to WinML documentation, the NuGet WinML solution provides "Direct access to the onnxruntime.dll".

However, all ONNX Runtime functionality requires the creation of an Ort::Env environment instance as its first order of business, and apparently only one such instance can be created per process.

A consequence of that seems to be that if I have WinML inference tasks running in thread A through the WinRT API, I am unable to, say, look up ONNX input/output tensor names/dimensions through the ONNX Runtime API in thread B, or vice versa, given that I'm neither able to reference WinML's internal Ort::Env (or can I?), nor able to pass my own Ort::Env instance to WinML constructors (or can I?). Attempting to use separate instances anyway results in the following error, thrown either by WinML or by my code, depending simply on which side creates its instance first:

Only one instance of LoggingManager created with InstanceType::Default can exist at any point in time.

Is there any way to share the same onnxruntime.dll with WinRT/WinML, for example by accessing the WinML backend through the ONNX Runtime API instead?
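To make the failure mode concrete, here is a minimal, self-contained sketch of the kind of process-wide guard that produces this error. This is an illustrative mock, not ONNX Runtime's actual source; the class and member names are invented for the example.

```cpp
#include <stdexcept>

// Illustrative mock (NOT ONNX Runtime source): a process-wide "default"
// logging manager that refuses to be created twice, mirroring the error
// message quoted above.
class LoggingManager {
public:
    enum class InstanceType { Default, Temporal };

    explicit LoggingManager(InstanceType type) : type_(type) {
        if (type_ == InstanceType::Default) {
            if (default_exists_)
                throw std::logic_error(
                    "Only one instance of LoggingManager created with "
                    "InstanceType::Default can exist at any point in time.");
            default_exists_ = true;
        }
    }

    ~LoggingManager() {
        // Releasing the default instance allows a new one to be created.
        if (type_ == InstanceType::Default) default_exists_ = false;
    }

private:
    inline static bool default_exists_ = false;  // one per process
    InstanceType type_;
};
```

Under this scheme, whichever side (WinML or the application) constructs its default-type instance first wins, and the second construction throws, which matches the behavior described above.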

pranavsharma commented 4 years ago

There was a bug in OrtEnv creation that caused the error you mentioned. It has been fixed and will be available as part of the upcoming 1.5 release.

wbudd commented 4 years ago

Thanks! Some clarification would be nice, though. Does this mean that the statement "OrtEnv should be created only once, for each process" is inaccurate (or no longer accurate)? If so, does that mean that within the same process, an internal Ort::Env inside the WinML library can now live alongside another Ort::Env in an application that also calls ONNX Runtime directly?

pranavsharma commented 4 years ago

CreateEnv will always return the same instance of OrtEnv, no matter how many times you call it. A bug prevented users from calling CreateEnv more than once; this bug has been fixed. I'm not sure which version of WinML you're using, so I can't comment on that, but if you're using a version that calls CreateEnv, you should get the behavior described above. cc @martinb35 to comment on the WinML API.
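The intended semantics can be sketched in a few lines of plain C++ (an illustrative stand-in, not ORT's actual implementation; `CreateEnvOnce` and `OrtEnvMock` are invented names for the example):

```cpp
// Illustrative sketch (NOT the actual ORT implementation): a CreateEnv-style
// factory that hands back the same process-wide instance on every call.
struct OrtEnvMock {
    int log_level;
};

OrtEnvMock* CreateEnvOnce(int log_level) {
    // The first caller's settings win; all later calls reuse the
    // already-initialized instance and ignore their arguments.
    static OrtEnvMock env{log_level};
    return &env;
}
```

So two libraries in the same process that both "create" the environment end up sharing one instance, which is why the fix makes the WinML-plus-direct-ORT scenario workable in principle.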

stale[bot] commented 4 years ago

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

wbudd commented 4 years ago

cc @martinb35 to comment on the WinML api.

I wouldn't mind a comment here @martinb35 — before the stale bot sweeps yet another issue under the rug.

deischi commented 3 years ago

That CreateEnv should be called only once per process is not very well documented (should I create a separate issue for that?).

From the interface it seems that you can have multiple environments. In most cases this probably doesn't matter, but if you, for example, want to use the logging functions or configure a global thread pool, you might only notice the problem very late.

Updating the documentation (header file) would be helpful.

martinb35 commented 3 years ago

@wbudd - you can see the Windows ML call to ORT::CreateEnv here, which is the same path that @pranavsharma mentioned, so it should return the same instance. Having said that, our primary use cases are either all through the Windows ML interface or all through the ONNX Runtime interface, and we haven't designed/tested (yet) for the use case where you use both, as you're describing.

@pranavsharma - did you address the comment from @deischi about documenting how to use CreateEnv?

wbudd commented 3 years ago

@martinb35 Thanks for confirming that it should be the same instance.

In any case, I came to the conclusion that using the Microsoft.ML.OnnxRuntime.DirectML NuGet package instead of the WinML package makes a lot more sense when you also need ONNX Runtime functionality beyond what WinML offers, so that's what I recommend to anyone who finds themselves in a similar situation.
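For anyone following this route, the package swap is just a project-file change. A hedged sketch of the reference (the version number is a placeholder; pin to whatever release you have validated on NuGet):

```xml
<!-- In your .csproj / .vcxproj (packages.config users: equivalent <package> entry). -->
<!-- Replaces the WinML NuGet package with the DirectML-enabled ONNX Runtime one. -->
<ItemGroup>
  <!-- Version is a placeholder; check NuGet for the current release. -->
  <PackageReference Include="Microsoft.ML.OnnxRuntime.DirectML" Version="1.*" />
</ItemGroup>
```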

smk2007 commented 3 years ago

@wbudd Yep, it makes more sense to use the Microsoft.ML.OnnxRuntime.* packages when you "need ONNX Runtime functionality beyond what WinML offers"!

However, looking at your original question, it seemed like you wanted to "look up ONNX input/output tensor names/dimensions."

You should be able to look up input/output tensors with WinML's LearningModel.InputFeatures and LearningModel.OutputFeatures APIs. Were these not sufficient for you?