Open eladmaimoni opened 3 years ago
@elad8a for Windows we currently offer WinML as the inference path. With WinML you hand off the ONNX model to the underlying libraries and don't really have control over the model optimizations, but you can choose where the model runs with one of the options below:
LearningModelDeviceKind::Default
LearningModelDeviceKind::Cpu
LearningModelDeviceKind::DirectX
LearningModelDeviceKind::DirectXHighPerformance,
LearningModelDeviceKind::DirectXMinPower
@kiritigowda like I said, I need OpenCL interoperability. WinML is based on DirectML, which is in turn based on DirectX 12.
AMD does not offer any interoperability between OpenCL and DirectX 12, so I can't use this for my needs.
Is there a reason why MIOpen, which is OpenCL-based, is not exposed to Windows clients?
@eladmaimoni Apologies for the lack of response. Do you still need assistance with this ticket? Thanks!
I am very confused by the various libraries offered by AMD.
So basically: if I want to run inference on Windows with OpenCL interoperability, what are my options? What path should I take?
Thanks