intel / onnxruntime

ONNX Runtime: cross-platform, high performance scoring engine for ML models
MIT License

Enabled device_type AUTO:GPU.1,CPU in OVEP #389

Open ankitm3k opened 2 months ago

ankitm3k commented 2 months ago

Description

This PR adds new options to enable devices during inference runtime for machines having an iGPU (GPU.0) and a dGPU (GPU.1).

Motivation and Context

We are now able to run inference with device_type set to AUTO:GPU.1,CPU, as well as further combinations.
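To illustrate the device_type format this PR enables, here is a minimal, self-contained sketch of how a string such as "AUTO:GPU.1,CPU" decomposes into a selection policy and an ordered device list. The helper name `parse_device_type` is hypothetical and for illustration only; it is not part of the OVEP codebase.

```python
def parse_device_type(device_type: str):
    """Split an OVEP-style device_type string like 'AUTO:GPU.1,CPU'
    into (policy, [devices]). A bare device such as 'CPU' has no policy."""
    if ":" in device_type:
        policy, devices = device_type.split(":", 1)
        return policy, devices.split(",")
    return None, [device_type]

# AUTO tries the listed devices in order: dGPU (GPU.1) first, CPU as fallback.
print(parse_device_type("AUTO:GPU.1,CPU"))   # ('AUTO', ['GPU.1', 'CPU'])
print(parse_device_type("CPU"))              # (None, ['CPU'])
```

In practice the string would be passed as the `device_type` entry of the OpenVINO Execution Provider options when creating an ONNX Runtime inference session.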

sfatimar commented 1 month ago

Please close this PR if it is rejected.

sfatimar commented 1 month ago

Please rebase the branch; the PR shows multiple commits that are not part of the fix.

ankitm3k commented 1 month ago

Rebased the changes.