intel / onnxruntime

ONNX Runtime: cross-platform, high-performance scoring engine for ML models
MIT License

Update: Enabled device_type AUTO:GPU.1,CPU in OVEP #399

Closed ankitm3k closed 1 month ago

ankitm3k commented 1 month ago

Description

This PR adds new device_type options to enable device selection at inference runtime on machines that have both an iGPU (GPU.0) and a dGPU (GPU.1).

Motivation and Context

Inference can now be run with device_type set to AUTO:GPU.1,CPU, as well as other such device combinations.
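A minimal sketch of how the new option would be passed through ONNX Runtime's Python API. The model path `model.onnx` is a placeholder, and actually creating the session requires the onnxruntime-openvino package on a machine with the relevant devices, so the session call is shown but commented out:

```python
# Selecting devices via the OpenVINO Execution Provider (OVEP).
providers = ["OpenVINOExecutionProvider"]

# AUTO:GPU.1,CPU asks OpenVINO's AUTO plugin to prefer the discrete
# GPU (GPU.1), falling back to CPU; GPU.0 would be the integrated GPU.
provider_options = [{"device_type": "AUTO:GPU.1,CPU"}]

print(provider_options[0]["device_type"])  # AUTO:GPU.1,CPU

# With onnxruntime-openvino installed, the session is created as:
# import onnxruntime as ort
# session = ort.InferenceSession(
#     "model.onnx",  # placeholder model path
#     providers=providers,
#     provider_options=provider_options,
# )
```

The `providers` and `provider_options` lists are positionally matched, so the options dict applies to the OpenVINO EP entry.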

sfatimar commented 1 month ago

LGTM