microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

[Documentation] Execution provider strings #22101

Open thewh1teagle opened 1 week ago

thewh1teagle commented 1 week ago

Describe the documentation issue

It's not clear what the execution provider strings should be in Python, e.g. when I want to enable DirectML or CoreML. I can see the providers listed at https://onnxruntime.ai/docs/execution-providers/ but I don't see any mention of the exact string value that should be passed, e.g. CUDAExecutionProvider.
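For illustration, a minimal sketch of where such a string would go in Python (model.onnx is a placeholder path, and this assumes an onnxruntime build that includes the DirectML provider):

import onnxruntime as ort

# "model.onnx" is a placeholder; DmlExecutionProvider is only present in
# DirectML-enabled builds, so CPUExecutionProvider is listed as a fallback.
session = ort.InferenceSession(
    "model.onnx",
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)
print(session.get_providers())  # providers actually registered, in priority order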

Page / URL

https://onnxruntime.ai/docs/api/python/api_summary.html#load-and-run-a-model

fdwr commented 1 week ago

https://github.com/microsoft/onnxruntime/blob/main/include/onnxruntime/core/graph/constants.h#L32-L55

constexpr const char* kCpuExecutionProvider = "CPUExecutionProvider";
constexpr const char* kCudaExecutionProvider = "CUDAExecutionProvider";
constexpr const char* kCudaNHWCExecutionProvider = "CUDANHWCExecutionProvider";
constexpr const char* kDnnlExecutionProvider = "DnnlExecutionProvider";
constexpr const char* kOpenVINOExecutionProvider = "OpenVINOExecutionProvider";
constexpr const char* kVitisAIExecutionProvider = "VitisAIExecutionProvider";
constexpr const char* kTensorrtExecutionProvider = "TensorrtExecutionProvider";
constexpr const char* kNnapiExecutionProvider = "NnapiExecutionProvider";
constexpr const char* kQnnExecutionProvider = "QNNExecutionProvider";
constexpr const char* kRknpuExecutionProvider = "RknpuExecutionProvider";
constexpr const char* kDmlExecutionProvider = "DmlExecutionProvider";
constexpr const char* kMIGraphXExecutionProvider = "MIGraphXExecutionProvider";
constexpr const char* kAclExecutionProvider = "ACLExecutionProvider";
constexpr const char* kArmNNExecutionProvider = "ArmNNExecutionProvider";
constexpr const char* kRocmExecutionProvider = "ROCMExecutionProvider";
constexpr const char* kCoreMLExecutionProvider = "CoreMLExecutionProvider";
constexpr const char* kJsExecutionProvider = "JsExecutionProvider";
constexpr const char* kSnpeExecutionProvider = "SNPEExecutionProvider";
constexpr const char* kTvmExecutionProvider = "TvmExecutionProvider";
constexpr const char* kXnnpackExecutionProvider = "XnnpackExecutionProvider";
constexpr const char* kWebNNExecutionProvider = "WebNNExecutionProvider";
constexpr const char* kCannExecutionProvider = "CANNExecutionProvider";
constexpr const char* kAzureExecutionProvider = "AzureExecutionProvider";
constexpr const char* kVSINPUExecutionProvider = "VSINPUExecutionProvider";
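In the Python API these strings go in the providers argument of InferenceSession, optionally as a (name, options) pair for provider-specific settings. A sketch, assuming a CUDA-enabled build and a placeholder model.onnx:

import onnxruntime as ort

# A provider can be given either as a bare string or as a
# (provider name, options dict) pair; device_id is a CUDA EP option.
session = ort.InferenceSession(
    "model.onnx",
    providers=[
        ("CUDAExecutionProvider", {"device_id": 0}),
        "CPUExecutionProvider",
    ],
)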
thewh1teagle commented 1 week ago

https://github.com/microsoft/onnxruntime/blob/main/include/onnxruntime/core/graph/constants.h#L32-L55

Can you add this link to the Python docs? Thanks :)

fdwr commented 1 week ago

Can you add this link to the Python docs?

I'm not sure who is in charge of Python documentation, but maybe @natke would know, or maybe @scottmckay would know from the CoreML angle.

jywu-msft commented 1 week ago

https://onnxruntime.ai/docs/api/python/api_summary.html#onnxruntime.get_all_providers

>>> import onnxruntime
>>> onnxruntime.get_all_providers()
['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'MIGraphXExecutionProvider', 'ROCMExecutionProvider', 'OpenVINOExecutionProvider', 'DnnlExecutionProvider', 'TvmExecutionProvider', 'VitisAIExecutionProvider', 'QNNExecutionProvider', 'NnapiExecutionProvider', 'VSINPUExecutionProvider', 'JsExecutionProvider', 'CoreMLExecutionProvider', 'ArmNNExecutionProvider', 'ACLExecutionProvider', 'DmlExecutionProvider', 'RknpuExecutionProvider', 'WebNNExecutionProvider', 'XnnpackExecutionProvider', 'CANNExecutionProvider', 'AzureExecutionProvider', 'CPUExecutionProvider']

https://onnxruntime.ai/docs/api/python/api_summary.html#onnxruntime.get_available_providers

>>> import onnxruntime
>>> onnxruntime.get_available_providers()
['DmlExecutionProvider', 'CPUExecutionProvider']
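A hedged example of combining this with session creation (model.onnx is a placeholder), requesting DirectML only when the running build actually exposes it:

import onnxruntime as ort

available = ort.get_available_providers()
# Prefer DmlExecutionProvider if this build provides it; always keep CPU as fallback.
providers = (["DmlExecutionProvider"] if "DmlExecutionProvider" in available else []) + ["CPUExecutionProvider"]
session = ort.InferenceSession("model.onnx", providers=providers)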