microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

I can't use ROCMExecutionProvider #22171

Open baochangying opened 3 days ago

baochangying commented 3 days ago

Describe the bug

I can't use ROCMExecutionProvider

System information

OS Platform and Distribution: Arch Linux core 6.10.10

ONNX version: 1.19.2

Python version: 3.12

Reproduction instructions

```python
providers = [
    (
        "ROCMExecutionProvider",
        {
            "device_id": torch.cuda.current_device(),
            "user_compute_stream": str(torch.cuda.current_stream().cuda_stream),
        },
    )
]
session_options = ort.SessionOptions()
self.session = ort.InferenceSession(self.path, sess_options=session_options, providers=providers)
```

```
lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:70: UserWarning: Specified provider 'ROCMExecutionProvider' is not in available provider names.Available providers: 'TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider'
```

The warning comes from check_and_normalize_provider_args(): its available_provider_names parameter does not include ROCMExecutionProvider.
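For illustration, here is a minimal sketch of the filtering behavior described above: requested providers not present in the available list are dropped with a warning. The helper name `filter_providers` is hypothetical, not onnxruntime's actual implementation.

```python
import warnings


def filter_providers(requested, available):
    # Hypothetical sketch of what check_and_normalize_provider_args does:
    # keep only requested providers that are actually available, and warn
    # (as a UserWarning) about the rest.
    kept = []
    for provider in requested:
        # A provider may be a bare name or a (name, options) tuple.
        name = provider[0] if isinstance(provider, tuple) else provider
        if name in available:
            kept.append(provider)
        else:
            warnings.warn(
                f"Specified provider '{name}' is not in available provider names."
            )
    return kept


# With the providers reported in the warning, ROCMExecutionProvider is filtered out:
available = ["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"]
print(filter_providers([("ROCMExecutionProvider", {"device_id": 0})], available))  # []
```

This is why the session silently falls back: the ROCm provider never survives the normalization step when the installed wheel was built without ROCm support.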

tianleiwu commented 13 hours ago

You can install 1.18 from https://repo.radeon.com/rocm/manylinux/rocm-rel-6.2/. The installation guide is here: https://rocm.docs.amd.com/projects/radeon/en/latest/docs/install/native_linux/install-onnx.html

If you want to use the latest version, you will need to build from source. See https://onnxruntime.ai/docs/execution-providers/ROCm-ExecutionProvider.html for details.
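As a setup sketch (assuming the package name and wheel index from AMD's guide linked above; verify against that page before running):

```shell
# Remove the CPU/CUDA wheel first so it does not shadow the ROCm build.
pip3 uninstall -y onnxruntime

# Install the ROCm build of onnxruntime from AMD's wheel index
# (package name assumed per AMD's install guide).
pip3 install onnxruntime-rocm -f https://repo.radeon.com/rocm/manylinux/rocm-rel-6.2/

# Confirm the ROCm provider is now exposed.
python3 -c "import onnxruntime; print(onnxruntime.get_available_providers())"
```

If the last command does not list ROCMExecutionProvider, the installed wheel was built without ROCm support and the original warning will reappear.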