xiabo123 opened this issue 1 year ago (status: Open)
Not yet.
According to https://onnxruntime.ai/docs/execution-providers/ROCm-ExecutionProvider.html, onnxruntime itself ships a ROCm execution provider.
If you are interested in onnxruntime's Python API, you can add 'ROCMExecutionProvider' to the provider list in onnxruntime/wrapper.py, for example:
import onnxruntime as ort

model_path = '<path to model>'
# Prefer the ROCm provider and fall back to CPU.
providers = [
    'ROCMExecutionProvider',
    'CPUExecutionProvider',
]
session = ort.InferenceSession(model_path, providers=providers)
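Note that the ROCm provider only appears if the installed onnxruntime package was built with ROCm support. As a minimal sketch using only the public onnxruntime API (the fallback logic here is illustrative, not part of mmdeploy), you can check which providers are available and fall back to CPU if ROCm is missing:

import onnxruntime as ort

# Lists the providers compiled into the installed onnxruntime package,
# e.g. ['ROCMExecutionProvider', 'CPUExecutionProvider'] for a ROCm build.
available = ort.get_available_providers()

# Use ROCm when present, otherwise run on CPU only.
providers = ['ROCMExecutionProvider'] if 'ROCMExecutionProvider' in available else []
providers.append('CPUExecutionProvider')
session = ort.InferenceSession('<path to model>', providers=providers)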
@lvhan028 May I ask about compiling the SDK? mmcv already supports ROCm; do you have any suggestions on what to modify so the SDK build supports it as well?
Motivation
The onnxruntime inference engine supports the ROCm environment; can mmdeploy support it as well?
Related resources
No response
Additional context
No response