open-mmlab / mmdeploy

OpenMMLab Model Deployment Framework
https://mmdeploy.readthedocs.io/en/latest/
Apache License 2.0

[Feature] The onnxruntime inference engine supports the rocm environment, but can mmdeploy also support it? #1894

Open xiabo123 opened 1 year ago

xiabo123 commented 1 year ago

Motivation

The onnxruntime inference engine supports the rocm environment, but can mmdeploy also support it?

Related resources

No response

Additional context

No response

lvhan028 commented 1 year ago

Not yet. According to https://onnxruntime.ai/docs/execution-providers/ROCm-ExecutionProvider.html, if you are interested in onnxruntime's Python API, you can add 'ROCmExecutionProvider' to the provider list in onnxruntime/wrapper.py:

import onnxruntime as ort

model_path = '<path to model>'

# Try the ROCm execution provider first; onnxruntime falls back to the
# next entry (CPU) if a provider is unavailable in this build.
providers = [
    'ROCmExecutionProvider',
    'CPUExecutionProvider',
]

session = ort.InferenceSession(model_path, providers=providers)
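Since only ROCm-enabled onnxruntime builds expose 'ROCmExecutionProvider', a defensive variant could filter the preferred list against ort.get_available_providers() before creating the session. A minimal sketch; the helper name select_providers is hypothetical, not part of mmdeploy or onnxruntime:

```python
def select_providers(available,
                     preferred=('ROCmExecutionProvider', 'CPUExecutionProvider')):
    """Return the subset of `preferred` providers that the installed
    onnxruntime build actually offers, preserving priority order."""
    return [p for p in preferred if p in available]

# Example wiring (assumes a ROCm-enabled onnxruntime build is installed):
# import onnxruntime as ort
# providers = select_providers(ort.get_available_providers())
# session = ort.InferenceSession('<path to model>', providers=providers)
```

This keeps the wrapper usable on machines without ROCm, since the session is then created with 'CPUExecutionProvider' alone.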
xiabo123 commented 1 year ago

@lvhan028 May I ask for your advice on compiling the SDK? I would like to build it with ROCm support, the way the mmcv library supports ROCm. Do you have any suggestions for the modifications needed?