```
/opt/conda/lib/python3.7/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:56: UserWarning: Specified provider 'CUDAExecutionProvider' is not in available provider names.Available providers: 'CPUExecutionProvider'
  "Available providers: '{}'".format(name, ", ".join(available_provider_names))
Traceback (most recent call last):
  File "projects/easydeploy/tools/image-demo.py", line 146, in <module>
    main()
  File "projects/easydeploy/tools/image-demo.py", line 67, in main
    model = ORTWrapper(args.checkpoint, args.device)
  File "/mmyolo/projects/easydeploy/model/backendwrapper.py", line 154, in __init__
    self.__init_session()
  File "/mmyolo/projects/easydeploy/model/backendwrapper.py", line 163, in __init_session
    str(self.weight), providers=providers)
  File "/opt/conda/lib/python3.7/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 360, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/opt/conda/lib/python3.7/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 397, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from weights/coco01-960x960/end2end.onnx failed:Fatal error: TRT:EfficientNMS_TRT(-1) is not a registered function/op
```
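Two separate problems show up in this log. First, this onnxruntime build only exposes `CPUExecutionProvider`, so the requested `CUDAExecutionProvider` is dropped with a warning. Second, and fatally, `end2end.onnx` contains `EfficientNMS_TRT`, a TensorRT plugin op that ONNX Runtime does not register, so session creation fails on any provider; a model exported with TensorRT-style NMS baked in has to be re-exported for the ONNX Runtime backend (or run under TensorRT instead). The provider-fallback part can be sketched like this (helper name is my own; in practice `available` would come from `onnxruntime.get_available_providers()`):

```python
def select_providers(requested, available):
    """Keep only the requested execution providers this build supports.

    Falls back to CPUExecutionProvider when nothing matches, mirroring
    the behaviour behind the UserWarning in the log above.
    """
    chosen = [p for p in requested if p in available]
    return chosen or ["CPUExecutionProvider"]

# CPU-only build, as in the traceback: CUDA is filtered out.
print(select_providers(["CUDAExecutionProvider", "CPUExecutionProvider"],
                       ["CPUExecutionProvider"]))
# → ['CPUExecutionProvider']
```

Passing the filtered list to `onnxruntime.InferenceSession(path, providers=...)` avoids the warning, but the `EfficientNMS_TRT` load failure can only be fixed at export time.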
### Prerequisite

### 🐞 Describe the bug

config

export command :

demo command :

error :

### Environment

### Additional information

No response