I am trying to run YOLOv7 on Triton Inference Server (not the entire DeepStream pipeline). Using the yolov7 repo I converted the weights .pt -> .onnx -> .trt, and all of these files run inference successfully on their own. But when I try to deploy weights.onnx on Triton, I get the following error:
triton-inference-server_1 | I1130 10:30:37.998450 1 onnxruntime.cc:2586] TRITONBACKEND_ModelFinalize: delete model state
triton-inference-server_1 | E1130 10:30:37.998479 1 model_lifecycle.cc:597] failed to load 'yolo_v7' version 1: Internal: onnx runtime error 1: Load model from /models/yolo_v7/1/model.onnx failed:Fatal error: TRT:EfficientNMS_TRT(-1) is not a registered function/op
I have no idea what to do next.
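For reference, the ONNX was exported with export.py from the yolov7 repo, roughly along these lines (the exact flags are an assumption on my part; this is the standard end-to-end export from the README, and --end2end is what inserts the EfficientNMS_TRT node):

python export.py --weights yolov7.pt --grid --end2end --simplify --topk-all 100 --iou-thres 0.65 --conf-thres 0.35 --img-size 640 640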
This is what my config.pbtxt looks like: