Closed rtaylo45 closed 2 years ago
A pipeline includes data preprocessing, model inference, and output data postprocessing. The ONNX/TensorRT model only includes the model inference part.
MMDeploy has already implemented some common preprocessing/postprocessing steps. To use these, you need to use the MMDeploy SDK C/Python API to do the inference.
Generally speaking, you have to do three more things. The answer to this issue describes the work you have to do. For a C API demo, you can refer to this. For the Python API, you can refer to this.
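If you prefer to stay with plain onnxruntime instead of the SDK, the heatmap-to-keypoint step can also be done by hand. Below is a minimal sketch of a simple argmax decode for a (1, 17, 64, 48) heatmap tensor; note that MMPose's own decoder additionally applies sub-pixel refinement (shifting the peak toward the second-highest neighbor), which is omitted here. The function name, image size, and the synthetic input are illustrative, not from MMDeploy.

```python
import numpy as np

def decode_heatmaps(heatmaps, img_w, img_h):
    """Decode (N, K, H, W) heatmaps into (N, K, 3) arrays of (x, y, score).

    A bare argmax decode: take each channel's peak as the keypoint,
    its value as the confidence, and rescale to image coordinates.
    """
    n, k, h, w = heatmaps.shape
    flat = heatmaps.reshape(n, k, -1)
    idx = flat.argmax(axis=2)          # (N, K) flat index of each peak
    scores = flat.max(axis=2)          # (N, K) peak confidence
    ys, xs = np.divmod(idx, w)         # recover 2-D peak coordinates
    # Scale heatmap coordinates back to the input image resolution.
    return np.stack([xs * img_w / w, ys * img_h / h, scores], axis=2)

# Synthetic heatmap with a single peak for keypoint 0.
heatmaps = np.zeros((1, 17, 64, 48), dtype=np.float32)
heatmaps[0, 0, 32, 24] = 1.0
kpts = decode_heatmaps(heatmaps, img_w=192, img_h=256)
print(kpts[0, 0])  # peak at x=96.0, y=128.0 with score 1.0
```

In practice you would feed this function the raw onnxruntime output and the size of the image you preprocessed, and it returns one (x, y, score) triple per keypoint.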
Hello,
I have converted an MMPose model to ONNX with the following command:
and I run inference on the model using onnxruntime. The resulting output is a heatmap with shape (1, 17, 64, 48). Is there a special config I need to add so that the ONNX model includes the heatmap-to-keypoint conversion? If not, is there a plugin in MMDeploy that I can use to easily post-process the heatmaps into keypoints?
Thanks, Zack