open-mmlab / mmdetection3d

OpenMMLab's next-generation platform for general 3D object detection.
https://mmdetection3d.readthedocs.io/en/latest/
Apache License 2.0

Need help with converting custom model .pth -> .onnx #2803

Open Joonki0111 opened 10 months ago

Joonki0111 commented 10 months ago

Branch

main branch https://mmdetection3d.readthedocs.io/en/latest/

📚 The doc issue

I'm trying to train my own custom model for lidar_centerpoint in Autoware's perception module. I found a guideline provided by Autoware whose official documentation links to https://github.com/open-mmlab/mmdetection3d.

I followed the mmdetection3d instructions, and training produced a single model in .pth format.

Since Autoware.Universe expects the .onnx format, I converted the model with mmdeploy, the conversion tool referenced in the official mmdetection3d repo.

However, the result was a single file called end2end.onnx that combines all four components (backbone, neck, head, encoder) in one model, and end2end.onnx unsurprisingly did not work with lidar_centerpoint.

In short, I want my custom model split into the same files Autoware.Universe uses, such as the models below.

Centerpoint: pts_voxel_encoder_centerpoint.onnx, pts_backbone_neck_head_centerpoint.onnx
Centerpoint tiny: pts_voxel_encoder_centerpoint_tiny.onnx, pts_backbone_neck_head_centerpoint_tiny.onnx

If anyone has tried to use a custom model with lidar_centerpoint, please help me and my team with this project.

Suggest a potential alternative/fix

No response

MechanoPixel-Work commented 3 months ago

I am facing the same issue. Have you been able to find a way to convert the .pth file to the required .onnx files?

Joonki0111 commented 3 months ago

Hi! Glad to hear that you are working on the same project I worked on. Unfortunately, my team and I failed the project (we weren't given much time). However, we have now started a new project that needs the same .pth -> .onnx conversion. I assume very few people are working on this, so if you don't mind, we can keep in touch and resolve this issue together. Please share more details about your work so we can help.