Open · limz230 opened this issue 2 years ago
Hello,
Thanks for releasing these DETR-like models! As far as I know, DETR itself can already be converted to ONNX. Do you plan to support ONNX export for these DETR-like models as well? I'm looking forward to your reply, thanks!
Thanks for raising this question. We plan to support ONNX export for a subset of the DETR-like models in the future, because some custom operators such as MultiScaleDeformableAttention may not be easy to export to ONNX. @limz230
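For context, the usual workaround is to swap the custom CUDA kernel for the pure-PyTorch reference computation during export, since that path only uses `grid_sample` and standard tensor ops, which ONNX can trace. The sketch below is a minimal, illustrative version of that reference path; the function name and the exact module wiring in detrex may differ, so treat it as an assumption rather than the project's official export hook.

```python
import torch
import torch.nn.functional as F

def multi_scale_deformable_attn_pytorch(value, value_spatial_shapes,
                                        sampling_locations, attention_weights):
    """Pure-PyTorch multi-scale deformable attention (ONNX-traceable).

    value:                (N, sum(H_l * W_l), num_heads, head_dim)
    value_spatial_shapes: list of (H_l, W_l) int pairs, one per feature level
    sampling_locations:   (N, num_queries, num_heads, num_levels, num_points, 2), in [0, 1]
    attention_weights:    (N, num_queries, num_heads, num_levels, num_points)
    """
    bs, _, num_heads, head_dim = value.shape
    _, num_queries, _, num_levels, num_points, _ = sampling_locations.shape
    value_list = value.split([h * w for h, w in value_spatial_shapes], dim=1)
    sampling_grids = 2 * sampling_locations - 1  # grid_sample expects [-1, 1] coords
    sampled = []
    for level, (h, w) in enumerate(value_spatial_shapes):
        # (N, H*W, M, D) -> (N*M, D, H, W)
        value_l = value_list[level].flatten(2).transpose(1, 2).reshape(
            bs * num_heads, head_dim, h, w)
        # (N, Lq, M, P, 2) -> (N*M, Lq, P, 2)
        grid_l = sampling_grids[:, :, :, level].transpose(1, 2).flatten(0, 1)
        # Bilinear grid_sample stands in for the custom CUDA kernel and exports to ONNX.
        sampled.append(F.grid_sample(value_l, grid_l, mode="bilinear",
                                     padding_mode="zeros", align_corners=False))
    # (N, Lq, M, L, P) -> (N*M, 1, Lq, L*P)
    attention_weights = attention_weights.transpose(1, 2).reshape(
        bs * num_heads, 1, num_queries, num_levels * num_points)
    # Weighted sum over levels and points, then back to (N, Lq, M*D).
    out = (torch.stack(sampled, dim=-2).flatten(-2) * attention_weights).sum(-1)
    return out.view(bs, num_heads * head_dim, num_queries).transpose(1, 2).contiguous()
```

This is much slower than the CUDA kernel, so it would typically only be swapped in for tracing/export rather than for training or regular inference.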
Excuse me, when will the ONNX export be available?
We will start exploring detrex ONNX export soon! Please stay tuned : )
@rentainhe Any updates on this? I've successfully exported MaskDINO to ONNX, but something is off: the exported model performs much worse and the file is noticeably smaller. Still analyzing...
@powermano has provided detailed usage notes in https://github.com/IDEA-Research/detrex/issues/192; maybe you can refer to that issue for more details, @ichitaka. We haven't had time to work on exporting MaskDINO to ONNX lately, but we will try to figure it out in the future. Very sorry about that.
I'm also very interested in exporting MaskDINO to ONNX. Have you used @powermano's script, @ichitaka? It would be great if you could provide a script for MaskDINO as well; maybe we can fix the errors together?
@alrightkami Hi, do you have a script to export MaskDINO to ONNX? Thanks.
https://github.com/jozhang97/DETA/pull/24
I have tried converting DETA from PyTorch to ONNX and then running TensorRT inference. There is a workaround for the MSMHDA (MultiScaleDeformableAttention) operator.
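For anyone else trying this, the export step itself is just a standard `torch.onnx.export` call once the deformable-attention op goes through an ONNX-friendly path like the sketch above. A rough outline follows; the `build_model()` helper, input resolution, and output names are placeholders, not the actual DETA/detrex API:

```python
import torch

# Placeholder: stands in for whatever builds the trained DETA/detrex model
# with the pure-PyTorch deformable-attention path enabled for export.
model = build_model().eval()

dummy_input = torch.randn(1, 3, 800, 1333)  # illustrative input resolution

torch.onnx.export(
    model,
    dummy_input,
    "deta.onnx",
    opset_version=16,  # GridSample is only available from opset 16 onwards
    input_names=["images"],
    output_names=["logits", "boxes"],  # illustrative names
    dynamic_axes={"images": {0: "batch", 2: "height", 3: "width"}},
)
```

From there the ONNX graph can be handed to trtexec or the TensorRT Python API; the linked PR covers the MSMHDA-specific part of the workaround.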
Has anyone had success with exporting MaskDINO to ONNX or any other format for deployment?
@SergiyShebotnov I successfully deployed it using Torch-TensorRT. I had to add the custom CUDA kernels by rewriting them a bit; it was a fair amount of work. Sadly I can't share the code, since my former customer owns it. Look at the custom-modules section in the docs. If you know C++, you should be fine.
Can you share a bit more detail about which part needs to be converted to C++? (MSHDA perhaps?)
Yeah, it's the only module that needs it.
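To make that a bit more concrete for others: on the Python side, a Torch-TensorRT deployment roughly looks like the sketch below. You first load the shared library that registers your hand-written C++/CUDA deformable-attention op, then compile the model. The library path and `build_model()` helper are placeholders for illustration; the real work is the C++ op registration (and, depending on your setup, a matching converter/plugin so the op doesn't fall back to PyTorch execution).

```python
import torch
import torch_tensorrt

# Placeholder path: the shared library built from your custom C++/CUDA
# op registration (e.g. via the TORCH_LIBRARY macro) for deformable attention.
torch.ops.load_library("build/libms_deform_attn_op.so")

# Placeholder: stands in for constructing the trained MaskDINO model.
model = build_model().eval().cuda()

trt_module = torch_tensorrt.compile(
    model,
    inputs=[torch_tensorrt.Input((1, 3, 800, 1333), dtype=torch.float32)],
    enabled_precisions={torch.float16},  # allow FP16 kernels
)

# With the TorchScript frontend the compiled module is a ScriptModule,
# so it can be saved and later loaded without the Python model code.
torch.jit.save(trt_module, "maskdino_trt.ts")
```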