Closed phamquiluan closed 10 months ago
No, we haven't worked on that. But you can refer to this link for converting the PyTorch model to ONNX.
@phamquiluan maybe this helps -- https://gist.github.com/pbamotra/d48b2940d84a214475dff9be6b5dab8c
Hey @pbamotra, can you tell me about the inference script? I have converted the model to ONNX from your gist.
@kbrajwani please refer to https://github.com/open-mmlab/mmdetection/blob/v2.10.0/tools/deployment/pytorch2onnx.py
I have already converted the PyTorch model to ONNX, but I don't know how to run inference with it.
Check out the verification part in that script -- https://github.com/open-mmlab/mmdetection/blob/bec24ea4ff55767256ac873c4b2e856d3f1dba1e/tools/deployment/pytorch2onnx.py#L72
Line 72 onwards, where the authors compare the outputs of the PyTorch model and the ONNX model.
Thanks i will look into it.
Hey, I have converted the model and I'm testing it behind an API with multiple threads. Memory usage grows at every inference call; maybe the ONNX model is caching computation somewhere. Has anyone encountered and solved the same problem?
Hi @kbrajwani, just wondering if you were successful in converting the model to ONNX and using it in inference mode. I have run into multiple issues: I see many warnings (but no errors), yet I cannot visualise the generated model. Are you able to share some scripts/links for conversion and inference? Thanks!
Bye!
Do you have the script that converts your PyTorch model to ONNX?