Since applying the OpenVINO optimization will generate an .onnx file, you could add this to your config:

optimization:
  openvino:
    apply: true

The ONNX file should be saved next to your model weights folder.
I am creating PR #509 to make this export more generic. You could also pull from the top of the main branch once it is merged.
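For context, here is a minimal sketch of what that export step boils down to: a standard torch.onnx.export followed by OpenVINO's Model Optimizer. The function name, the 256x256 input size, and the opset version are illustrative assumptions, not anomalib's exact code.

```python
# Sketch only: export a trained torch.nn.Module to ONNX, then to OpenVINO IR.
# Assumes OpenVINO's Model Optimizer ("mo") is installed and on the PATH.
import subprocess
import torch


def export_model(model: torch.nn.Module, export_dir: str) -> None:
    model.eval()
    dummy_input = torch.zeros((1, 3, 256, 256))  # one 256x256 RGB image (assumed size)
    onnx_path = f"{export_dir}/model.onnx"

    # Trace the model and write the .onnx file next to the weights.
    torch.onnx.export(
        model,
        dummy_input,
        onnx_path,
        opset_version=11,
        input_names=["input"],
        output_names=["output"],
    )

    # Convert the ONNX file to OpenVINO IR (.xml/.bin) with the Model Optimizer CLI.
    subprocess.run(["mo", "--input_model", onnx_path, "--output_dir", export_dir], check=True)
```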
The most up-to-date instructions for exporting a trained model can be found on this page in our documentation. I'm closing this issue now, but please feel free to re-open if you have any additional questions.
Hello @djdameln, great work you have done here. I am trying to export a trained CFLOW model to ONNX format without retraining the model. How can I achieve that? The posted link is dead, and everything in the documentation suggests retraining the model with the export flag enabled. I also want to use a different backbone, MobileNetV3_large for example. What changes should I make? I have spent several days on this, and I believe you can help. Thanks in advance.
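Until the documentation link is fixed, a rough sketch of exporting an already-trained checkpoint to ONNX without retraining is shown below. The class name, checkpoint path, and 256x256 input size are placeholders to substitute with your own; load_from_checkpoint and torch.onnx.export are standard PyTorch Lightning / PyTorch APIs.

```python
# Rough sketch: load a trained Lightning checkpoint and export it to ONNX.
# "MyAnomalyLightningModule" and the paths below are placeholders, not anomalib names.
import torch

from my_project.models import MyAnomalyLightningModule  # placeholder: your trained model class

# Restore the trained weights (standard Lightning API).
module = MyAnomalyLightningModule.load_from_checkpoint("path/to/model.ckpt")
module.eval()

# Some Lightning wrappers keep the underlying torch network on a `.model` attribute;
# export whichever object implements the inference forward().
net = getattr(module, "model", module)

torch.onnx.export(
    net,
    torch.zeros((1, 3, 256, 256)),  # dummy input matching your training image size
    "model.onnx",
    opset_version=11,
    input_names=["input"],
    output_names=["output"],
)
```

Note that swapping the backbone (e.g. to MobileNetV3_large) changes the extracted feature dimensions, so weights trained with one backbone generally cannot be reused with another; a backbone change would normally require retraining with the new backbone set in the model config.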
How do you convert an algorithm (e.g. PaDiM) to ONNX for model deployment?