openvinotoolkit / anomalib

An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.
https://anomalib.readthedocs.io/en/latest/
Apache License 2.0

run a model (e.g. padim) with export to onnx enabled #502

Closed: wangzuo-hx closed this issue 2 years ago

wangzuo-hx commented 2 years ago

How do you convert a model (e.g. Padim) to ONNX for deployment?

ashishbdatta commented 2 years ago

Since applying the OpenVINO optimization also generates an .onnx file, you could add this to your config:

optimization:
  openvino:
    apply: true

The ONNX file should be saved next to your model weights folder.
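
For reference, training would then be launched as usual with this config, e.g. with something like python tools/train.py --config anomalib/models/padim/config.yaml (the exact entry point and path depend on the anomalib version in use), and the .onnx file should appear together with the exported OpenVINO files once training finishes.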

I am creating PR #509 to make this export more generic. You could also pull from the top of the main branch once it is merged.

djdameln commented 2 years ago

The most recent instructions for exporting a trained model can be found on this page of our documentation. I'm closing this issue now, but please feel free to re-open it if you have any additional questions.

Tekno-H commented 1 year ago

Hello @djdameln, great work you have done here. I am trying to export a trained CFLOW model to ONNX format without retraining it. How can I achieve that? The posted link is dead, and everything in the documentation suggests re-training the model with the export flag enabled. I also want to use a different backbone, for example MobileNetV3_large. What changes should I make? I have spent several days on this, and I believe you can help. Thanks in advance.
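
For reference, one possible (untested) route that avoids retraining is to rebuild the model from its training config, load the saved checkpoint, and call torch.onnx.export on the underlying network directly. The sketch below assumes the anomalib 0.x Python API (get_configurable_parameters, get_model), hypothetical config and checkpoint paths, and that the Lightning module exposes its torch network as .model; it is not a confirmed export recipe, and ONNX tracing may still fail for some architectures.

import torch

from anomalib.config import get_configurable_parameters
from anomalib.models import get_model

# Rebuild the model exactly as it was trained. The backbone is the
# `model.backbone` entry in this same YAML file; whether a MobileNetV3
# variant is accepted depends on the model's feature extractor.
config = get_configurable_parameters(config_path="anomalib/models/cflow/config.yaml")  # hypothetical path
model = get_model(config)

# Restore the trained weights instead of retraining. strict=False because
# checkpoints can contain normalization buffers added by callbacks that a
# freshly built module does not have yet.
checkpoint = torch.load("results/cflow/weights/model.ckpt", map_location="cpu")  # hypothetical path
model.load_state_dict(checkpoint["state_dict"], strict=False)
model.eval()

# Export the inner torch module; the input shape must match
# `dataset.image_size` from the training config.
dummy_input = torch.zeros((1, 3, 256, 256))
torch.onnx.export(
    model.model,
    dummy_input,
    "cflow.onnx",
    opset_version=11,
    input_names=["input"],
    output_names=["output"],
)

Note that swapping the backbone changes the network architecture, so a checkpoint trained with one backbone cannot simply be loaded into a model built with another; that part would still require retraining.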