onnx / tutorials

Tutorials for creating and using ONNX models
Apache License 2.0

How to register an ONNX GraphSurgeon (not PyTorch) custom operator with ONNX? #241

Open dedoogong opened 3 years ago

dedoogong commented 3 years ago

Ask a Question

Question

I have converted a TF model, which contains some unsupported ops, to ONNX with the tool below:

python3 -m tf2onnx.convert --graphdef frozen_model_v2 --output aslfeatv2.onnx

Then, since I need to run it on TensorRT, I first had to check that the converted ONNX file is correct. So I implemented those unsupported ops in C++/CUDA for ONNX Runtime, built them, and tested them. But ONNX still fails to recognize the custom ONNX Runtime ops when running onnx-simplifier:

....
....
onnx.checker.check_model(model)
  File "/home/lee/.local/lib/python3.6/site-packages/onnx/checker.py", line 102, in check_model
    C.check_model(protobuf_string)
onnx.onnx_cpp2py_export.checker.ValidationError: No Op registered for ASLFeatPluginX with domain_version of 13

I've read all the tutorials related to custom ops and PyTorch custom ops, but nothing has helped me yet.

Can anybody give me a hint on how to resolve this situation?

Thank you

