OutSorcerer opened 1 year ago
The problem is the grouped convolutions.
The model exports without any errors when using standard convolutions (`groups=1`).
Related to https://github.com/onnx/tensorflow-onnx/issues/2099
Any ideas or workarounds?
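For context on why `groups=1` is the easy case: a grouped convolution is mathematically just `groups` independent standard convolutions over channel slices, concatenated along the channel axis. Below is a minimal NumPy sketch of that equivalence (not tf2onnx code; the layouts and function names are illustrative) — it is also the idea behind the common workaround of rewriting a grouped Conv2D as Split → per-group Conv2D → Concat before export:

```python
import numpy as np

def conv2d(x, w):
    """Standard 2D convolution on a single image.
    x: (H, W, C_in), w: (kH, kW, C_in, C_out); valid padding, stride 1."""
    kH, kW, _, C_out = w.shape
    H, W, _ = x.shape
    out = np.zeros((H - kH + 1, W - kW + 1, C_out))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = x[i:i + kH, j:j + kW, :]
            # contract over (kH, kW, C_in) -> vector of length C_out
            out[i, j, :] = np.tensordot(patch, w, axes=([0, 1, 2], [0, 1, 2]))
    return out

def grouped_conv2d(x, w, groups):
    """Grouped convolution expressed as `groups` standard convolutions.
    w: (kH, kW, C_in // groups, C_out), Keras-style grouped kernel layout."""
    C_in, C_out = x.shape[-1], w.shape[-1]
    cg, og = C_in // groups, C_out // groups
    pieces = []
    for g in range(groups):
        xg = x[..., g * cg:(g + 1) * cg]   # this group's input channel slice
        wg = w[..., g * og:(g + 1) * og]   # this group's output filters
        pieces.append(conv2d(xg, wg))      # ordinary conv per group
    return np.concatenate(pieces, axis=-1)
```

With `groups=1` the second function reduces exactly to the first, which matches the observation that the export only breaks for `groups > 1`.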
I also have this issue :(
Downgrading to 2.8.4 is not a great option. Have any solutions been found yet?
Export of TFSegformerForSemanticSegmentation from huggingface/transformers to ONNX used to work in TF 2.8.4; a notebook that reproduces a successful export: https://colab.research.google.com/gist/OutSorcerer/c8cd27a455091b57d9ea90ab3450035e/tfsegformer_onnx.ipynb

However, the export fails in TF >= 2.9.0; a notebook that reproduces the problem is here: https://colab.research.google.com/gist/OutSorcerer/ebc93cd734ecc0e1dee96c8d20e5e9d5/tfsegformer_onnx.ipynb
The error message is:
However, an ONNX file is still produced, which fails at inference time.
When the model is printed, there is a node:
The issue is that the node `__inference__jit_compiled_convolution_op_6171` is referenced, but its definition is nowhere to be found, so it is likely that tf2onnx failed to convert `__inference__jit_compiled_convolution_op_6171` in the first place.
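The failure mode described above — a function referenced by name with no matching definition anywhere in the exported graph — can be illustrated with a toy consistency check. This is a hypothetical sketch (the dict-based graph model and function names are illustrative, not the real tf2onnx or ONNX data structures):

```python
def find_undefined_functions(graph):
    """graph: {function_name: [names of functions it references]}.
    Returns names that are referenced somewhere but never defined."""
    defined = set(graph)
    referenced = {callee for callees in graph.values() for callee in callees}
    return sorted(referenced - defined)

# Toy graph mirroring the report: the jit-compiled convolution op is
# called from the main graph, but its body was never emitted.
g = {
    "main": ["__inference__jit_compiled_convolution_op_6171", "relu"],
    "relu": [],
}
print(find_undefined_functions(g))
# -> ['__inference__jit_compiled_convolution_op_6171']
```

A graph where this check returns a non-empty list can still be serialized to an .onnx file, which is consistent with the export "succeeding" but the model failing at inference time.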