onnx / tutorials

Tutorials for creating and using ONNX models

ValidationError while trying to run OnnxTensorflowImport tutorial #205

Closed: rallen10 closed this issue 4 years ago

rallen10 commented 4 years ago

Edit: included the full traceback

I am learning how to load an ONNX model and perform inference with TensorFlow. The most relevant tutorial seems to be: https://github.com/onnx/tutorials/blob/master/tutorials/OnnxTensorflowImport.ipynb

However, when trying to run the following code block:

import onnx
import warnings
from onnx_tf.backend import prepare

warnings.filterwarnings('ignore') # Ignore all the warning messages in this tutorial
model = onnx.load('assets/super_resolution.onnx') # Load the ONNX file
tf_rep = prepare(model) # Import the ONNX model to Tensorflow

I get the following error:

---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
<ipython-input-1-2aa043b56652> in <module>
      5 warnings.filterwarnings('ignore') # Ignore all the warning messages in this tutorial
      6 model = onnx.load('assets/super_resolution.onnx') # Load the ONNX file
----> 7 tf_rep = prepare(model) # Import the ONNX model to Tensorflow

~/Projects/Sandbox/onnx/onnx-tensorflow/onnx_tf/backend.py in prepare(cls, model, device, strict, logging_level, **kwargs)
     60     :returns: A TensorflowRep class object representing the ONNX model
     61     """
---> 62     super(TensorflowBackend, cls).prepare(model, device, **kwargs)
     63     common.logger.setLevel(logging_level)
     64     common.logger.handlers[0].setLevel(logging_level)

~/miniconda3/envs/onnx_sandbox/lib/python3.8/site-packages/onnx/backend/base.py in prepare(cls, model, device, **kwargs)
     72                 ):  # type: (...) -> Optional[BackendRep]
     73         # TODO Remove Optional from return type
---> 74         onnx.checker.check_model(model)
     75         return None
     76 

~/miniconda3/envs/onnx_sandbox/lib/python3.8/site-packages/onnx/checker.py in check_model(model, full_check)
     91         m = onnx.load(model)
     92     else:
---> 93         C.check_model(model.SerializeToString())
     94         m = model
     95     if full_check:

ValidationError: Mismatched attribute type in ' : kernel_shape'

==> Context: Bad node spec: input: "1" input: "2" output: "11" op_type: "Conv" attribute { name: "kernel_shape" ints: 5 ints: 5 } attribute { name: "strides" ints: 1 ints: 1 } attribute { name: "pads" ints: 2 ints: 2 ints: 2 ints: 2 } attribute { name: "dilations" ints: 1 ints: 1 } attribute { name: "group" i: 1 }
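
To see exactly how that attribute is encoded in the model, something along these lines should dump the Conv node's attributes (a rough sketch using onnx.helper, not verified against this exact file):

import onnx
from onnx import helper

model = onnx.load('assets/super_resolution.onnx')

# Print every attribute of the first Conv node to see how kernel_shape is stored
for node in model.graph.node:
    if node.op_type == 'Conv':
        for attr in node.attribute:
            print(helper.printable_attribute(attr))
        break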

Any thoughts on what is causing the problem? Perhaps I am using an incompatible version of ONNX, but how would I know which version is the correct one?
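
For what it's worth, one way to compare the opset the model was exported with against what the installed onnx package supports is to inspect the model proto directly (a minimal sketch, assuming the same asset path as the tutorial):

import onnx
from onnx import defs

model = onnx.load('assets/super_resolution.onnx')

print('installed onnx version:', onnx.__version__)
print('model IR version      :', model.ir_version)
for opset in model.opset_import:
    print('model opset           :', opset.domain or 'ai.onnx', opset.version)

# Highest opset version the installed onnx package knows about
print('max supported opset   :', defs.onnx_opset_version())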

For reference, I followed the "Step 1: Installation" instructions by installing ONNX, TensorFlow, and onnx-tensorflow in a new conda environment. The only addition was installing jupyterlab and tensorflow-addons because of an error I ran into (see issue #201). See the attached conda environment for details.

rallen10 commented 4 years ago

Follow-up: I switched to onnx=1.3.0 and that seems to have fixed the issue (although I now get a lot of warnings instead of errors).
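
In case someone lands here and would rather not downgrade, another thing that might be worth trying is re-emitting the graph at a newer opset with onnx.version_converter before handing it to the backend. This is only a sketch (the target opset is a guess, and the converter may still trip over the same checker):

import onnx
from onnx import version_converter
from onnx_tf.backend import prepare

model = onnx.load('assets/super_resolution.onnx')

# Ask onnx to rewrite the graph at a newer opset, then re-run the checker
converted = version_converter.convert_version(model, 9)
onnx.checker.check_model(converted)

tf_rep = prepare(converted)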