onnx / tutorials

Tutorials for creating and using ONNX models

onnx.onnx_cpp2py_export.checker.ValidationError: convolution_W in initializer but not in graph input #214

wonderzy commented 4 years ago

I am converting a Caffe model to a TensorFlow model (TensorFlow version 2.1.0, ONNX version 1.7.0).

I successfully converted from Caffe to CoreML to ONNX using caffe2onnx. When running onnx-tensorflow, I got the following error:

Traceback (most recent call last):
  File "convert_caffe2onnx2tf.py", line 36, in <module>
    tf_exp = prepare(onnx_model)  # prepare tf representation
  File "/data6/TF_quant/onnx-tensorflow/onnx_tf/backend.py", line 62, in prepare
    super(TensorflowBackend, cls).prepare(model, device, **kwargs)
  File "/data5/workshop/anaconda3/lib/python3.6/site-packages/onnx/backend/base.py", line 74, in prepare
    onnx.checker.check_model(model)
  File "/data5/workshop/anaconda3/lib/python3.6/site-packages/onnx/checker.py", line 93, in check_model
    C.check_model(model.SerializeToString())
onnx.onnx_cpp2py_export.checker.ValidationError: convolution_W in initializer but not in graph input
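
For reference, the failing step boils down to the call below; this is a minimal sketch, assuming the exported file is named model.onnx (the real script and paths differ):

import onnx
from onnx_tf.backend import prepare

# Load the ONNX model produced by caffe2onnx and prepare the TensorFlow
# representation; prepare() runs onnx.checker.check_model internally,
# which is where the ValidationError above is raised.
onnx_model = onnx.load('model.onnx')
tf_rep = prepare(onnx_model)
tf_rep.export_graph('model_tf')  # output path here is only illustrative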

Any suggestions for resolving this?

vinitra-zz commented 4 years ago

This looks like an error with the caffe2onnx exporter. We recommend asking in that repo directly as those developers are most familiar with the codebase.

One way to confirm that the error comes from caffe2onnx rather than the onnx2tf conversion step is to run the ONNX checker directly on the exported model and see whether it fails:

import onnx

# Load the exported model and run the checker on it directly.
model = onnx.load('model.onnx')
onnx.checker.check_model(model)

As a hack / workaround, you can edit the graph inputs directly with the ONNX Python API: update the parts of the IR you want to change so that convolution_W is included among the graph inputs.
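
A minimal sketch of that workaround, assuming the exported file is named model.onnx (file names are placeholders): add an explicit graph input for every initializer that is missing one, matching its name, dtype, and shape.

import onnx
from onnx import helper

model = onnx.load('model.onnx')
graph = model.graph

# Names already declared as graph inputs.
existing_inputs = {inp.name for inp in graph.input}

# Add a matching graph input for every initializer that lacks one,
# e.g. convolution_W in the error above.
for init in graph.initializer:
    if init.name not in existing_inputs:
        value_info = helper.make_tensor_value_info(
            init.name, init.data_type, list(init.dims))
        graph.input.append(value_info)

# The checker should now pass; save the patched model.
onnx.checker.check_model(model)
onnx.save(model, 'model_patched.onnx')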

magehrke commented 3 years ago

I have the exact same issue and am fairly confident it can be fixed by installing different package versions. The project I am working on uses MXNet, and the ONNX conversions work fine on several other machines.

I think it has something to do with the protobuf version (see this issue), but I have not yet figured out which version to use.
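
If it is a version mismatch, a quick sanity check is to print the versions that are actually active in the environment (just a diagnostic snippet, not a fix):

import onnx
import google.protobuf

print('onnx version:', onnx.__version__)
print('protobuf version:', google.protobuf.__version__)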

Just in case it helps anyone figure this out, here is my error:

Traceback (most recent call last):
  File "/Git/MultiAra/engine/src/rl/rl_training.py", line 138, in update_network
    [1, 8, 16], False)
  File "/Git/MultiAra/DeepCrazyhouse/src/domain/neural_net/onnx/convert_to_onnx.py", line 117, in convert_mxnet_model_to_onnx
    np.float32, onnx_file)
  File "/anaconda3/envs/ara/lib/python3.7/site-packages/mxnet/contrib/onnx/mx2onnx/export_model.py", line 83, in export_model
    verbose=verbose, opset_version=opset_version)
  File "/anaconda3/envs/ara/lib/python3.7/site-packages/mxnet/contrib/onnx/mx2onnx/export_onnx.py", line 338, in create_onnx_graph_proto
    checker.check_graph(graph)
  File "/anaconda3/envs/ara/lib/python3.7/site-packages/onnx/checker.py", line 51, in checker
    proto.SerializeToString(), ctx)
onnx.onnx_cpp2py_export.checker.ValidationError: bc_res_block0_se_fc0_0 in initializer but not in graph input