onnx / onnx-tensorflow

Tensorflow Backend for ONNX

onnx.onnx_cpp2py_export.checker.ValidationError: convolution_W in initializer but not in graph input #740

Open · wonderzy opened this issue 4 years ago

wonderzy commented 4 years ago

Describe the bug

Converting a Caffe model to a TensorFlow model fails with an ONNX checker validation error.

To Reproduce

I successfully converted from Caffe to CoreML to ONNX using the script below. When running onnx-tensorflow on the resulting ONNX model, I got the error:

Traceback (most recent call last):
  File "convert_caffe2onnx2tf.py", line 36, in <module>
    tf_exp = prepare(onnx_model)  # prepare tf representation
  File "/data6/TF_quant/onnx-tensorflow/onnx_tf/backend.py", line 62, in prepare
    super(TensorflowBackend, cls).prepare(model, device, **kwargs)
  File "/data5/workshop/anaconda3/lib/python3.6/site-packages/onnx/backend/base.py", line 74, in prepare
    onnx.checker.check_model(model)
  File "/data5/workshop/anaconda3/lib/python3.6/site-packages/onnx/checker.py", line 93, in check_model
    C.check_model(model.SerializeToString())
onnx.onnx_cpp2py_export.checker.ValidationError: convolution_W in initializer but not in graph input
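
As the traceback shows, the failure happens inside onnx.checker.check_model, which onnx-tensorflow's prepare calls before building the TensorFlow graph, so the error can be reproduced without onnx-tensorflow at all. A minimal check, assuming the output path used in the script below:

import onnx

# Running the standard checker directly on the saved file reproduces
# the same ValidationError, independent of onnx-tensorflow.
model = onnx.load('./mobilenet_v2_nobn.onnx')
onnx.checker.check_model(model)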

My Python code is:

import coremltools
import onnxmltools
import onnx
import warnings
from onnx_tf.backend import prepare

# Update your input name and path for your caffe model
proto_file = './mobilenet_v2_nobn.prototxt' 
input_caffe_path = './mobilenet_v2_nobn.caffemodel'

# Update the output name and path for intermediate coreml model, or leave as is
output_coreml_model = './mobilenet_v2_nobn.mlmodel'

# Change this path to the output name and path for the onnx model
output_onnx_model = './mobilenet_v2_nobn.onnx'

pb_output_path = './mobilenet_v2_nobn.pb'

# Convert Caffe model to CoreML 
coreml_model = coremltools.converters.caffe.convert((input_caffe_path, proto_file))

# Save CoreML model
coreml_model.save(output_coreml_model)

# Load a Core ML model
coreml_model = coremltools.utils.load_spec(output_coreml_model)

# Convert the Core ML model into ONNX
onnx_model = onnxmltools.convert_coreml(coreml_model)

# Save as protobuf
onnxmltools.utils.save_model(onnx_model, output_onnx_model)

onnx_model = onnx.load(output_onnx_model)  # load onnx model

tf_exp = prepare(onnx_model)  # prepare tf representation

tf_exp.export_graph(pb_output_path)  # export the model

ONNX model file

MobileNet v2 model: https://drive.google.com/drive/folders/1Q7k5iLxa0VYoGueG1mhD6plV2TV4Zlp_?usp=sharing

Python, ONNX, ONNX-TF, Tensorflow version

This section can be obtained by running get_version.py from the util folder.

Additional context

The caffemodel with BatchNorm converts successfully, but I get this error after folding the BatchNorm layers into the convolutions (merge_bn).
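
To see exactly which initializers trigger the check, the names in graph.initializer can be compared against graph.input. A small diagnostic sketch, assuming the same output path as above:

import onnx

model = onnx.load('./mobilenet_v2_nobn.onnx')

# Names that are declared as graph inputs.
input_names = {inp.name for inp in model.graph.input}

# Initializers whose names have no matching graph input; these are the
# tensors the checker complains about (e.g. 'convolution_W').
missing = [init.name for init in model.graph.initializer
           if init.name not in input_names]
print(missing)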

chinhuang007 commented 4 years ago

The validation error is coming from the standard ONNX checker, so it looks like the generated ONNX file is not a fully valid model. The issue is in coremltools, when loading the model and converting it to the ONNX format.
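
A common workaround for this particular checker error is to append a value_info entry for each missing initializer to the graph inputs before calling prepare. This is a sketch of that workaround using the standard onnx.helper API, not an official fix, and it only silences the checker if the rest of the model is well formed:

import onnx
from onnx import helper

model = onnx.load('./mobilenet_v2_nobn.onnx')
input_names = {inp.name for inp in model.graph.input}

# Older ONNX IR versions require every initializer to also appear in
# graph.input; add a matching value_info for any initializer that is missing.
for init in model.graph.initializer:
    if init.name not in input_names:
        value_info = helper.make_tensor_value_info(
            init.name, init.data_type, list(init.dims))
        model.graph.input.append(value_info)

onnx.checker.check_model(model)  # should pass once the inputs are patched
onnx.save(model, './mobilenet_v2_nobn_fixed.onnx')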