onnx / onnx-tensorflow

Tensorflow Backend for ONNX

Tensors in list passed to 'values' of 'ConcatV2' Op have types [float64, int64, int64, int64, int64] that don't all match. #610

Open ansh1204 opened 4 years ago

ansh1204 commented 4 years ago

I am trying to convert a custom PyTorch model to TensorFlow. I am able to convert PyTorch to ONNX, but converting ONNX to TensorFlow raises an error.

The code snippets are as follows:

PyTorch to ONNX

net = ...                                              # custom PyTorch model
net.load_state_dict(torch.load("pre-trained model"))   # load pre-trained weights
dummyInput = np.random.uniform(0, 1, (1, 8, 3, 256, 256))
dummyInput = Variable(torch.FloatTensor(dummyInput))
torch.onnx.export(net, dummyInput, './modelONNX.onnx',
                  input_names=['test_input'], output_names=['test_output'])
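Since the original network was not posted, here is a self-contained toy stand-in that exercises the same export path. The TinyNet module, its layers, and the output file name are hypothetical, chosen only to match the (1, 8, 3, 256, 256) input shape; note that torch.autograd.Variable is not needed in recent PyTorch, a plain tensor works.

import numpy as np
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    # Hypothetical stand-in for the custom model
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv3d(in_channels=8, out_channels=4, kernel_size=3, padding=1)

    def forward(self, x):
        return self.conv(x)

net = TinyNet()
net.eval()

# Same dummy input shape as in the report: (1, 8, 3, 256, 256)
dummy = torch.from_numpy(np.random.uniform(0, 1, (1, 8, 3, 256, 256))).float()

torch.onnx.export(net, dummy, './toyONNX.onnx',
                  input_names=['test_input'], output_names=['test_output'])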

ONNX to TensorFlow

import onnx
from onnx_tf.backend import prepare

model = onnx.load('./modelONNX_Variable.onnx')
tf_prep = prepare(model)
tf_prep.export_graph('./modelTF.pb')

Error

Traceback (most recent call last):
  File "conversionToTF.py", line 9, in <module>
    tf_prep = prepare(model)
  File "/onnx-tensorflow/onnx_tf/backend.py", line 65, in prepare
    return cls.onnx_model_to_tensorflow_rep(model, strict)
  File "/onnx-tensorflow/onnx_tf/backend.py", line 85, in onnx_model_to_tensorflow_rep
    return cls._onnx_graph_to_tensorflow_rep(model.graph, opset_import, strict)
  File "/onnx-tensorflow/onnx_tf/backend.py", line 143, in _onnx_graph_to_tensorflow_rep
    onnx_node, tensor_dict, handlers, opset=opset, strict=strict)
  File "/onnx-tensorflow/onnx_tf/backend.py", line 245, in _onnx_node_to_tensorflow_op
    return handler.handle(node, tensor_dict=tensor_dict, strict=strict)
  File "/onnx-tensorflow/onnx_tf/handlers/handler.py", line 61, in handle
    return ver_handle(node, **kwargs)
  File "/onnx-tensorflow/onnx_tf/handlers/backend/concat.py", line 23, in version_4
    return cls._common(node, **kwargs)
  File "/onnx-tensorflow/onnx_tf/handlers/backend/concat.py", line 15, in _common
    return [cls.make_tensor_from_onnx_node(node, inputs=[inputs])]
  File "/onnx-tensorflow/onnx_tf/handlers/backend_handler.py", line 111, in make_tensor_from_onnx_node
    return cls._run_tf_func(tf_func, inputs, attrs)
  File "/gpfs-volume/onnx-tensorflow/onnx_tf/handlers/backend_handler.py", line 182, in _run_tf_func
    **dict([(p, attrs[p]) for p in params if p in attrs]))
  File "/home/user/.local/lib/python3.6/site-packages/tensorflow_core/python/util/dispatch.py", line 180, in wrapper
    return target(*args, **kwargs)
  File "/home/user/.local/lib/python3.6/site-packages/tensorflow_core/python/ops/array_ops.py", line 1431, in concat
    return gen_array_ops.concat_v2(values=values, axis=axis, name=name)
  File "/home/user/.local/lib/python3.6/site-packages/tensorflow_core/python/ops/gen_array_ops.py", line 1257, in concat_v2
    "ConcatV2", values=values, axis=axis, name=name)
  File "/home/user/.local/lib/python3.6/site-packages/tensorflow_core/python/framework/op_def_library.py", line 499, in _apply_op_helper
    raise TypeError("%s that don't all match." % prefix)
TypeError: Tensors in list passed to 'values' of 'ConcatV2' Op have types [float64, int64, int64, int64, int64] that don't all match.

Versions: TensorFlow = 2.0.0, onnx = 1.6.0, tensorflow-addons = 0.6.0
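One way to narrow this down, independent of the converter, is to run ONNX shape/type inference and look for Concat nodes whose inputs disagree on element type. This is a generic inspection sketch (the model path is the one from the snippet above; everything else is the standard onnx API):

import onnx
from onnx import shape_inference

model = onnx.load('./modelONNX_Variable.onnx')
inferred = shape_inference.infer_shapes(model)

# Map every value name in the graph to its inferred element type
elem_types = {}
for vi in list(inferred.graph.value_info) + list(inferred.graph.input) + list(inferred.graph.output):
    elem_types[vi.name] = vi.type.tensor_type.elem_type
for init in inferred.graph.initializer:
    elem_types[init.name] = init.data_type

# Report Concat nodes whose inputs do not all share one dtype
for node in inferred.graph.node:
    if node.op_type == 'Concat':
        dtypes = {elem_types.get(name) for name in node.input}
        if len(dtypes) > 1:
            print(node.name, 'has mixed input element types:', dtypes)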

KrishnaRekapalli commented 4 years ago

@chinhuang007 I want to give this a shot! This is my first time working on this repo.

chinhuang007 commented 4 years ago

@KrishnaRekapalli Definitely! Thanks very much for helping out! And feel free to ask questions.

KrishnaRekapalli commented 4 years ago

@ansh1204 Can you post the example NN you used? I am trying to reproduce the issue and was not able to with my example network. Thanks :)

zhongxing9006 commented 4 years ago

You can upgrade your PyTorch to 1.6.
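If you do upgrade and re-export, it can also help to pass an explicit, recent opset_version, since newer exporters emit different Shape/Gather/Cast patterns around Concat. A minimal sketch with a toy model; the model itself and the opset_version=11 choice are placeholders, not something confirmed in this thread:

import torch
import torch.nn as nn

# Toy placeholder; substitute the real network and its pre-trained weights here
net = nn.Sequential(nn.Conv3d(8, 4, kernel_size=3, padding=1))
net.eval()
dummy = torch.randn(1, 8, 3, 256, 256)

torch.onnx.export(net, dummy, './modelONNX.onnx',
                  opset_version=11,  # explicit opset; 11 is only an example
                  input_names=['test_input'], output_names=['test_output'])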

hgandhi2411 commented 4 years ago

@chinhuang007 @ansh1204 were you able to figure this out? I get a similar error with my code and would really appreciate help with it. Thanks.

I'm using this code:

import onnx, onnx_tf
model_path = 'optimal_ala2_model.onnx'
onnx_model = onnx.load(model_path)
tf_rep = onnx_tf.backend.prepare(onnx_model)
tf_rep.export_graph('./ala_model.pb')

and this is the error I get:

Traceback (most recent call last):
  File "test.py", line 7, in <module>
    tf_rep.export_graph('./model_tF.pb')
  File "/home/heta/Documents/White lab/onnx-tensorflow/onnx_tf/backend_rep.py", line 102, in export_graph
    tf.saved_model.save(self.tf_module, path, signatures=self.tf_module.__call__.get_concrete_function(**self.signatures))
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 1167, in get_concrete_function
    concrete = self._get_concrete_function_garbage_collected(*args, **kwargs)
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 1073, in _get_concrete_function_garbage_collected
    self._initialize(args, kwargs, add_initializers_to=initializers)
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 697, in _initialize
    *args, **kwds))
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 2855, in _get_concrete_function_internal_garbage_collected
    graph_function, _, _ = self._maybe_define_function(args, kwargs)
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 3213, in _maybe_define_function
    graph_function = self._create_graph_function(args, kwargs)
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 3075, in _create_graph_function
    capture_by_value=self._capture_by_value),
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py", line 986, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 600, in wrapped_fn
    return weak_wrapped_fn().__wrapped__(*args, **kwds)
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 3735, in bound_method_wrapper
    return wrapped_fn(*args, **kwargs)
  File "/home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py", line 973, in wrapper
    raise e.ag_error_metadata.to_exception(e)
ValueError: in user code:

    /home/heta/Documents/White lab/onnx-tensorflow/onnx_tf/backend_tf_module.py:46 __call__  *
        output_ops = self.backend._onnx_node_to_tensorflow_op(onnx_node,
    /home/heta/Documents/White lab/onnx-tensorflow/onnx_tf/backend.py:268 _onnx_node_to_tensorflow_op  *
        return handler.handle(node, tensor_dict=tensor_dict, strict=strict)
    /home/heta/Documents/White lab/onnx-tensorflow/onnx_tf/handlers/handler.py:59 handle  *
        return ver_handle(node, **kwargs)
    /home/heta/Documents/White lab/onnx-tensorflow/onnx_tf/handlers/backend/concat.py:27 version_11  *
        return cls._common(node, **kwargs)
    /home/heta/Documents/White lab/onnx-tensorflow/onnx_tf/handlers/backend/concat.py:15 _common  *
        return [cls.make_tensor_from_onnx_node(node, inputs=[inputs])]
    /home/heta/Documents/White lab/onnx-tensorflow/onnx_tf/handlers/backend_handler.py:111 make_tensor_from_onnx_node  *
        return cls._run_tf_func(tf_func, inputs, attrs)
    /home/heta/Documents/White lab/onnx-tensorflow/onnx_tf/handlers/backend_handler.py:188 _run_tf_func  *
        return tf_func(**kwargs)
    /home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/util/dispatch.py:201 wrapper  **
        return target(*args, **kwargs)
    /home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/ops/array_ops.py:1654 concat
        return gen_array_ops.concat_v2(values=values, axis=axis, name=name)
    /home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/ops/gen_array_ops.py:1222 concat_v2
        "ConcatV2", values=values, axis=axis, name=name)
    /home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/framework/op_def_library.py:744 _apply_op_helper
        attrs=attr_protos, op_def=op_def)
    /home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py:593 _create_op_internal
        compute_device)
    /home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/framework/ops.py:3485 _create_op_internal
        op_def=op_def)
    /home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/framework/ops.py:1975 __init__
        control_input_ops, op_def)
    /home/heta/miniconda3/envs/htf/lib/python3.7/site-packages/tensorflow/python/framework/ops.py:1815 _create_c_op
        raise ValueError(str(e))

    ValueError: Shape must be rank 1 but is rank 2 for '{{node Concat_9}} = ConcatV2[N=2, T=DT_FLOAT, Tidx=DT_INT32](Constant_0, Sqrt_8, Concat_9/axis)' with input shapes: [0], [1,10], [].

TF version: 2.3.0, Python: 3.7.7, TensorFlow addons: 0.11.2

This is the onnx model I have: onnx_model.zip

chinhuang007 commented 4 years ago

@hgandhi2411 The model has a constant node "Constant_0" with no content in its 'value' attribute. The output of this constant node has shape [0], and concatenating it with a [1, 10] tensor fails, because the inputs to ConcatV2 must all have the same rank (and matching shapes except along the concat axis).

The solution is to recreate the ONNX model file with proper constant values for that node.
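To locate such nodes before re-exporting, the 'value' attribute of each Constant node can be checked directly. A minimal sketch, assuming the model file name from the earlier snippet and that the constant is stored in the common 'value' tensor attribute:

import onnx
from onnx import numpy_helper

model = onnx.load('optimal_ala2_model.onnx')

for node in model.graph.node:
    if node.op_type == 'Constant':
        for attr in node.attribute:
            if attr.name == 'value':
                arr = numpy_helper.to_array(attr.t)
                if arr.size == 0:
                    print(f'{node.name}: Constant with empty value, shape {arr.shape}')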