onnx / tensorflow-onnx

Convert TensorFlow, Keras, TensorFlow.js and TFLite models to ONNX
Apache License 2.0

Unsupported ops: TensorListConcatV2 #2271

Open k-maheshkumar opened 10 months ago

k-maheshkumar commented 10 months ago

Requesting support for the missing Operator "TensorListConcatV2"

Hi, I am getting the following error with opset=18:

Tensorflow op [TensorListConcatV2_4: TensorListConcatV2] is not supported
Tensorflow op [TensorListConcatV2_3: TensorListConcatV2] is not supported
Tensorflow op [TensorListConcatV2_2: TensorListConcatV2] is not supported
Tensorflow op [TensorListConcatV2_1: TensorListConcatV2] is not supported
Tensorflow op [TensorListConcatV2: TensorListConcatV2] is not supported
Unsupported ops: Counter({'TensorListConcatV2': 5})

A small toy example to reproduce the error:

import tensorflow as tf
import tf2onnx

class ExampleModel(tf.keras.Model):

    def __init__(self, input_shape):
        super(ExampleModel, self).__init__()
        self.build(input_shape)

    def call(self, inputs, training=False):
        # Write three constant rows into a TensorArray and concatenate them;
        # ta.concat() is what lowers to the unsupported TensorListConcatV2 op.
        ta = tf.TensorArray(tf.int32, size=3)
        ta = ta.write(0, tf.constant([1, 2]))
        ta = ta.write(1, tf.constant([3, 4]))
        ta = ta.write(2, tf.constant([5, 6]))
        return ta.concat()

example_model = ExampleModel((1, 100, 100, 3))
output_path = "./concat_model.onnx"
onnx_model = tf2onnx.convert.from_keras(
    example_model,
    (tf.TensorSpec((1, 100, 100, 3), tf.float32, name="input_1"),),
    opset=18,
    output_path=output_path,
)

Toy example output:

Tensorflow op [example_model_3/TensorListConcatV2: TensorListConcatV2] is not supported
Unsupported ops: Counter({'TensorListConcatV2': 1})
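
For reference, tracing the model's call() shows which raw TF ops the tf.TensorArray lowers to (a minimal diagnostic sketch, reusing example_model from the script above):

import tensorflow as tf

# Trace call() into a concrete graph and list its op types; the TensorList*
# entries are the ones tf2onnx reports as unsupported.
concrete_fn = tf.function(example_model.call).get_concrete_function(
    tf.TensorSpec((1, 100, 100, 3), tf.float32)
)
print(sorted({op.type for op in concrete_fn.graph.get_operations()}))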

FYI: in spite of the above error/warning, the model was still saved. Loading it with

import onnxruntime as ort
session = ort.InferenceSession(output_path, providers=ort.get_available_providers())

produced the following error from onnxruntime:

InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : Load model from /path/to/concat_model.onnx failed: This is an invalid model. In Node, ("example_model/TensorArrayV2", TensorListReserve, "", -1) : ("example_model/TensorArrayV2/element_shape:0": tensor(int32),"example_model/TensorArrayV2/num_elements:0": tensor(int32),) -> ("example_model/TensorArrayV2:0",) , Error No Op registered for TensorListReserve with domain_version of 18
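
A quick way to see which op types were passed through unconverted into the saved file (a small sketch, assuming the output_path from the script above):

from collections import Counter

import onnx

# Count op types in the exported graph; any TensorList* entries are TF ops
# that tf2onnx left in place because it has no ONNX mapping for them.
model = onnx.load(output_path)
print(Counter(node.op_type for node in model.graph.node))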

Could you please help me?

k-maheshkumar commented 10 months ago

With onnx.checker.check_model(), following the example in check_model.ipynb:

import onnx

onnx_model = onnx.load("./concat_model.onnx")

print("The model is:\n{}".format(onnx_model))

# Check the model
try:
    onnx.checker.check_model(onnx_model)
except onnx.checker.ValidationError as e:
    print("The model is invalid: %s" % e)
else:
    print("The model is valid!")

I got the following output:

The model is:
ir_version: 8
producer_name: "tf2onnx"
producer_version: "1.15.1 37820d"
graph {
  node {
    input: "example_model/TensorArrayV2/element_shape:0"
    input: "example_model/TensorArrayV2/num_elements:0"
    output: "example_model/TensorArrayV2:0"
    name: "example_model/TensorArrayV2"
    op_type: "TensorListReserve"
    attribute {
      name: "element_dtype"
      i: 6
      type: INT
    }
  }
  node {
    input: "example_model/TensorArrayV2:0"
    input: "example_model/TensorArrayV2Write/TensorListSetItem/index:0"
    input: "example_model/Const:0"
    output: "example_model/TensorArrayV2Write/TensorListSetItem:0"
    name: "example_model/TensorArrayV2Write/TensorListSetItem"
    op_type: "TensorListSetItem"
    attribute {
      name: "element_dtype"
      i: 6
      type: INT
    }
    attribute {
      name: "resize_if_index_out_of_bounds"
      i: 0
      type: INT
    }
  }
  node {
    input: "example_model/TensorArrayV2Write/TensorListSetItem:0"
    input: "example_model/TensorArrayV2Write_1/TensorListSetItem/index:0"
    input: "example_model/Const_1:0"
    output: "example_model/TensorArrayV2Write_1/TensorListSetItem:0"
    name: "example_model/TensorArrayV2Write_1/TensorListSetItem"
    op_type: "TensorListSetItem"
    attribute {
      name: "element_dtype"
      i: 6
      type: INT
    }
    attribute {
      name: "resize_if_index_out_of_bounds"
      i: 0
      type: INT
    }
  }
  node {
    input: "example_model/TensorArrayV2Write_1/TensorListSetItem:0"
    input: "example_model/TensorArrayV2Write_2/TensorListSetItem/index:0"
    input: "example_model/Const_2:0"
    output: "example_model/TensorArrayV2Write_2/TensorListSetItem:0"
    name: "example_model/TensorArrayV2Write_2/TensorListSetItem"
    op_type: "TensorListSetItem"
    attribute {
      name: "element_dtype"
      i: 6
      type: INT
    }
    attribute {
      name: "resize_if_index_out_of_bounds"
      i: 0
      type: INT
    }
  }
  node {
    input: "example_model/TensorArrayV2Write_2/TensorListSetItem:0"
    input: "example_model/TensorListConcatV2/element_shape:0"
    input: "example_model/Const_3:0"
    output: "output_1"
    output: "example_model/TensorListConcatV2:1"
    name: "example_model/TensorListConcatV2"
    op_type: "TensorListConcatV2"
    attribute {
      name: "element_dtype"
      i: 6
      type: INT
    }
  }
  name: "tf2onnx"
  initializer {
    dims: 1
    data_type: 6
    name: "example_model/TensorListConcatV2/element_shape:0"
    raw_data: "\377\377\377\377"
  }
  initializer {
    data_type: 6
    name: "example_model/TensorArrayV2Write_2/TensorListSetItem/index:0"
    raw_data: "\002\000\000\000"
  }
  initializer {
    data_type: 6
    name: "example_model/TensorArrayV2Write_1/TensorListSetItem/index:0"
    raw_data: "\001\000\000\000"
  }
  initializer {
    data_type: 6
    name: "example_model/TensorArrayV2Write/TensorListSetItem/index:0"
    raw_data: "\000\000\000\000"
  }
  initializer {
    data_type: 6
    name: "example_model/TensorArrayV2/num_elements:0"
    raw_data: "\003\000\000\000"
  }
  initializer {
    data_type: 6
    name: "example_model/TensorArrayV2/element_shape:0"
    raw_data: "\377\377\377\377"
  }
  initializer {
    dims: 0
    data_type: 7
    name: "example_model/Const_3:0"
    raw_data: ""
  }
  initializer {
    dims: 2
    data_type: 6
    name: "example_model/Const_2:0"
    raw_data: "\005\000\000\000\006\000\000\000"
  }
  initializer {
    dims: 2
    data_type: 6
    name: "example_model/Const_1:0"
    raw_data: "\003\000\000\000\004\000\000\000"
  }
  initializer {
    dims: 2
    data_type: 6
    name: "example_model/Const:0"
    raw_data: "\001\000\000\000\002\000\000\000"
  }
  doc_string: "converted from example_model"
  input {
    name: "input_1"
    type {
      tensor_type {
        elem_type: 1
        shape {
          dim {
            dim_value: 1
          }
          dim {
            dim_value: 100
          }
          dim {
            dim_value: 100
          }
          dim {
            dim_value: 3
          }
        }
      }
    }
  }
  output {
    name: "output_1"
    type {
      tensor_type {
        elem_type: 6
        shape {
          dim {
            dim_param: "unk__6"
          }
        }
      }
    }
  }
}
opset_import {
  domain: ""
  version: 18
}
opset_import {
  domain: "ai.onnx.ml"
  version: 2
}

The model is invalid: No Op registered for TensorListReserve with domain_version of 18

==> Context: Bad node spec for node. Name: example_model/TensorArrayV2 OpType: TensorListReserve

fatcat-z commented 10 months ago

Curious why TensorListReserve exists in the final graph; I will take a look at it.

k-maheshkumar commented 10 months ago

Thanks, looking forward to the fix :)

fatcat-z commented 8 months ago

I think I had a different op than TensorListReserve in mind when I first looked at this issue, which is why I was surprised that it exists in the final graph.

The TensorListReserve and TensorListConcatV2 ops are indeed not supported, and that is expected. Both are TF-specific data-manipulation ops rather than generic ops, like MatMul and Abs, that also exist in other DL frameworks such as PyTorch and in ONNX.

I will remove them from the Support Status doc to avoid further confusion.
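
As a possible workaround in the meantime, the toy model can be rewritten to avoid tf.TensorArray altogether: collecting the tensors in a Python list and calling tf.concat produces a graph without TensorList* ops, which tf2onnx can convert. A sketch based on the example above (not a general replacement for every TensorArray use case):

import tensorflow as tf
import tf2onnx

class ExampleModelNoTensorArray(tf.keras.Model):
    """Same toy computation as ExampleModel, using a Python list + tf.concat."""

    def call(self, inputs, training=False):
        parts = [tf.constant([1, 2]), tf.constant([3, 4]), tf.constant([5, 6])]
        return tf.concat(parts, axis=0)  # emits ConcatV2 instead of TensorListConcatV2

model = ExampleModelNoTensorArray()
model_proto, _ = tf2onnx.convert.from_keras(
    model,
    (tf.TensorSpec((1, 100, 100, 3), tf.float32, name="input_1"),),
    opset=18,
    output_path="./concat_model_no_tensorarray.onnx",
)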