onnx / tensorflow-onnx

Convert TensorFlow, Keras, Tensorflow.js and Tflite models to ONNX
Apache License 2.0

Value "name: "__inference_pruned_1633" " is not valid attribute data type. #940

Closed rogordan closed 3 years ago

rogordan commented 4 years ago

My interpretation is that this model simply will not be supported by ONNX. Is there a way for me to verify this?

System

Windows 10, TF 2.1, tf2onnx 1.6

Task

I am trying to convert the following tfhub model to onnx: https://tfhub.dev/google/universal-sentence-encoder/4

Running the following command:

python -m tf2onnx.convert --saved-model .\universal-sentence-encoder_4\ --output model.onnx --opset 11 --verbose

Lots of warning/error output, including:

2020-05-28 15:37:57,974 - WARNING - tf2onnx.shape_inference: Shape of placeholder statefulpartitionedcall_args_145 is unknown, treated it as a scalar

But what crashes the process is:

ex=Value "name: "__inference_pruned_2009" " is not valid attribute data type.

Full Output: tf2onnx-output.txt

jignparm commented 4 years ago

...tfhub model to onnx... ...statefulpartitionedcall_args_145...

The tfhub saved_models are known to have some conversion difficulties, especially for Keras/TF2 models.

Will update this thread shortly.

rogordan commented 4 years ago

There are actually some older TF1 models on the same tfhub page that would also serve my purpose, but with those I ran into a different problem: there is no metagraph. I think I managed to add a metagraph by loading the model with TF1 and re-exporting it, but then there are some other issues. There was a failure that mentioned a regex op.
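A minimal sketch of that re-export step, assuming the TF1-style module at https://tfhub.dev/google/universal-sentence-encoder/2 and the tensorflow_hub package (the export path and tensor names are placeholders):

# Re-export the TF1 hub module as a SavedModel with a serving MetaGraph,
# so tf2onnx has a signature to start from.
import tensorflow as tf
import tensorflow_hub as hub

tf.compat.v1.disable_eager_execution()

with tf.compat.v1.Session(graph=tf.Graph()) as sess:
    text_input = tf.compat.v1.placeholder(tf.string, shape=[None], name="text")
    embed = hub.Module("https://tfhub.dev/google/universal-sentence-encoder/2")
    embeddings = embed(text_input)

    # hub.Module adds variables and lookup tables that need to be initialized.
    sess.run([tf.compat.v1.global_variables_initializer(),
              tf.compat.v1.tables_initializer()])

    # simple_save writes a SavedModel that contains a 'serving_default' MetaGraph.
    tf.compat.v1.saved_model.simple_save(
        sess, "./use2_saved_model",
        inputs={"text": text_input},
        outputs={"embeddings": embeddings})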

jignparm commented 4 years ago

There was a failure that mentioned a regex op.

If there's an operator (e.g. a regex operator) that's not implemented in ONNX, you can add a custom op (see the README) to convert the model. The main drawback is that the converted ONNX model will not run under a standard ONNX runtime: the op needs to be implemented in the runtime to load/run the model successfully.

Another option is to compose the operator out of elementary ONNX ops. However, 'regex' probably cannot be expressed with the ops in the latest opset.
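For illustration, here is a rough sketch of what such a custom-op mapping can look like (the canonical example is in the README; the TF op name and domain string below are assumptions):

# tf2onnx custom-op handler sketch: keep the unsupported TF op as-is, but
# place it in a custom domain so the exported model is still valid ONNX.
# An ONNX runtime would still need its own implementation of the op.
from tf2onnx.handler import tf_op

@tf_op("StaticRegexReplace")  # illustrative TF op name
class StaticRegexReplace:
    @classmethod
    def version_1(cls, ctx, node, **kwargs):
        # Pass the node through unchanged, tagged with a custom domain.
        node.domain = "ai.onnx.converters.tensorflow"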

tbaptista commented 4 years ago

Having the same issue trying to convert the Universal Sentence Encoder to ONNX.

Is there any update as to the underlying problem? Is it some operator not yet supported in ONNX, or a problem with the conversion?

guschmue commented 4 years ago

This comes from a StatefulPartitionedCall op. We ask grappler to inline those, but for some reason that doesn't work in this case (first model I've seen that has this). I think we'd need to inline them ourselves. I fear that even if we do this we'll have issues: as far as I know this model uses tf.text, which ONNX has no equivalent for.

attr {
  key: "f"
  value {
    func {
      name: "__inference_pruned_2009"
    }
  }
}
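One way to see what those nested functions actually contain is to dump the op types from the SavedModel's function library; the signature name below is an assumption, and tensorflow_text may need to be installed for the custom ops to load at all:

# Count the op types in the top-level graph and in the library functions
# (e.g. "__inference_pruned_2009") that the StatefulPartitionedCall nodes point at.
import collections
import tensorflow as tf

model = tf.saved_model.load("./universal-sentence-encoder_4")
fn = model.signatures["serving_default"]  # signature name is an assumption
graph_def = fn.graph.as_graph_def()

ops = collections.Counter(n.op for n in graph_def.node)
for func in graph_def.library.function:
    ops.update(n.op for n in func.node_def)

for op, count in ops.most_common():
    print(f"{count:4d}  {op}")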
rogordan commented 4 years ago

My guess is that the most problematic ops, beyond the one in this thread, will be the SentencePiece encoder that is built into the tfhub model. It may be possible to find the op immediately after the encoder and convert only that section.
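For what it's worth, tf2onnx's --inputs/--outputs flags can (as far as I know) be used to override the saved-model signature and cut the conversion down to a subgraph; the tensor names below are placeholders that would have to be read from the actual graph:

python -m tf2onnx.convert --saved-model .\universal-sentence-encoder_4\ --output partial.onnx --opset 11 --inputs encoder_output:0 --outputs module_output:0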

Regex would be one of the easiest things to reimplement, since the graph nodes seem to list the match and replace patterns.

For our case, we thought about transferring the weights to a PyTorch model, but decided to save the engineering effort of figuring out how to do that and instead find an alternative that plays better with ONNX.

guschmue commented 3 years ago

we fixed this some time ago.