zhenhuaw-me / tflite2onnx

Convert TensorFlow Lite models (*.tflite) to ONNX.
https://zhenhuaw.me/tflite2onnx
Apache License 2.0

tflite convert to onnx error: Unsupported TFLite OP: 127 #51

Closed zhaohb closed 3 years ago

zhaohb commented 3 years ago

Describe the bug
When I convert a TF model to TFLite with TensorFlow select ops enabled:

   converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                          tf.lite.OpsSet.SELECT_TF_OPS]
   converter.allow_custom_ops = True

I managed to get the tflite file, but when converting it to ONNX I get the error:

NotImplementedError: Unsupported TFLite OP: 127

Since I think the concrete implementation of an op is not needed when converting to an ONNX file, is there a way to get the ONNX file anyway?

Thank you very much.

zhaohb commented 3 years ago

https://github.com/jackwish/tflite/blob/master/tflite/BuiltinOperator.py has no entry for 127. Why?
How do I find out which op type 127 is?
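For reference, one way to map a numeric opcode to a name is a reverse lookup over the attributes of the generated BuiltinOperator class, since it exposes opcodes as plain integer class attributes. A minimal sketch, using a stand-in class with a few real schema values (ADD = 0, CONV_2D = 3, CUSTOM = 32) rather than importing the actual tflite package:

```python
# Stand-in for tflite.BuiltinOperator from the generated schema bindings;
# the real class likewise exposes opcodes as integer class attributes.
class BuiltinOperator:
    ADD = 0       # real schema value
    CONV_2D = 3   # real schema value
    CUSTOM = 32   # real schema value

def opcode_name(code, enum_cls=BuiltinOperator):
    # Build a reverse map {value: name} over the public class attributes.
    names = {v: k for k, v in vars(enum_cls).items() if not k.startswith("_")}
    return names.get(code, "UNKNOWN (not in this schema version)")

print(opcode_name(3))    # CONV_2D
print(opcode_name(127))  # UNKNOWN - the bundled schema predates this opcode
```

An opcode that resolves to UNKNOWN usually means the model was produced with a newer schema than the bundled bindings, or that the op is not a builtin at all.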

zhenhuaw-me commented 3 years ago

Hi @zhaohb, custom ops are not supported currently. Please check whether your model contains any. If it does, as a workaround you may need to replace the complicated TensorFlow operators with TFLite builtin operators and then try again.
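One way to check early is to drop SELECT_TF_OPS and allow_custom_ops, so that the TFLite conversion itself fails on any operator that has no builtin equivalent. A hedged converter-configuration sketch, assuming a TF2 SavedModel at saved_model_dir (a hypothetical path):

```python
import tensorflow as tf

# Assumed: a SavedModel exported at `saved_model_dir` (hypothetical path).
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")

# Builtins only: no flex (SELECT_TF_OPS) fallback and no custom ops, so
# convert() raises if any op cannot be lowered to a TFLite builtin.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
converter.allow_custom_ops = False

tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

If this conversion succeeds, the resulting model should contain only builtin operators, which is what tflite2onnx expects.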

zhenhuaw-me commented 3 years ago

This error message is somewhat confusing; I am going to improve it in https://github.com/jackwish/tflite2onnx/pull/54.

zhaohb commented 3 years ago

@jackwish Thank you very much, but I still have a question: does tflite2onnx support only one graph? It looks to me like the code logic could support multiple graphs at the same time:

        ...
        graph_count = self.model.SubgraphsLength()
        tflg = self.model.Subgraphs(0)
        tflg_2 = self.model.Subgraphs(1)
        graph = Graph(self.model, tflg)
        graph_2 = Graph(self.model, tflg_2)
        self.graphes.append(graph)
        self.graphes.append(graph_2)

Will there be any other problems with this modification?

zhenhuaw-me commented 3 years ago

@zhaohb that's a good question.

We don't support multiple graphs currently. If TFLite models containing two or more graphs do show up, there are two ways to extend tflite2onnx:

We can easily extend it to convert one graph per invocation, with an interface to specify which graph should be converted.

And, of course, we could generate all the graphs at once. In that case we would introduce another level that wraps things up in a Model rather than a Graph.
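The per-graph interface described above could look roughly like the following. All names here (FakeModel, convert_graph) are hypothetical stand-ins for illustration; only SubgraphsLength()/Subgraphs() mirror the generated flatbuffer API shown in the snippet earlier in the thread:

```python
class FakeModel:
    """Stand-in for the parsed TFLite flatbuffer model object."""
    def __init__(self, subgraphs):
        self._subgraphs = subgraphs

    def SubgraphsLength(self):
        return len(self._subgraphs)

    def Subgraphs(self, i):
        return self._subgraphs[i]

def convert_graph(model, graph_index=0):
    """Convert a single subgraph, selected by index (default: the main graph)."""
    count = model.SubgraphsLength()
    if not 0 <= graph_index < count:
        raise IndexError(f"model has {count} subgraph(s), asked for {graph_index}")
    tflg = model.Subgraphs(graph_index)
    # ... the real converter would build Graph(model, tflg) and convert it here ...
    return tflg

model = FakeModel(["main", "Noop"])
print(convert_graph(model))      # main
print(convert_graph(model, 1))   # Noop
```

Defaulting to subgraph 0 keeps the common single-graph case unchanged while leaving room for the multi-graph extension.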

zhaohb commented 3 years ago

@jackwish But now I have a TFLite model that contains two graphs: one graph is named main, the other Noop. I don't know if it has anything to do with adding:

   converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                          tf.lite.OpsSet.SELECT_TF_OPS]
   converter.allow_custom_ops = True

Another question: I want to test the generated TFLite model using tf.lite.Interpreter, but I encounter the following problem:

RuntimeError: Encountered unresolved custom op: HashTableV2.Node number 1 (HashTableV2) failed to prepare.

This TFLite model is generated by adding the following code:

   converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                          tf.lite.OpsSet.SELECT_TF_OPS]
   converter.allow_custom_ops = True

What should I do?

zhenhuaw-me commented 3 years ago

another is Noop

Can you clarify what Noop means here? Regarding the model containing two graphs: did you debug into tflite2onnx to get that information?

For the TFLite runtime error, I don't know what your model looks like, but it would be better to ask in the TFLite repo or forum, or just search around (TFLite has been widely used; someone must have hit a similar issue).

zhenhuaw-me commented 3 years ago

Closing this as it's an issue in the TFLite model. The error message will be improved in #54.