zhaohb closed this issue 3 years ago.
https://github.com/jackwish/tflite/blob/master/tflite/BuiltinOperator.py has no entry for 127, why?
How do I find out which op type 127 is?
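For context, the numeric code comes from a flatbuffers-generated operator table like `BuiltinOperator.py`; a code that has no entry there usually indicates a custom (non-builtin) op. A minimal stdlib-only sketch of such a lookup, where the stub class and its `ADD`/`CONV_2D` values are illustrative placeholders for the generated table (only `CUSTOM = 32` matches the real TFLite schema):

```python
# Illustrative stand-in for the generated tflite/BuiltinOperator.py table.
# The real file has one integer attribute per builtin operator.
class BuiltinOperatorStub:
    ADD = 0
    CONV_2D = 3
    CUSTOM = 32  # the TFLite schema reserves this value for custom ops


def opcode_name(code, table=BuiltinOperatorStub):
    """Map a numeric operator code to its name, if it is a known builtin."""
    names = {v: k for k, v in vars(table).items() if isinstance(v, int)}
    return names.get(code, "UNKNOWN (not a builtin; likely a custom op)")


print(opcode_name(3))    # CONV_2D
print(opcode_name(127))  # falls through to the custom-op message
```

With the real generated table the pattern is the same: a code like 127 that is absent from the table cannot be named, which is why the converter reports it as unsupported.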
Hi @zhaohb, custom ops are not supported currently. Please check whether your model contains one. If it does, to work around it, you may need to replace the complicated TensorFlow operators with TFLite builtin operators, and then try again.
This error message is somewhat confusing; I am going to improve it in https://github.com/jackwish/tflite2onnx/pull/54.
@jackwish thank you very much, but I still have a question. Does tflite2onnx support only one graph? It looks to me like the code logic could support multiple graphs at the same time:
...
graph_count = self.model.SubgraphsLength()
tflg = self.model.Subgraphs(0)
tflg_2 = self.model.Subgraphs(1)
graph = Graph(self.model, tflg)
graph_2 = Graph(self.model, tflg_2)
self.graphes.append(graph)
self.graphes.append(graph_2)
Will there be any other problems with this modification?
@zhaohb that's a good question.
We don't support multiple graphs currently.
If there are TFLite models that contain 2+ graphs, we can easily extend tflite2onnx to support that: convert one graph per invocation, with an interface to specify which graph should be converted.
And, of course, we could also generate all the graphs at one time. If that is the case, we will introduce another level to wrap things up, i.e. `Model` rather than `Graph`.
@jackwish But now I have a TFLite model that contains two graphs: one graph's name is main, the other is NoOp. I don't know whether it has anything to do with adding:
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
tf.lite.OpsSet.SELECT_TF_OPS]
converter.allow_custom_ops = True
Another question: I now want to test the generated TFLite model using tf.lite.Interpreter, but I am running into the following problem:
RuntimeError: Encountered unresolved custom op: HashTableV2.Node number 1 (HashTableV2) failed to prepare.
This TFLite model was generated by adding the following code:
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
tf.lite.OpsSet.SELECT_TF_OPS]
converter.allow_custom_ops = True
What should I do?
> another is Noop
Can you help clarify what Noop means here? And regarding "contains two graphs", did you debug into tflite2onnx to get that info?
For the TFLite runtime error, I don't know what your model looks like, but it would be better to ask in the TFLite repo or forum, or just search around (TFLite has been widely used; someone must have hit a similar issue).
Closing this as it's an issue in the TFLite model. The error message will be improved in #54.
Describe the bug: when I convert a TF model to TFLite and use a TensorFlow op with
and I managed to get the TFLite file, but when converting the TFLite model to ONNX, I get the error:
Since I think we do not need the concrete implementation of the op in the process of converting to an ONNX file, is there a way that I can get the ONNX file normally?
Thank you very much.