Closed: Songyun-Tao closed this issue 3 years ago
Hi @Derricktao, sorry, but for this tflite conversion issue, please open an issue here: https://github.com/tensorflow/tensorflow/issues for a more appropriate answer.
@Derricktao Did you check with the TensorFlow team regarding this issue?
Closing this due to lack of activity. Feel free to reopen this thread if the issue still persists.
Hi, I tried to convert my PyTorch model to a deployable tflite model for my Coral Edge TPU. My environment is:

- PyTorch: 1.7.1
- ONNX: 1.8.0
- onnx_tf: 1.7.0
- TensorFlow: 2.3.0
- Python: 3.8.5
- Desktop: Ubuntu 20.04
- JupyterLab: 3.0.1
And my torch model architecture is:
I followed the instructions and successfully converted from torch to onnx and then to tensorflow. However, when it came to the tflite model, I ran into problems. The first method is:
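The original snippet isn't shown in the thread; a typical first attempt with the default converter looks roughly like this (the tiny stand-in model and all paths are placeholders for the real onnx_tf export):

```python
import tensorflow as tf

# Placeholder for the SavedModel that onnx_tf exports; a tiny model is
# saved here only so the sketch is self-contained and runnable.
class TinyModel(tf.Module):
    def __init__(self):
        self.w = tf.Variable(tf.ones([8, 4]))

    @tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

tf.saved_model.save(TinyModel(), "model_tf")

# Default conversion: builtin TFLite ops only, float32 weights.
converter = tf.lite.TFLiteConverter.from_saved_model("model_tf")
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```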
During the conversion, the page froze and eventually gave me an error:
Javascript Error: too much recursion
I also tried running a python script instead of a Jupyter notebook. The terminal output seems to indicate a successful conversion, but no tflite file is generated. The output is: When I use another method, the tflite model seems to convert successfully,
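The snippet for this second method isn't shown; given the Flex-delegate error that follows, it presumably enabled select TF ops, roughly like this (the stand-in model and paths are assumptions):

```python
import tensorflow as tf

# Tiny stand-in model; in the real workflow the SavedModel comes from
# the onnx_tf export.
class TinyModel(tf.Module):
    def __init__(self):
        self.w = tf.Variable(tf.ones([8, 4]))

    @tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

tf.saved_model.save(TinyModel(), "model_tf_select")

converter = tf.lite.TFLiteConverter.from_saved_model("model_tf_select")
# Allow fallback to full TensorFlow ops. The resulting model then needs
# the Flex delegate at runtime and cannot run on the Edge TPU.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()
with open("model_select.tflite", "wb") as f:
    f.write(tflite_model)
```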
But since this is not quantized, when I deployed the model on my Coral USB TPU, it stated:
ERROR: Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
Which makes sense, since I added tf.lite.OpsSet.SELECT_TF_OPS. And I used the edgetpu compiler, which gave me the same error. It seems I can only convert my tensorflow model to a tflite model successfully with the parameter tf.lite.OpsSet.SELECT_TF_OPS (otherwise I get the weird Javascript error above). I am kind of stuck between an unsupported, yet converted, tflite model and a weird error that does not even generate a tflite file. I also tried a definitely unreasonable combination:
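The exact flags aren't shown; a combination that yields uint8 I/O while still allowing select TF ops would look roughly like this (the stand-in model, calibration data, and all settings are assumptions):

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in for the real onnx_tf export, so the sketch runs end to end.
class TinyModel(tf.Module):
    def __init__(self):
        self.w = tf.Variable(tf.ones([8, 4]))

    @tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

tf.saved_model.save(TinyModel(), "model_tf_quant")

# Random calibration data stands in for real samples; the converter needs
# representative inputs to pick quantization ranges.
def representative_data():
    for _ in range(10):
        yield [np.random.rand(1, 8).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("model_tf_quant")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
# Keeping SELECT_TF_OPS here is what leaves non-builtin ops in the model,
# which the Edge TPU compiler then rejects.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()
with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)
```

For the Edge TPU, supported_ops would instead need to be [tf.lite.OpsSet.TFLITE_BUILTINS_INT8] alone, which only succeeds once every op in the graph has a quantized builtin equivalent.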
Which gave me a tflite model with unsupported ops and uint8 as I/O.
Has anyone met this problem and knows how to solve it? Thank you!