Open o20021106 opened 4 years ago
@o20021106 Hello, I met the same problem. Have you solved it? Thanks!
My problem was solved by changing `supported_ops` in Python to `converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]`
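For context, here is a minimal end-to-end sketch of that fix. It uses a toy Keras model as a stand-in for the real one (the model and file name are illustrative, not from the original report):

```python
import tensorflow as tf

# Toy stand-in model; substitute your actual model here.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Allow ops that have no TFLite builtin kernel (e.g. FlexShape) to fall
# back to TensorFlow kernels via the Flex delegate.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # regular TFLite builtin ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # selected TensorFlow (Flex) ops
]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

Without the `SELECT_TF_OPS` entry, conversion of a model containing unsupported ops fails (or the resulting model fails at runtime, as in the original error).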
> My problem was solved by changing `supported_ops` in Python to `converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]`

Yup, helped. Thanks!
> My problem was solved by changing `supported_ops` in Python to `converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]`

Can I continue to use this package? `tensorflow-lite-with-select-tf-ops-0.0.0-nightly.aar`
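For Android, an alternative to bundling the nightly AAR is pulling the Select TF ops runtime from Maven alongside the core TFLite dependency; adding it links the Flex delegate into the app so `SELECT_TF_OPS` models can run. A sketch of the Gradle wiring (the version number is illustrative):

```gradle
dependencies {
    // Core TFLite runtime.
    implementation 'org.tensorflow:tensorflow-lite:2.4.0'
    // Adds the Flex delegate so models converted with SELECT_TF_OPS can run.
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.4.0'
}
```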
I tried to convert a TensorFlow model to TFLite using `model_generation/distilbert.py`. I was able to convert and save the model without error, but `allocate_tensors()` with the Python API failed, and `invoke()` on the interpreter also failed with a RuntimeError:
RuntimeError: Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. Node number 0 (FlexShape) failed to prepare.
What should I do to fix this error?
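One way to see the fix in action: the full `tensorflow` pip package links the Flex delegate into `tf.lite.Interpreter`, so a model converted with `SELECT_TF_OPS` allocates and invokes cleanly there. A hedged sketch with a toy stand-in model (not the DistilBERT model from the report):

```python
import numpy as np
import tensorflow as tf

# Build and convert a tiny stand-in model with Flex fallback enabled.
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(3,))])
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()

# With the full tensorflow package, the interpreter already includes the
# Flex delegate, so allocate_tensors()/invoke() succeed even if the model
# contains Flex ops.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interpreter.invoke()
```

Note that the slimmer `tflite_runtime` package does not bundle the Flex delegate, which is one common source of this exact RuntimeError.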
Here's my colab to reproduce the error (tf-nightly-gpu==2.2.0.dev20200115).