huggingface / tflite-android-transformers

DistilBERT / GPT-2 for on-device inference thanks to TensorFlow Lite with Android demo apps
Apache License 2.0

Model generated by model_generation but not able to invoke #5

Open o20021106 opened 4 years ago

o20021106 commented 4 years ago

I tried to convert the TensorFlow model to TFLite using model_generation/distilbert.py.

I was able to convert and save the model without error, but allocate_tensors() failed with the Python API, and invoking the interpreter also failed with a RuntimeError:

RuntimeError: Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.Node number 0 (FlexShape) failed to prepare.

What should I do to fix this error?

Here's my Colab to reproduce the error: colab (tf-nightly-gpu==2.2.0.dev20200115)
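
A minimal sketch of the failing call with the Python API (the model path is hypothetical):

```python
import tensorflow as tf  # tf-nightly, per the Colab above

# Load the converted model (the path here is hypothetical).
interpreter = tf.lite.Interpreter(model_path="distilbert.tflite")

# This call raises the RuntimeError above: node 0 (FlexShape) is a
# regular TensorFlow op, which the plain TFLite interpreter cannot
# prepare without Flex support.
interpreter.allocate_tensors()
```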

fuzhenxin commented 4 years ago

@o20021106 Hello, I ran into the same problem. Have you solved it? Thanks!

fuzhenxin commented 4 years ago

My problem was solved by changing the supported ops in Python to converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
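
For anyone else hitting this, a sketch of where the flag goes during conversion. The tiny Keras model below is only a stand-in for whatever model_generation/distilbert.py actually builds:

```python
import tensorflow as tf

# Stand-in model for illustration; the real script builds DistilBERT.
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(4,))])

converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Allow ops with no TFLite builtin kernel (e.g. the FlexShape node in
# the error above) to fall back to the TensorFlow (Flex) kernels.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # prefer builtin TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to select TF ops
]

tflite_model = converter.convert()
with open("distilbert.tflite", "wb") as f:
    f.write(tflite_model)
```

Note that the converted model then contains Flex ops, so whatever runtime loads it must link Flex / select-TF-ops support, which is what the error's "apply/link the Flex delegate" hint is pointing at.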

bartekcensorpic commented 4 years ago

> My problem was solved by changing the supported ops in Python to converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]

Yup, helped. Thanks!

songdlut commented 2 years ago

> My problem was solved by changing the supported ops in Python to converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]

Can I continue to use this package: tensorflow-lite-with-select-tf-ops-0.0.0-nightly.aar?