Open anovis opened 3 years ago
@anovis Can you try downloading the flex binaries from here? Replace the library in android/app/src/main/jniLibs/arm64-v8a with libtensorflowlite_c_arm64_flex.so, and do the same in android/app/src/main/jniLibs/armeabi-v7a with the corresponding arm flex binary.
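In other words, something like the following (a sketch: the exact downloaded filenames depend on the flex release, and the assumption here is that the plugin loads the library under the name libtensorflowlite_c.so):

```shell
# Hypothetical filenames from the downloaded flex release; rename them to the
# name the plugin expects (libtensorflowlite_c.so) inside each ABI folder.
cp libtensorflowlite_c_arm64_flex.so android/app/src/main/jniLibs/arm64-v8a/libtensorflowlite_c.so
cp libtensorflowlite_c_arm_flex.so   android/app/src/main/jniLibs/armeabi-v7a/libtensorflowlite_c.so
```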
@am15h Thanks for the help. I replaced those binaries and got the same error. Also, for more context, I am running on an Android emulator on Ubuntu, Flutter channel 1.20.4, and tflite_flutter: ^0.5.0.
Please try running on a real device. Flex binaries are not available on emulators yet.
Thanks. I ran on my Android device and received:
```
I/tflite (31894): Initialized TensorFlow Lite runtime.
I/tflite (31894): Created TensorFlow Lite delegate for select TF ops.
I/tflite (31894): TfLiteFlexDelegate delegate: 7 nodes delegated out of 284 nodes with 3 partitions.
E/tflite (31894): Op type not registered 'SentencepieceOp' in binary running on localhost. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
E/tflite (31894): Delegate kernel was not initialized
E/tflite (31894): Node number 284 (TfLiteFlexDelegate) failed to prepare.
```
Looking at that error, it seems the tensorflow_text binaries are missing: https://github.com/tensorflow/hub/issues/463
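As a quick sanity check (a hypothetical stdlib-only helper, not part of any library): custom and Flex op names are embedded as plain strings in the .tflite flatbuffer, so a simple byte search can confirm whether the converted model references the Sentencepiece op at all — if it does, as the log above suggests, the missing piece is the runtime kernel rather than the conversion:

```python
def model_references_op(tflite_path: str, op_name: str) -> bool:
    """Return True if op_name occurs as a raw byte string in the .tflite file.

    This is a cheap heuristic, not a full flatbuffer parse: custom and Flex
    op names (e.g. 'FlexSentencepieceOp') are stored verbatim in the model
    binary, so their presence is detectable with a substring search.
    """
    with open(tflite_path, "rb") as f:
        return op_name.encode("utf-8") in f.read()
```

For example, `model_references_op("model.tflite", "Sentencepiece")` returning True would confirm the op made it into the model and the failure is on the runtime side.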
Can you try following these steps while converting to tflite? https://www.tensorflow.org/lite/guide/op_select_allowlist#tensorflow_text_and_sentencepiece_operators
I had originally built the model using:

```python
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable TensorFlow ops.
]
tflite_model = converter.convert()
```
The only difference I can see is import sentencepiece as spm, so I will go back and add that.
@anovis, were you able to get it working?
@anovis I have made a fork and modified it to add an interpreter option that enables the Flex delegate. It's a slightly hacky solution for the Android platform, based on the existing Flex JNI libs.
@cpoohee thanks! I will try that later this week and report back!
@anovis Have you solved your problem?
I've found out about this: https://www.tensorflow.org/lite/inference_with_metadata/lite_support#getting_started
I've tried using

implementation 'org.tensorflow:tensorflow-lite-support:0.0.0-nightly-SNAPSHOT'

which includes these sentencepiece custom ops: https://github.com/tensorflow/tflite-support/tree/master/tensorflow_lite_support/custom_ops/kernel/sentencepiece
but nothing happened.
@anovis Hi! Any updates on your issue? :)
> Please try running on a real device. Flex binaries are not available on emulators yet.
@am15h do we have flex binaries for x86?
I am able to make it work on phones (ARM) after replacing the .so files, but it is still not working on the emulator (x86).
Hello @am15h! Is it possible to have Flex binaries on the emulator now? If not, how can this problem be worked around?
Hello,
I tried to implement the autocomplete example from Google (https://github.com/tensorflow/examples/tree/master/lite/examples/generative_ai/android) in Flutter, but a tf_ops error pops up for me too during the debug session. There is no error in the native Android app from the Google examples.
Here is the brief conversion code:

```python
@tf.function
def generate(prompt, max_length):
    return gpt2_lm.generate(prompt, max_length)

concrete_func = generate.get_concrete_function(tf.TensorSpec([], tf.string), 100)

gpt2_lm.jit_compile = False
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func], gpt2_lm)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable TensorFlow ops.
]
converter.allow_custom_ops = True
converter.target_spec.experimental_select_user_tf_ops = ["UnsortedSegmentJoin", "UpperBound"]
converter._experimental_guarantee_all_funcs_one_use = True
generate_tflite = converter.convert()
```
I think there is no support for these experimental TF ops in the tflite_flutter plugin right now. Could you please help me?
Getting the error:

```
Regular TensorFlow ops are not supported by this interpreter.
```

Same as #63. I am also using a converted TensorFlow 2.0 text model, which required using SELECT_TF_OPS during conversion.
I came across https://www.tensorflow.org/lite/guide/ops_select#android_aar, which suggests adding a dependency to build.gradle. I tried that and got the same error. Wondering if anyone else ran into this, and whether there are suggestions for how I could proceed.
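For reference, the Android AAR approach from that guide amounts to a build.gradle dependency along these lines (version number is illustrative; note this affects the native Android build, and the tflite_flutter plugin may still load its own bundled libtensorflowlite_c.so, which would explain why adding the dependency alone changes nothing):

```groovy
dependencies {
    // Core TensorFlow Lite runtime.
    implementation 'org.tensorflow:tensorflow-lite:2.9.0'
    // Select TF ops (Flex delegate) runtime; large, so consider ABI splits.
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.9.0'
}
```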