am15h / tflite_flutter_plugin

TensorFlow Lite Flutter Plugin
https://pub.dev/packages/tflite_flutter
Apache License 2.0

Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference #101

Open anovis opened 3 years ago

anovis commented 3 years ago

Getting the error Regular TensorFlow ops are not supported by this interpreter. Same as #63

I/tflite  ( 7126): Initialized TensorFlow Lite runtime.
E/tflite  ( 7126): Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
E/tflite  ( 7126): Node number 0 (FlexSentencepieceOp) failed to prepare.
E/flutter ( 7126): [ERROR:flutter/lib/ui/ui_dart_state.cc(166)] Unhandled Exception: Bad state: failed precondition

I am also using a converted TensorFlow 2.0 text model, which required using SELECT_TF_OPS during conversion:

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
  tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
tflite_model = converter.convert()

I came across https://www.tensorflow.org/lite/guide/ops_select#android_aar, which suggests adding a dependency to build.gradle. I tried that and got the same error.

dependencies {
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT'
    // This dependency adds the necessary TF op support.
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly-SNAPSHOT'
}

Wondering if anyone else has run into this, and any suggestions for how I could proceed.

am15h commented 3 years ago

@anovis Can you try downloading the flex binaries from here.

  1. Replace libtensorflowlite_c.so in android/app/src/main/jniLibs/arm64-v8a with libtensorflowlite_c_arm64_flex.so.
  2. Now rename libtensorflowlite_c_arm64_flex.so to libtensorflowlite_c.so
  3. Repeat the same steps with libtensorflowlite_c_arm_flex.so for android/app/src/main/jniLibs/armeabi-v7a.
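
The replacement steps above can be sketched as a small script. This is only an illustration: the paths follow the steps listed, and the helper name install_flex_binary is made up here; it assumes the flex .so files have already been downloaded next to the script.

```python
import shutil
from pathlib import Path

def install_flex_binary(flex_so, jni_abi_dir):
    """Copy a downloaded flex build of the TFLite C library into the given
    jniLibs ABI directory, renamed to libtensorflowlite_c.so so the plugin
    loads it in place of the stock binary."""
    dest = Path(jni_abi_dir) / "libtensorflowlite_c.so"
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copyfile(flex_so, dest)  # overwrites any existing stock .so
    return dest

# Illustrative usage; runs only if the downloaded binaries are present.
for flex_name, abi in [
    ("libtensorflowlite_c_arm64_flex.so", "arm64-v8a"),
    ("libtensorflowlite_c_arm_flex.so", "armeabi-v7a"),
]:
    if Path(flex_name).exists():
        install_flex_binary(flex_name, f"android/app/src/main/jniLibs/{abi}")
```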
anovis commented 3 years ago

@am15h Thanks for the help. I replaced those binaries and got the same error. For more context, I am running on an Android emulator on Ubuntu, Flutter channel 1.20.4, and tflite_flutter: ^0.5.0.

am15h commented 3 years ago

Please try running on a real device. Flex binaries are not available for emulators yet.


anovis commented 3 years ago

Thanks. So I ran it on my Android device and received:

I/tflite  (31894): Initialized TensorFlow Lite runtime.
I/tflite  (31894): Created TensorFlow Lite delegate for select TF ops.
I/tflite  (31894): TfLiteFlexDelegate delegate: 7 nodes delegated out of 284 nodes with 3 partitions.
E/tflite  (31894): Op type not registered 'SentencepieceOp' in binary running on localhost. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
E/tflite  (31894): Delegate kernel was not initialized
E/tflite  (31894): Node number 284 (TfLiteFlexDelegate) failed to prepare.
anovis commented 3 years ago

Looking at that error, it seems the tensorflow_text binaries are missing: https://github.com/tensorflow/hub/issues/463

am15h commented 3 years ago

Can you try following these steps while converting to tflite? https://www.tensorflow.org/lite/guide/op_select_allowlist#tensorflow_text_and_sentencepiece_operators

anovis commented 3 years ago

I had originally built the model using

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
  tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
tflite_model = converter.convert()

The only difference I can see is import sentencepiece as spm, so I will go back and add that
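
For reference, the approach in the allowlist guide can be sketched as below. The key step is importing tensorflow_text before conversion, so the SentencePiece ops are registered with the TensorFlow runtime the converter uses. This is a sketch, not a verified recipe; it assumes tensorflow and tensorflow_text are installed, and saved_model_dir is a placeholder.

```python
import tensorflow as tf
import tensorflow_text  # noqa: F401 -- registers TF Text / SentencePiece ops

saved_model_dir = "path/to/saved_model"  # placeholder

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable select TensorFlow ops.
]
tflite_model = converter.convert()
```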

am15h commented 3 years ago

@anovis, were you able to get it working?

cpoohee commented 3 years ago

@anovis I have made a fork and modified it to add an interpreter option for the flex delegate. It's a slightly 'hacky' solution for the Android platform, based on the existing flex JNI libs.

anovis commented 3 years ago

@cpoohee thanks! I will try that later this week and report back!

Extremesarova commented 3 years ago

@anovis Have you solved your problem? I found this: https://www.tensorflow.org/lite/inference_with_metadata/lite_support#getting_started. I tried using implementation 'org.tensorflow:tensorflow-lite-support:0.0.0-nightly-SNAPSHOT', which includes https://github.com/tensorflow/tflite-support/tree/master/tensorflow_lite_support/custom_ops/kernel/sentencepiece, but nothing happened.

Extremesarova commented 3 years ago

@anovis Hi! Any updates on your issue? :)

clive107 commented 2 years ago

Please try running on a real device. Flex binaries are not available for emulators yet.

@am15h Do we have flex binaries for x86? I am able to make it work on phones (ARM) after replacing the .so files, but it is still not working on the emulator (x86).

AntoineChauviere commented 2 years ago

Hello @am15h! Are Flex binaries available for emulators now? If not, how can this problem be worked around?

galaturka commented 1 year ago

Hello,

I tried to implement the autocomplete example from Google (https://github.com/tensorflow/examples/tree/master/lite/examples/generative_ai/android) in Flutter, but the tf_ops error pops up for me too during the debug session. There is no error in the native Android app from the Google examples.

Here is the brief conversion code:

@tf.function
def generate(prompt, max_length):
  return gpt2_lm.generate(prompt, max_length)

concrete_func = generate.get_concrete_function(tf.TensorSpec([], tf.string), 100)

gpt2_lm.jit_compile = False
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func], gpt2_lm)
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
  tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
converter.allow_custom_ops = True
converter.target_spec.experimental_select_user_tf_ops = ["UnsortedSegmentJoin", "UpperBound"]
converter._experimental_guarantee_all_funcs_one_use = True
generate_tflite = converter.convert()

I think there is no support for these experimental TF ops in the tflite_flutter plugin right now. Could you please help?