vishnukvmd opened this issue 1 year ago
Adding your PR https://github.com/tensorflow/flutter-tflite/pull/42 here for tracking :) I almost responded this morning mentioning it while going over this weekend's tasks lol.
Thanks again for putting that together!
I have done some tests with this. First, for Android, include this dependency: 'org.tensorflow:tensorflow-lite-select-tf-ops:2.12.0'. For iOS, include pod 'TensorFlowLiteSelectTfOps'. Then the binding exposes a method that allows creating the interpreter with selected ops: TfLiteInterpreterCreateWithSelectedOps.
Still, I have not been able to load the models that I have tested; depending on the model, it gives errors of this type: Didn't find op for builtin opcode 'ADD' version '1'. The other issue is that these dependencies add a lot of size to the final application. It would be better not to include them in the package, since not everyone is going to use this feature, and instead either create an independent package just to hold these dependencies or document in the README how to include them in the application. The interpreter would also have to be modified so that it calls the TfLiteInterpreterCreateWithSelectedOps binding when necessary. If I manage to solve the problem with the operator versions, I will make a PR.
hey I am getting this error:
Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
I think this is related to this issue. Do you have any idea how we can resolve this? Thanks in advance.
I am also trying to enable flex delegates because my model requires select tensorflow ops and have so far been unable to do so. With tflite_flutter, you can specify a delegate:
```dart
final interpreter = await Interpreter.fromAsset(
  assetname,
  options: InterpreterOptions()..addDelegate(delegate));
```
But tflite_flutter doesn't expose a way to instantiate the Flex delegate, probably because it's included in a separate dependency. Regardless, the documentation from TensorFlow suggests that the Flex delegate will be automatically installed on the interpreter as long as the shared library is linked, and that you don't need to include the Flex delegate in the interpreter options. But adding implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.12.0'
to my android/app/build.gradle doesn't appear to change anything. I still get the "make sure to apply/link the flex delegate before inference" message.
If anyone has a code example of running the interpreter with a flex delegate installed, I'd love to see it. In the meantime, the only alternative I can think of is to stop using tflite_flutter and write some platform-specific code for interacting with tflite behind a platform channel.
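For comparison, here is how a delegate that tflite_flutter does expose (GpuDelegateV2) gets attached; a Flex delegate wrapper, if one were added to the package, would presumably follow the same pattern. This is an untested sketch, and loadWithGpuDelegate is just an illustrative name:

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

Future<Interpreter> loadWithGpuDelegate(String assetName) async {
  // GpuDelegateV2 is exposed by tflite_flutter; a Flex delegate wrapper
  // (hypothetical — not currently in the package) would be added the same way.
  final options = InterpreterOptions()..addDelegate(GpuDelegateV2());
  return Interpreter.fromAsset(assetName, options: options);
}
```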
@ddeklotz I am experiencing the same problem. Were you able to resolve this issue yet?
Could it be that the TFLite dynamic library used in the plugin doesn't have flex support?
Something like this tensorflowlite-flex.so
built from the bazel command in this comment -> https://github.com/tensorflow/tensorflow/issues/57822#issuecomment-1257127667
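For reference, the build in that linked comment is roughly of this shape (a sketch: it must be run from a TensorFlow source checkout, and the exact flags and target name should be verified against the linked comment and your target ABI):

```
bazel build -c opt --config=monolithic \
  //tensorflow/lite/delegates/flex:tensorflowlite_flex
```

The resulting shared library would then need to be bundled so the plugin's interpreter can link against it.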
Hey @ddeklotz @parnurzeal, I'm facing the same problem. Did you manage to solve it? Thanks.
@Elienvalleau @ddeklotz @parnurzeal, I am facing this problem as well. Are there any updates on this?
Hi @luiscib3r, did you get a chance to advance on this since May? How can we help?
Negative @flutter-painter. I tried a few things but couldn't get it to work.
Thanks for your quick reply. This is very limiting. Two other approaches I see for ML in Flutter are:
A while ago I managed to get the Flex delegate working for Android. See my fork; it includes a description of how I did it. My use case was to cover string inputs and outputs. Also, I am not confident that I can replicate the results with the latest versions. If I remember correctly, what I did was to emulate Java's implementation.
As for iOS, I encountered unreliable library linking that required some non-standard editing of files.
Here is what I could recover from my old notes from August 2021:
Since we are using our own git tflite_flutter projects, we need to put the iOS binaries into the git pub-cache folders instead of the official tflite_flutter folder.
- Run pub get for the project to fetch the relevant dependencies.
- Look for your .pub-cache folder.
- Place both TensorFlowLiteSelectTfOps.framework and TensorFlowLiteC.framework into e.g. ".pub-cache/git/tflite_flutter_plugin-5f521f724554b45d88cb630c9da842c079a249b0".
- As a workaround, edit the Pods-Runner.debug.xcconfig file (or the release variant) at ios/Pods/Target Support Files/Pods-Runner/.
- Under OTHER_LDFLAGS =, add -framework "TensorFlowLiteSelectTfOps".
- Under OTHER_LDFLAGS, delete -framework "TensorFlowLiteC".
- Unfortunately this uses the fat version of the tflite API; my custom-compiled binaries don't work.
- There is a limitation to the tflite binaries: the x86_64 framework doesn't come with armv7 and arm64. Use the appropriate version when deploying.
- See the files TensorFlowLiteSelectTfOps_framework_fat_x86_64.framework and TensorFlowLiteSelectTfOps_framework_fat_armv7_arm64.framework.
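The xcconfig edit described in those notes would look roughly like this (a sketch only — the generated file's actual contents vary per project, and CocoaPods will overwrite manual edits on the next pod install):

```
// ios/Pods/Target Support Files/Pods-Runner/Pods-Runner.debug.xcconfig
// Sketch: replace the TensorFlowLiteC linker flag with the SelectTfOps one.
OTHER_LDFLAGS = $(inherited) -framework "TensorFlowLiteSelectTfOps"
```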
I probably do not have the capacity or hardware to retry the process (I was using an Intel MacBook, but it broke down). I hope what I did can help the community in some way.
I think we should add the Flex delegate to this wonderful library. If any of the main contributors is willing to point me in the right direction, I could take a shot at it. This feature must be implemented at some point.
I'm having the same challenge here
Is there a solution available for this issue at the moment?
On TFLite's Android plugin, Flex delegates can be enabled by adding
implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.12.0'
to the dependencies within android/build.gradle, and by attaching the delegate to the interpreter's options. How can we accomplish the same in Flutter?
Thanks in advance!
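For reference, the native Android pattern referred to above looks roughly like this. This is an untested sketch: FlexDelegate comes from the select-tf-ops dependency, and modelBuffer stands in for however you load your model bytes:

```java
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.flex.FlexDelegate;

// Sketch: attach the Flex delegate explicitly on Android. Per the TensorFlow
// docs, simply linking the select-tf-ops AAR is usually enough and the
// delegate is applied automatically; explicit attachment looks like this.
FlexDelegate flexDelegate = new FlexDelegate();
Interpreter.Options options = new Interpreter.Options().addDelegate(flexDelegate);
Interpreter interpreter = new Interpreter(modelBuffer, options);
```

The open question in this thread is exactly this last step: tflite_flutter has no wrapper class playing the role of FlexDelegate, so there is nothing to pass to InterpreterOptions.addDelegate from Dart.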