azihsoyn / flutter_mlkit

A Flutter plugin to use the Firebase ML Kit.
MIT License

InceptionV3 not supported? #70

Open MichalMisiaszek opened 5 years ago

MichalMisiaszek commented 5 years ago

I uploaded a custom TensorFlow Lite model based on InceptionV3, and the plugin shows this error:

```
The model is INCOMPATIBLE. It may contain unrecognized custom ops, or not FlatBuffer format: java.lang.IllegalArgumentException: Internal error: Cannot create interpreter: Didn't find op for builtin opcode 'CONV_2D' version '2'
```

MichalMisiaszek commented 5 years ago

It was my mistake, the options were invalid. But after I fixed the options I now get this error:

```
E/ModelDownloadManager( 4245): The model is incompatible with TFLite and the app is not upgraded, do not download
```

The model is 100% TFLite; I just uploaded it and used it successfully on my computer. My options are:

```dart
FirebaseModelInputOutputOptions options = FirebaseModelInputOutputOptions([
  FirebaseModelIOOption(FirebaseModelDataType.BYTE, [1, 800, 800, 3])
], [
  FirebaseModelIOOption(FirebaseModelDataType.BYTE, [1, 8142])
]);
```

Maybe I am missing something?
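One thing worth checking with these options: `FirebaseModelDataType.BYTE` implies one byte per element, which only matches a quantized model; a float InceptionV3 would need `FLOAT32` and a buffer four times larger. A quick sanity-check sketch (the `buffer_size` helper is mine, not part of the plugin):

```python
from functools import reduce
from operator import mul

def buffer_size(shape, bytes_per_element=1):
    """Bytes the interpreter expects for a tensor of this shape and dtype."""
    return reduce(mul, shape, 1) * bytes_per_element

# BYTE input [1, 800, 800, 3] -> one byte per element
print(buffer_size([1, 800, 800, 3]))                       # 1920000
# BYTE output [1, 8142]
print(buffer_size([1, 8142]))                              # 8142
# If the model actually takes FLOAT32 input, each element is 4 bytes
print(buffer_size([1, 800, 800, 3], bytes_per_element=4))  # 7680000
```

If the byte buffer you pass in does not match the size the model's real input dtype implies, the interpreter rejects the model or the call.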

MichalMisiaszek commented 5 years ago

I downloaded the model back from Firebase to be sure it was not corrupted on the way; it works in a Python script on my Mac but not in the plugin.
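Since the error message also mentions "not FlatBuffer format", that part at least can be ruled out locally: a TensorFlow Lite FlatBuffer carries the 4-byte file identifier `TFL3` at offset 4. A hedged sketch (the helper name is mine):

```python
def looks_like_tflite(path):
    """Heuristic format check: FlatBuffer files store a 4-byte file
    identifier at offset 4; for TensorFlow Lite models it is b'TFL3'."""
    with open(path, "rb") as f:
        header = f.read(8)
    return len(header) == 8 and header[4:8] == b"TFL3"
```

If this returns True for the downloaded model, the failure is almost certainly the op-version mismatch (`CONV_2D` version 2) rather than a corrupted upload.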

azihsoyn commented 5 years ago

Thank you for reporting the detailed issue!

MichalMisiaszek commented 5 years ago

I am trying to dig into that but I don't see where this exception is coming from.

azihsoyn commented 5 years ago

I'll check this issue within a week. But it may take time.

MichalMisiaszek commented 5 years ago

Sad to hear it. Your plugin is the only one I found for custom models; I will try to find the issue myself and fork it in the worst case.

MichalMisiaszek commented 5 years ago

Did you run your example recently? Gradle fails for me without explanation.

MichalMisiaszek commented 5 years ago

I corrected another input/output format config and I am now getting:

```
E/CustomCompatChecker(20300): The model is INCOMPATIBLE. It may contain unrecognized custom ops, or not FlatBuffer format: java.lang.IllegalArgumentException: Internal error: Cannot create interpreter: Didn't find op for builtin opcode 'CONV_2D' version '2'
E/CustomCompatChecker(20300): Registration failed.
E/ModelResourceManager(20300): Error preloading model resource
E/ModelResourceManager(20300): com.google.firebase.ml.common.FirebaseMLException: Remote model load failed with the model options: Local model name: unspecified. Remote model name: detector.
E/ModelResourceManager(20300):  at com.google.android.gms.internal.firebase_ml.zzon.zza(Unknown Source:33)
E/ModelResourceManager(20300):  at com.google.android.gms.internal.firebase_ml.zzpr.zza(Unknown Source:109)
```

MichalMisiaszek commented 5 years ago

Hi, I dug deeper and it seems a lot of people have problems with TF models converted to TFLite and then used on Android. I have not seen a solution to the problem yet :(.

MichalMisiaszek commented 5 years ago

So it seems your example works fine; the issue is with more complex, optimized models. I will try a complex but non-optimized model and see what happens. I wish the exceptions were more informative.

azihsoyn commented 5 years ago

I'll resolve https://github.com/azihsoyn/flutter_mlkit/issues/54 in next release.

Thanks.

MichalMisiaszek commented 5 years ago

Do you know what the problem is?

MichalMisiaszek commented 5 years ago

Hi, did you manage to solve it? The same problem occurs when using the model directly on Android; it is supposed to be fixed by upgrading to the recently released newer TensorFlow Lite. https://github.com/tensorflow/tensorflow/issues/28163
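The linked TensorFlow issue suggests that `CONV_2D` version 2 is only emitted by newer converters, so the on-device runtime has to be at least as new as the converter that produced the model. A possible workaround is to force a newer runtime from the app's Gradle config; this is a sketch under the assumption that the plugin's Android module resolves TFLite transitively, and the version number is illustrative (check the issue thread for the release that actually added the op):

```groovy
// android/app/build.gradle — pin a newer TensorFlow Lite runtime than
// the one pulled in transitively by the Firebase ML dependencies.
// Whether this resolves the op-version mismatch depends on which
// converter produced the model.
dependencies {
    implementation 'org.tensorflow:tensorflow-lite:1.14.0'
}
```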