tensorflow / flutter-tflite

Apache License 2.0

Unable to create interpreter. #120 (Open)

Quincy515 commented 1 year ago

tflite_flutter: ^0.10.1

I trained a model using ssd_mobilenet_v1 (or ssd_mobilenet_v2) and converted it to a tflite model.

```dart
static Future<Interpreter> _loadModel() async {
  dev.log('Loading interpreter options...');
  final interpreterOptions = InterpreterOptions();

  // Use XNNPACK Delegate
  if (Platform.isAndroid) {
    interpreterOptions.addDelegate(XNNPackDelegate());
  }

  // Use Metal Delegate
  if (Platform.isIOS) {
    interpreterOptions.addDelegate(GpuDelegate());
  }

  dev.log('Loading interpreter...');
  return Interpreter.fromAsset(
    _modelPath,
    options: interpreterOptions..threads = 4,
  );
}
```

Running it fails with the following error:

```
I/tflite (17536): Initialized TensorFlow Lite runtime.
E/flutter (17536): [ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: Invalid argument(s): Unable to create interpreter.
E/flutter (17536): #0  checkArgument (package:quiver/check.dart:45:5)
E/flutter (17536): #1  new Interpreter._create (package:tflite_flutter/src/interpreter.dart:58:5)
E/flutter (17536): #2  new Interpreter.fromBuffer (package:tflite_flutter/src/interpreter.dart:109:37)
E/flutter (17536): #3  Interpreter.fromAsset (package:tflite_flutter/src/interpreter.dart:126:24)
E/flutter (17536): #4  Detector.start (package:object_detection_tflite_flutter/service/detector_service.dart:102:7)
I/Camera  (17536): startPreviewWithImageStream
```
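For context: this exception is thrown by `checkArgument` when the native TFLite runtime returns a null interpreter pointer, which usually means the model file itself was rejected before any Dart code ran. A quick host-side sanity check is to confirm the asset is actually a TFLite FlatBuffer — TFLite models carry the file identifier `TFL3` at byte offset 4. This is a minimal sketch (the `is_tflite_flatbuffer` helper is illustrative, not part of tflite_flutter):

```python
# Sanity-check that a .tflite asset is a TFLite FlatBuffer.
# FlatBuffer layout: bytes 0-3 hold the root-table offset,
# bytes 4-7 hold the file identifier, which is b"TFL3" for TFLite.

def is_tflite_flatbuffer(path: str) -> bool:
    """Return True if the file carries the TFLite file identifier."""
    with open(path, "rb") as f:
        header = f.read(8)
    return len(header) == 8 and header[4:8] == b"TFL3"

if __name__ == "__main__":
    import sys
    path = sys.argv[1] if len(sys.argv) > 1 else "assets/model.tflite"
    ok = is_tflite_flatbuffer(path)
    print(f"{path}: {'looks like a TFLite model' if ok else 'NOT a TFLite flatbuffer'}")
```

If the check fails, the export step produced something other than a TFLite flatbuffer (e.g. a frozen graph); if it passes, the failure is more likely an unsupported op or a delegate problem.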

PaulTR commented 1 year ago

Does your example work with the model that's included via the install script in one of our examples, or is it model specific?

Quincy515 commented 1 year ago

> Does your example work with the model that's included via the install script in one of our examples, or is it model specific?

The example's tflite model works, but the model I trained myself does not, and I don't know how to solve it.

PaulTR commented 1 year ago

How are you training the model that you're using? Is it with TensorFlow Lite Model Maker, or are you creating a TensorFlow model that you convert to lite, or are you using some other tool? I'm wondering if the model process is doing something funny with it.
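One common cause worth ruling out here: an exported detection graph can contain ops outside the TFLite builtin set, and such a model may convert successfully yet fail at interpreter creation. Restricting the converter to `TFLITE_BUILTINS` makes the conversion fail loudly instead, pointing at the offending ops. A hedged sketch using the standard `tf.lite.TFLiteConverter` API (paths are hypothetical; assumes a TF2 SavedModel export):

```python
def convert_saved_model(saved_model_dir: str, out_path: str) -> None:
    """Convert a SavedModel to TFLite using only builtin TFLite ops.

    If the graph contains ops TFLite cannot run, convert() raises an
    error naming them, instead of the app failing later at load time.
    Requires the tensorflow package; imported lazily so this module
    can be loaded without it.
    """
    import tensorflow as tf

    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    # Deliberately do NOT add tf.lite.OpsSet.SELECT_TF_OPS: models
    # converted with the TF-op fallback need the Flex delegate at
    # runtime, which a plain mobile interpreter may not bundle.
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
    tflite_model = converter.convert()
    with open(out_path, "wb") as f:
        f.write(tflite_model)
```

If conversion only succeeds with `SELECT_TF_OPS` enabled, that would explain why the model loads in a desktop test script but not in the Flutter app.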

vankhoa01 commented 12 months ago

Hi @PaulTR, thank you for your brilliant library.

I trained the model with TensorFlow, and it works well with the testing script; please see the video below: https://github.com/tensorflow/flutter-tflite/assets/8953129/4e174215-aa59-49c4-b8a5-b6cdd2b1ad81

After that, I converted the exported graph file into a TFLite model file with the script below:

Screenshot 2023-09-14 at 12 15 58 PM

Then I used the tflite model with the live_object_detection_ssd_mobilenet sample, and I got this issue:

Screenshot 2023-09-14 at 12 18 02 PM

Do you have any solution to fix it? Thank you so much!

BilalKashif commented 6 months ago


I was getting the same error, so I tried removing the XNNPACK delegate on Android, and the code works after that.

```dart
// Removed this piece of code
if (Platform.isAndroid) {
  options.addDelegate(XNNPackDelegate());
}
```

After removing this code, my model is loading perfectly.

wildsurfer commented 3 months ago

I'm facing exactly the same issue. My model is trained using this tutorial: https://colab.research.google.com/github/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/blob/master/Train_TFLite2_Object_Detction_Model.ipynb

andresgd7 commented 1 month ago

Did anyone find a fix?