am15h / object_detection_flutter

Truly realtime object-detection in flutter

Object detection Example with float32 model #18

Open funwithflutter opened 3 years ago

funwithflutter commented 3 years ago

Sorry, I don't know what to label this issue as. I think it's more likely an error on my side than something wrong with the package. Any help will be appreciated (I'm new to TensorFlow in general). Also, thanks for the amazing package!

I'm trying to use this model: https://tfhub.dev/intel/lite-model/midas/v2_1_small/1/lite/1 It computes depth from an image.

As far as I can see I'm doing all the necessary steps: I copied the code from your image classification example and also double-checked against the Android example provided in the link above (and as far as I can tell I'm doing the same steps).

I'm using the tflite flutter helper package.

I'm getting a failed precondition in Quiver at the following point (when I call interpreter.run):

checkState(tfLiteTensorCopyFromBuffer(_tensor, ptr.cast(), bytes.length) ==
        TfLiteStatus.ok);
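
For context: Tensor.setTo copies the raw bytes of the provided buffer into the native input tensor, so this check fails whenever the buffer's byte length differs from the tensor's byte size. A minimal sketch of the arithmetic for the 1x256x256x3 input reported further down (illustrative only, not plugin code):

void main() {
  // Tensor.setTo copies raw bytes, so the source buffer's length must equal
  // the input tensor's byte size; otherwise tfLiteTensorCopyFromBuffer
  // reports an error and checkState throws "Bad state: failed precondition".
  const inputShape = [1, 256, 256, 3];
  final elements = inputShape.reduce((a, b) => a * b); // 196608 values

  print('float32 tensor expects ${elements * 4} bytes'); // 786432
  print('uint8 image buffer supplies ${elements * 1} bytes'); // 196608
}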

Stacktrace:

flutter: #0      checkState
package:quiver/check.dart:73
#1      Tensor.setTo
package:tflite_flutter/src/tensor.dart:150
#2      Interpreter.runForMultipleInputs
package:tflite_flutter/src/interpreter.dart:194
#3      Interpreter.run
package:tflite_flutter/src/interpreter.dart:165
#4      Classifier.predict
package:tensorflow_poc/classifier.dart:113
#5      _MyHomePageState._predict
package:tensorflow_poc/main.dart:69
#6      _MyHomePageState.getImage.<anonymous closure>
package:tensorflow_poc/main.dart:63
#7      State.setState
package:flutter/…/widgets/framework.dart:1267
#8      _MyHomePageState.getImage
package:tensorflow_poc/main.dart:57
<asynchronous suspension>

Something that also has me confused is that interpreter.getInputTensor(0).type returns TfLiteType.float32, but I expected this to be uint8 from the model description.
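
One way to double-check what the interpreter actually expects, independent of the model card, is to dump every tensor's declared shape and type. A minimal sketch, assuming the plugin's Interpreter.getInputTensors()/getOutputTensors() accessors and the same shape/type getters used elsewhere in this thread:

import 'package:tflite_flutter/tflite_flutter.dart';

// Illustrative helper: print what the loaded .tflite file itself declares,
// rather than relying on the description on TF Hub.
void dumpTensorInfo(Interpreter interpreter) {
  for (final t in interpreter.getInputTensors()) {
    print('input:  shape=${t.shape} type=${t.type}');
  }
  for (final t in interpreter.getOutputTensors()) {
    print('output: shape=${t.shape} type=${t.type}');
  }
}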

Below is my classifier class (I'm using this classifier in the Image Classification example from this package):

import 'dart:math';

import 'package:image/image.dart';
import 'package:collection/collection.dart';
import 'package:logger/logger.dart';
import 'package:tflite_flutter/tflite_flutter.dart';
import 'package:tflite_flutter_helper/tflite_flutter_helper.dart';

abstract class Classifier {
  Interpreter interpreter;
  InterpreterOptions _interpreterOptions;

  var logger = Logger();

  List<int> _inputShape;
  List<int> _outputShape;

  TensorImage _inputImage;
  TensorBuffer _outputBuffer;

  TfLiteType _outputType;

  // The members below are referenced later in this snippet but were not
  // declared in the code as posted; they come from the image classification
  // example this class was copied from, and are declared here so the
  // snippet is self-contained.
  TensorProcessor _probabilityProcessor;
  List<String> labels;
  String _labelsFileName;
  int _labelsLength;

  String get modelName;

  NormalizeOp get preProcessNormalizeOp;

  // Not overridden by the implementation class below; identity normalization
  // so the snippet compiles.
  NormalizeOp get postProcessNormalizeOp => NormalizeOp(0, 1);

  Classifier({int numThreads}) {
    _interpreterOptions = InterpreterOptions();

    if (numThreads != null) {
      _interpreterOptions.threads = numThreads;
    }

    loadModel();
  }

  Future<void> loadModel() async {
    try {
      interpreter =
          await Interpreter.fromAsset(modelName, options: _interpreterOptions);
      print('Interpreter Created Successfully');
      _inputShape = interpreter.getInputTensor(0).shape; // [1, 256, 256, 3]
      _outputShape = interpreter.getOutputTensor(0).shape; // [1, 256, 256]
      _outputType = interpreter.getOutputTensor(0).type; // TfLiteType.float32
      print('_inputShape[0]: ${_inputShape[0]}');
      print('_inputShape[1]: ${_inputShape[1]}');
      print('_inputShape[2]: ${_inputShape[2]}');
      print('_inputShape[3]: ${_inputShape[3]}');
      print('_outputShape[0]: ${_outputShape[0]}');
      print('_outputShape[1]: ${_outputShape[1]}');
      print('_outputShape[2]: ${_outputShape[2]}');
      print('_outputType: $_outputType');
      print(
          '_intputType: ${interpreter.getInputTensor(0).type}'); // TfLiteType.float32, but expected this to be uint8
      _outputBuffer = TensorBuffer.createFixedSize(_outputShape, _outputType);
      _probabilityProcessor =
          TensorProcessorBuilder().add(postProcessNormalizeOp).build();
    } catch (e) {
      print('Unable to create interpreter, Caught Exception: ${e.toString()}');
    }
  }

  Future<void> loadLabels() async {
    labels = await FileUtil.loadLabels(_labelsFileName);
    if (labels.length == _labelsLength) {
      print('Labels loaded successfully');
    } else {
      print('Unable to load labels');
    }
  }

  TensorImage _preProcess() {
    int cropSize = min(_inputImage.height, _inputImage.width);
    return ImageProcessorBuilder()
        .add(ResizeWithCropOrPadOp(cropSize, cropSize))
        .add(ResizeOp(
            _inputShape[1], _inputShape[2], ResizeMethod.NEAREST_NEIGHBOUR))
        .add(preProcessNormalizeOp)
        .build()
        .process(_inputImage);
  }

  void predict(Image image) {
    try {
      if (interpreter == null) {
        throw StateError('Cannot run inference, Interpreter is null');
      }
      final pres = DateTime.now().millisecondsSinceEpoch;
      _inputImage = TensorImage.fromImage(image);
      print('input image data type: ${_inputImage.dataType}');
      _inputImage = _preProcess();
      print('input image width: ${_inputImage.width}');
      print('input image height: ${_inputImage.height}');
      print('input image data type: ${_inputImage.dataType}');
      final pre = DateTime.now().millisecondsSinceEpoch - pres;
      print('Time to load image: $pre ms');
      print('input buffer: ${_inputImage.buffer}');
      print('output buffer: ${_outputBuffer.getBuffer()}');
      final runs = DateTime.now().millisecondsSinceEpoch;

      interpreter.run(_inputImage.buffer, _outputBuffer.buffer); // THROWS
      final run = DateTime.now().millisecondsSinceEpoch - runs;

      print('Time to run inference: $run ms');

      print(_outputBuffer.getDoubleList());
    } catch (e, st) {
      logger.e('error', e, st);
      print(st);
    }
  }

  void close() {
    if (interpreter != null) {
      interpreter.close();
    }
  }
}

And implementation class:

import 'package:tensorflow_poc/classifier.dart';
import 'package:tflite_flutter_helper/tflite_flutter_helper.dart';

class ClassifierQuant extends Classifier {
  ClassifierQuant({int numThreads: 1}) : super(numThreads: numThreads);

  @override
  String get modelName => 'lite-model_midas_v2_1_small_1_lite_1.tflite';

  @override
  NormalizeOp get preProcessNormalizeOp => NormalizeOp(0, 1);
}
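
For what it's worth, one possible culprit: NormalizeOp(0, 1) leaves the preprocessed TensorImage as uint8 here (the log in the next comment still shows TfLiteType.uint8 after preprocessing), while the model's input tensor is float32, so the byte counts differ by a factor of four. Below is a hedged sketch of loading the image into a float32 TensorImage instead, assuming the helper exposes the TensorImage(TfLiteType.float32) constructor and loadImage(), as in the Java support library it mirrors:

import 'package:image/image.dart';
import 'package:tflite_flutter_helper/tflite_flutter_helper.dart';

// Hedged sketch, not a verified fix: load the picked image into a float32
// TensorImage so that, after preprocessing, the buffer holds 4 bytes per
// value and matches the float32 input tensor's byte size.
TensorImage toFloatInput(Image image) {
  final tensorImage = TensorImage(TfLiteType.float32); // assumed constructor
  tensorImage.loadImage(image);                        // assumed loader, mirroring the Java API
  return tensorImage;
}

In predict, _inputImage = TensorImage.fromImage(image); would then become _inputImage = toFloatInput(image);, with preProcessNormalizeOp set to whatever scaling the MiDaS model card actually expects.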
am15h commented 3 years ago

Can you share your console log corresponding to your print statements? This looks like an input/output format mismatch issue; I would be able to help better with some debug output. Thanks.

funwithflutter commented 3 years ago

Thanks. All of the print output:

Restarted application in 626ms.
flutter: Interpreter Created Successfully
flutter: _inputShape[0]: 1
flutter: _inputShape[1]: 256
flutter: _inputShape[2]: 256
flutter: _inputShape[3]: 3
flutter: _outputShape[0]: 1
flutter: _outputShape[1]: 256
flutter: _outputShape[2]: 256
flutter: _outputType: TfLiteType.float32
flutter: _intputType: TfLiteType.float32
image_picker: compressing is not supported for type (null). Returning the image with original quality
flutter: input image data type: TfLiteType.uint8
flutter: crop size: 3024
flutter: input image width: 256
flutter: input image height: 256
flutter: input image data type: TfLiteType.uint8
flutter: Time to load image: 66 ms
flutter: input buffer: Instance of '_ByteBuffer'
flutter: output buffer: Instance of '_ByteBuffer'
flutter: ┌───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────<…>
flutter: │ Bad state: failed precondition<…>
flutter: ├┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄<…>
flutter: │ #0   checkState (package:quiver/check.dart:73:<…>
flutter: │ #1   Tensor.setTo (package:tflite_flutter/src/tensor.dart:150:<…>
flutter: │ #2   Interpreter.runForMultipleInputs (package:tflite_flutter/src/interpreter.dart:194:3<…>
flutter: │ #3   Interpreter.run (package:tflite_flutter/src/interpreter.dart:165:<…>
flutter: │ #4   Classifier.predict (package:tensorflow_poc/classifier.dart:113:1<…>
flutter: │ #5   _MyHomePageState._predict (package:tensorflow_poc/main.dart:69:2<…>
flutter: │ #6   _MyHomePageState.getImage.<anonymous closure> (package:tensorflow_poc/main.dart:63:<…>
flutter: │ #7   State.setState (package:flutter/src/widgets/framework.dart:1267:3<…>
flutter: ├┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄<…>
flutter: │ ⛔ error<…>
flutter: └───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────<…>
flutter: #0      checkState
package:quiver/check.dart:73
#1      Tensor.setTo
package:tflite_flutter/src/tensor.dart:150
#2      Interpreter.runForMultipleInputs
package:tflite_flutter/src/interpreter.dart:194
#3      Interpreter.run
package:tflite_flutter/src/interpreter.dart:165
#4      Classifier.predict
package:tensorflow_poc/classifier.dart:113
#5      _MyHomePageState._predict
package:tensorflow_poc/main.dart:69
#6      _MyHomePageState.getImage.<anonymous closure>
package:tensorflow_poc/main.dart:63
#7      State.setState
package:flutter/…/widgets/framework.dart:1267
#8      _MyHomePageState.getImage
package:tensorflow_poc/main.dart:57
<asynchronous suspension>
Larvouu commented 3 years ago

Hello,

I am also new to TensorFlow (and sorry for my English, I'm from France), but I am probably facing the same issue as you, @funwithflutter.

I have built my own custom model from scratch (with this tutorial on YouTube), using my own dataset. When I convert it to TensorFlow Lite format, I end up with a float32[1, 320, 320, 3] input type, which seems to be the "standard" input type.

I used netron.app to visualize the differences between @am15h's model (which is similar to the official one provided by TensorFlow) and mine.

@am15h's model, just like the official one from TensorFlow, uses the "quantization" process (explained here) in order to reduce the model size (as far as I could understand). One of the consequences is that the input type is changed to uint8.

In my case, I do not want to use quantization: I would like to keep my float32 input type and still be able to perform object detection in a Flutter app.

@am15h, would you have any resources (online tutorial, GitHub repo or anything else) for performing object detection in a Flutter app without the quantization optimisation? Or for using your package with a float32[1, size, size, 3] input type?

Thank you ! 🙂

Stack trace from Android Studio with the float32-input custom model:

I/Choreographer(20846): Skipped 41 frames!  The application may be doing too much work on its main thread.
I/CameraManagerGlobal(20846): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_ACTIVE for client com.tiko.poc_aquarium API Level 2
E/flutter (20846): [ERROR:flutter/runtime/dart_isolate.cc(882)] Unhandled exception:
E/flutter (20846): Bad state: failed precondition
E/flutter (20846): #0      checkState (package:quiver/check.dart:73:5)
E/flutter (20846): #1      Tensor.setTo (package:tflite_flutter/src/tensor.dart:150:5)
E/flutter (20846): #2      Interpreter.runForMultipleInputs (package:tflite_flutter/src/interpreter.dart:194:33)
E/flutter (20846): #3      Classifier.predict (package:poc_aquarium/tflite/classifier.dart:143:18)
E/flutter (20846): #4      IsolateUtils.entryPoint (package:poc_aquarium/utils/cameraModule/isolateUtils.dart:45:51)
E/flutter (20846): #5      _RootZone.runUnary (dart:async/zone.dart:1450:54)
E/flutter (20846): #6      _FutureListener.handleValue (dart:async/future_impl.dart:143:18)
E/flutter (20846): #7      Future._propagateToListeners.handleValueCallback (dart:async/future_impl.dart:696:45)
E/flutter (20846): #8      Future._propagateToListeners (dart:async/future_impl.dart:725:32)
E/flutter (20846): #9      Future._complete (dart:async/future_impl.dart:519:7)
E/flutter (20846): #10     _StreamIterator._onData (dart:async/stream_impl.dart:1070:20)
E/flutter (20846): #11     _RootZone.runUnaryGuarded (dart:async/zone.dart:1384:10)
E/flutter (20846): #12     _BufferingStreamSubscription._sendData (dart:async/stream_impl.dart:357:11)
E/flutter (20846): #13     _BufferingStreamSubscription._add (dart:async/stream_impl.dart:285:7)
E/flutter (20846): #14     _SyncStreamControllerDispatch._sendData (dart:async/stream_controller.dart:808:19)
E/flutter (20846): #15     _StreamController._add (dart:async/stream_controller.dart:682:7)
E/flutter (20846): #16     _StreamController.add (dart:async/stream_controller.dart:624:5)
E/flutter (20846): #17     _RawReceivePortImpl._handleMessage (dart:isolate-patch/isolate_patch.dart:168:12)
I/ko.poc_aquariu(20846): Background young concurrent copying GC freed 67(61KB) AllocSpace objects, 2(2708KB) LOS objects, 0% free, 8636KB/8636KB, paused 5.631ms total 16.127ms

EDIT:

Solved this by using flutter_tflite, thanks to the example app.

am15h commented 3 years ago

Hi, I am very sorry for the late reply, I missed these notifications. I would suggest using this implementation class instead of ClassifierQuant: https://github.com/am15h/tflite_flutter_helper/blob/master/example/image_classification/lib/classifier_float.dart. The image classification example app shows how to deal with both float and quant models.

Let me know if you can get it working by doing this. Please feel free to ask if you need more help with this.
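
For readers who cannot follow the link, the float variant differs from the quant one mainly in its normalization ops. A rough sketch along the lines of the linked classifier_float.dart follows; the model name and normalization constants are illustrative placeholders rather than values copied from that file, and the import path mirrors the earlier snippet in this thread:

import 'package:tensorflow_poc/classifier.dart';
import 'package:tflite_flutter_helper/tflite_flutter_helper.dart';

// Rough sketch of a float-model implementation class, modeled on the linked
// classifier_float.dart. The 127.5 mean/std values are the usual MobileNet
// float normalization and are illustrative only.
class ClassifierFloat extends Classifier {
  ClassifierFloat({int numThreads: 1}) : super(numThreads: numThreads);

  @override
  String get modelName => 'mobilenet_v1_1.0_224.tflite'; // placeholder name

  @override
  NormalizeOp get preProcessNormalizeOp => NormalizeOp(127.5, 127.5);

  @override
  NormalizeOp get postProcessNormalizeOp => NormalizeOp(0, 1);
}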

Larvouu commented 3 years ago

Thank you for this! Would you have an example for float models with real-time object detection?

Larvouu commented 3 years ago

@am15h, any example with the object detection part, please? Working with a float32 input model.

am15h commented 3 years ago

I am transferring this issue to https://github.com/am15h/object_detection_flutter.