flutter-ml / google_ml_kit_flutter

A flutter plugin that implements Google's standalone ML Kit
MIT License

Isolates and Google-Ml-Kit-plugin #137

Closed · om-ha closed this issue 2 years ago

om-ha commented 2 years ago

This issue follows up on https://github.com/bharat-biradar/Google-Ml-Kit-plugin/issues/86

Considering how amazing this repo is and the impact of this issue on its performance, it may be a good idea to explore the use of isolates.

Let's take a simple example with TextRecognition. You have:

  1. Camera Widget: many frames per second at high resolution, according to the resolution preset, e.g. ResolutionPreset.medium.
  2. Text Recognition by MLKit
  3. Annotations above Camera Widget
  4. Your in-house processing of recognized text

Having all of this run on the main isolate sounds scary. This issue is the only mention of isolates within this repo. Isolates are useful because they run in their own thread, away from the main isolate, with their own event loop, heap, and garbage-collection events. This keeps the main/UI isolate from becoming sluggish during resource-intensive operations.
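As a minimal sketch of that mechanism (fib is just a stand-in for any CPU-heavy function, not anything from this plugin): a spawned isolate does its work on its own heap and event loop and only talks to the main isolate through ports.

import 'dart:isolate';

// Stand-in for any CPU-heavy computation.
int fib(int n) => n < 2 ? n : fib(n - 1) + fib(n - 2);

// Runs inside the spawned isolate; replies on the port it was given.
void _worker(SendPort reply) {
  reply.send(fib(35));
}

// Called from the main isolate; the UI keeps rendering while _worker runs.
Future<int> fibInBackground() async {
  final port = ReceivePort();
  await Isolate.spawn(_worker, port.sendPort);
  return await port.first as int;
}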

I wish this would be discussed further.

For example, for points 2 and 4 above:

  2. Text Recognition by MLKit
  4. Your in-house processing of recognized text

It could be argued that these can happen on different isolates. In fact, 2 occurs in platform-plugin land (native), so that is probably its own isolate already; I'm not sure about this, though. For 4, it's probably only worth a separate isolate if there is heavy in-house processing of the recognized text (a sketch follows below).
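For 4 specifically, a minimal sketch using Flutter's compute() helper; extractNonEmptyLines is a hypothetical stand-in for the app's own heavy post-processing, fed only the plain recognized string so the message stays trivially sendable between isolates.

import 'package:flutter/foundation.dart' show compute;

// Hypothetical in-house post-processing of the recognized text (point 4).
List<String> extractNonEmptyLines(String recognizedText) {
  return recognizedText
      .split('\n')
      .map((line) => line.trim())
      .where((line) => line.isNotEmpty)
      .toList();
}

// Offloads the processing so the main/UI isolate stays responsive.
Future<List<String>> processRecognizedText(String recognizedText) {
  return compute(extractNonEmptyLines, recognizedText);
}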

But what about 1 and 3?

  1. Camera View: many FPS with high resolution according to the resolution preset, e.g. ResolutionPreset.medium
  3. Annotations above the CameraView

For 1, it's actually the camera pub package, which is itself a platform plugin. For 3, I think it's okay to create and destroy UI widgets at this high frequency; after all, this is what Flutter is good at, with its generational garbage collector and UI rendering engine.
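On 3, the usual pattern is not rebuilding heavy widgets every frame but repainting a thin CustomPaint layer stacked over the preview. A hedged sketch (mapping rects from image coordinates to widget coordinates is assumed done elsewhere):

import 'package:flutter/material.dart';

// Repaints bounding boxes (e.g. TextBlock.boundingBox rects, already mapped
// to widget coordinates) above the camera preview whenever results change.
class AnnotationPainter extends CustomPainter {
  AnnotationPainter(this.boxes);

  final List<Rect> boxes;

  @override
  void paint(Canvas canvas, Size size) {
    final paint = Paint()
      ..style = PaintingStyle.stroke
      ..strokeWidth = 2
      ..color = Colors.red;
    for (final box in boxes) {
      canvas.drawRect(box, paint);
    }
  }

  @override
  bool shouldRepaint(AnnotationPainter oldDelegate) =>
      oldDelegate.boxes != boxes;
}

// Typically used as:
// Stack(children: [
//   CameraPreview(controller),
//   CustomPaint(painter: AnnotationPainter(boxes), size: Size.infinite),
// ])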

What are your thoughts on isolate usage? This is not just about performance gains within any one of the four points above, but also about keeping load off the main UI thread/isolate.

fbernaly commented 2 years ago

@om-ha: this sounds like a great idea, could you send a PR implementing what you propose?

Otherwise we will close this one for inactivity until someone steps in to implement it.

bharat-biradar commented 2 years ago

Running the plugin on an isolate would take away flexibility. Developers can always run the functions exposed in the API on a different isolate if they prefer. Hence we are closing this issue.
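A minimal sketch of what running the exposed API on your own isolate can look like on Flutter 3.7+/Dart 2.19+ (Isolate.run plus the root-isolate-token plumbing that background platform-channel use requires); this is an illustration under those assumptions, not part of the plugin itself:

import 'dart:isolate';
import 'dart:ui';

import 'package:flutter/services.dart';
import 'package:google_mlkit_text_recognition/google_mlkit_text_recognition.dart';

Future<RecognizedText> recognizeOffMainIsolate(InputImage image) {
  // Captured by the closure and sent along to the worker isolate.
  final token = RootIsolateToken.instance!;
  return Isolate.run(() async {
    // Platform channels (used by the plugin) need this in a background isolate.
    BackgroundIsolateBinaryMessenger.ensureInitialized(token);
    final recognizer = TextRecognizer();
    try {
      return await recognizer.processImage(image);
    } finally {
      await recognizer.close();
    }
  });
}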

fbernaly commented 2 years ago

I agree with @bharat-biradar: devs consuming this plugin can add the isolate in their app. Here is a tutorial on how to do it: https://blog.codemagic.io/understanding-flutter-isolates/

fbernaly commented 2 years ago

BTW, I just ran across an open source app that is using our google_mlkit_barcode_scanning plugin together with isolates. Here is a very good example of how to do it if you are interested:

alexrabin commented 2 months ago

For anyone who is coming to this in 2024, those links above don't exist anymore. This is how I was able to implement multiple detectors with Isolates:

import 'dart:isolate';
import 'dart:ui';

import 'package:flutter/foundation.dart';
import 'package:flutter/services.dart';
import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';
import 'package:google_mlkit_image_labeling/google_mlkit_image_labeling.dart';
import 'package:google_mlkit_object_detection/google_mlkit_object_detection.dart';
import 'package:google_mlkit_text_recognition/google_mlkit_text_recognition.dart';

/// Message sent to the background isolate: a reply port, the image to
/// process, and the root isolate token required for platform channels.
class IsolateAIParams {
  SendPort port;
  InputImage image;

  RootIsolateToken token;
  IsolateAIParams(
      {required this.port, required this.image, required this.token});
}

/// Aggregated results from the four detectors.
class AIResult {
  final List<DetectedObject> detectedObjects;
  final List<Face> detectedFaces;
  final List<ImageLabel> detectedLabels;
  final RecognizedText detectedText;

  AIResult({
    required this.detectedObjects,
    required this.detectedFaces,
    required this.detectedLabels,
    required this.detectedText,
  });
}

/// Runs the four ML Kit detectors inside a background isolate. Static fields
/// are per-isolate in Dart, so these detectors are created lazily in the
/// spawned isolate on first use.
class AIService {
  static final ObjectDetector _objectDetector = ObjectDetector(
      options: ObjectDetectorOptions(
    mode: DetectionMode.stream,
    classifyObjects: true,
    multipleObjects: true,
  ));
  static final FaceDetector _faceDetector = FaceDetector(
    options: FaceDetectorOptions(
      enableContours: true,
      enableLandmarks: true,
    ),
  );
  static final ImageLabeler _imageLabeler =
      ImageLabeler(options: ImageLabelerOptions());
  static final TextRecognizer _textRecognizer = TextRecognizer();

  /// Spawns a short-lived isolate to process [image] and waits for the first
  /// reply on the receive port.
  static Future<AIResult?> performAIRequest(InputImage image) async {
    try {
      final receivePort = ReceivePort();
      final rootToken = RootIsolateToken.instance;
      if (rootToken == null) {
        throw Exception('Root token is null');
      }
      final isolateAIParams = IsolateAIParams(
          port: receivePort.sendPort, image: image, token: rootToken);

      await Isolate.spawn(_performIsolateRequest, isolateAIParams);

      final result = await receivePort.first;

      if (result is AIResult) {
        return result;
      } else {
        return null;
      }
    } catch (e) {
      debugPrint('Error: $e');
      return null;
    }
  }

  /// Entry point that runs inside the background isolate.
  static void _performIsolateRequest(IsolateAIParams params) async {
    // Required before using platform channels (ML Kit) from a background isolate.
    BackgroundIsolateBinaryMessenger.ensureInitialized(params.token);

    final inputImage = params.image;

    try {
      final results = await Future.wait([
        _objectDetector.processImage(inputImage),
        _faceDetector.processImage(inputImage),
        _imageLabeler.processImage(inputImage),
        _textRecognizer.processImage(inputImage)
      ]);

      if (results.isEmpty) {
        throw Exception('No results');
      }

      final result = AIResult(
        detectedObjects: results[0] as List<DetectedObject>,
        detectedFaces: results[1] as List<Face>,
        detectedLabels: results[2] as List<ImageLabel>,
        detectedText: results[3] as RecognizedText,
      );

      Isolate.exit(params.port, result);
    } catch (e) {
      debugPrint('Error: $e');
      Isolate.exit(params.port, "No results From Isolate");
    }
  }
}

To call it:

final file = File('path'); // requires dart:io
final inputImage = InputImage.fromFile(file);
final aiResult = await AIService.performAIRequest(inputImage);
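One note if you feed this from a live camera stream: performAIRequest spawns a fresh isolate per call, so you probably want at most one request in flight and to drop frames in between. A hedged sketch (it assumes you already build an InputImage from each CameraImage elsewhere):

bool _busy = false;

Future<void> onFrame(InputImage image) async {
  if (_busy) return; // drop frames while a request is running
  _busy = true;
  try {
    final result = await AIService.performAIRequest(image);
    if (result != null) {
      // ... update annotations / run in-house processing ...
    }
  } finally {
    _busy = false;
  }
}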