google-ai-edge / mediapipe

Cross-platform, customizable ML solutions for live and streaming media.
https://ai.google.dev/edge/mediapipe
Apache License 2.0

React Native iOS App Mutex Lock SIGABRT #2153

Closed. ekam123 closed this issue 2 years ago.

ekam123 commented 3 years ago

Hello,

I created an iOS framework following https://github.com/google/mediapipe/issues/1319#issue-748316438, and when I added it to a new iOS app it worked perfectly. But when I added the framework to a newly created React Native app, it crashed every time with a SIGABRT. The React Native app is a single screen with a button on the React side; tapping it calls a bridging function that loads a ViewController, which performs iris tracking on the native side.
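For context, the bridge described above would look roughly like this (a minimal sketch; the module and method names are illustrative, not taken from the actual app):

#import <React/RCTBridgeModule.h>
#import <UIKit/UIKit.h>

// Hypothetical bridge module: exposes one method that JS calls on the
// button press to present the native tracking ViewController.
@interface IrisBridge : NSObject <RCTBridgeModule>
@end

@implementation IrisBridge
RCT_EXPORT_MODULE();

// RCT_EXPORT_METHOD runs on a background queue by default, so hop to the
// main queue before doing any UIKit work.
RCT_EXPORT_METHOD(openTracker) {
  dispatch_async(dispatch_get_main_queue(), ^{
    UIViewController* root =
        UIApplication.sharedApplication.keyWindow.rootViewController;
    // Stand-in for the ViewController that owns the IrisTracker.
    UIViewController* trackerVC = [[UIViewController alloc] init];
    [root presentViewController:trackerVC animated:YES completion:nil];
  });
}
@end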

[Screenshot (2021-06-08): crash in Xcode showing the SIGABRT]

This is the line in IrisTracker.mm (full code below) that triggers the crash: [newGraph addFrameOutputStream:kOutputStream outputPacketType:MPPPacketTypePixelBuffer];

I tried looking inside MPPGraph.mm. CallFrameDelegate never gets called, so the failure seems to happen inside MakePacket itself, or maybe the problem is inherent to React Native since its JavaScript thread is single-threaded? Any idea what could be causing this?

- (void)addFrameOutputStream:(const std::string&)outputStreamName
            outputPacketType:(MPPPacketType)packetType {
  std::string callbackInputName;
  mediapipe::tool::AddCallbackCalculator(outputStreamName, &_config, &callbackInputName,
                                       /*use_std_function=*/true);
  // No matter what ownership qualifiers are put on the pointer, NewPermanentCallback will
  // still end up with a strong pointer to MPPGraph*. That is why we use void* instead.
  void* wrapperVoid = (__bridge void*)self;
  _inputSidePackets[callbackInputName] =
      mediapipe::MakePacket<std::function<void(const mediapipe::Packet&)>>(
          [wrapperVoid, outputStreamName, packetType](const mediapipe::Packet& packet) {
            CallFrameDelegate(wrapperVoid, outputStreamName, packetType, packet);
          });
}
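One way to capture the native stack before the process dies is to install a SIGABRT handler early, e.g. in main() (a hypothetical diagnostic, not part of this project):

#include <execinfo.h>
#include <signal.h>
#include <unistd.h>

// Hypothetical diagnostic: dump the native stack when abort() fires, then
// re-raise the signal so the default crash handling still runs.
static void DumpStackOnAbort(int sig) {
  void* frames[64];
  int count = backtrace(frames, 64);
  backtrace_symbols_fd(frames, count, STDERR_FILENO);
  signal(sig, SIG_DFL);
  raise(sig);
}

// Call once at startup, before the graph is created:
//   signal(SIGABRT, DumpStackOnAbort);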

Build File:


ios_framework(
    name = "IrisTracker",
    hdrs = [
        "IrisTracker.h"
    ],
    infoplists = ["Info.plist"],
    bundle_id = "com.ekam.IrisTracker",
    families = ["iphone", "ipad"],
    minimum_os_version = "10.0",
    deps = [
        ":IrisTrackerLibrary",
        "@ios_opencv//:OpencvFramework",
    ],
)

objc_library(
    name = "IrisTrackerLibrary",
    srcs = [
        "IrisTracker.mm",
    ],
    hdrs = [
        "IrisTracker.h",
    ],
    copts = ["-std=c++17"],
    data = [
        "//mediapipe/graphs/iris_tracking:iris_tracking_gpu.binarypb",
        "//mediapipe/modules/face_detection:face_detection_front.tflite",
        "//mediapipe/modules/face_landmark:face_landmark.tflite",
        "//mediapipe/modules/iris_landmark:iris_landmark.tflite",
    ],
    sdk_frameworks = [
        "AVFoundation",
        "CoreGraphics",
        "CoreMedia",
        "UIKit"
    ],
    deps = [
        "//mediapipe/objc:mediapipe_framework_ios",
        "//mediapipe/objc:mediapipe_input_sources_ios",
        "//mediapipe/objc:mediapipe_layer_renderer",
    ] + select({
        "//mediapipe:ios_i386": [],
        "//mediapipe:ios_x86_64": [],
        "//conditions:default": [
            "//mediapipe/graphs/iris_tracking:iris_tracking_gpu_deps",
            "//mediapipe/framework/formats:landmark_cc_proto",
        ],
    }),
)
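For reference, a framework from these rules would be built with a command along these lines (the package path mediapipe/examples/ios/iristracker is hypothetical; substitute the BUILD file's actual location):

bazel build --config=ios_arm64 //mediapipe/examples/ios/iristracker:IrisTracker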

IrisTracker.h

#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>
@class Landmark;
@class IrisTracker;

@protocol TrackerDelegate <NSObject>
- (void)irisTracker:(IrisTracker*)irisTracker didOutputLandmarks:(NSArray<Landmark*>*)landmarks;
- (void)irisTracker:(IrisTracker*)irisTracker didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer;

@end

@interface IrisTracker : NSObject
- (instancetype)init;
- (void)startGraph;
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer;
@property (weak, nonatomic) id <TrackerDelegate> delegate;
@end

@interface Landmark: NSObject
@property(nonatomic, readonly) float x;
@property(nonatomic, readonly) float y;
@property(nonatomic, readonly) float z;
@end
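For context, a consumer of this header would conform to TrackerDelegate roughly like this (a sketch; the class name and wiring are illustrative):

#import <UIKit/UIKit.h>
#import "IrisTracker.h"

@interface TrackerViewController : UIViewController <TrackerDelegate>
@property(strong, nonatomic) IrisTracker* tracker;
@end

@implementation TrackerViewController

- (void)viewDidLoad {
  [super viewDidLoad];
  self.tracker = [[IrisTracker alloc] init];
  self.tracker.delegate = self;
  [self.tracker startGraph];
}

// Both callbacks arrive on a MediaPipe worker thread; hop to the main
// queue before touching UIKit.
- (void)irisTracker:(IrisTracker*)irisTracker
    didOutputLandmarks:(NSArray<Landmark*>*)landmarks {
  NSLog(@"Received %lu landmarks", (unsigned long)landmarks.count);
}

- (void)irisTracker:(IrisTracker*)irisTracker
    didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer {
  // Render the annotated frame, e.g. with MPPLayerRenderer.
}

@end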

IrisTracker.mm

#import "IrisTracker.h"
#import "mediapipe/objc/MPPGraph.h"
#import "mediapipe/objc/MPPCameraInputSource.h"
#import "mediapipe/objc/MPPLayerRenderer.h"
#include "mediapipe/framework/formats/landmark.pb.h"

static NSString* const kGraphName = @"iris_tracking_gpu";
static const char* kInputStream = "input_video";
static const char* kOutputStream = "output_video";
static const char* kLandmarksOutputStream = "iris_landmarks";

@interface IrisTracker() <MPPGraphDelegate>
@property(nonatomic) MPPGraph* mediapipeGraph;
@end

@interface Landmark()
- (instancetype)initWithX:(float)x y:(float)y z:(float)z;
@end

@implementation IrisTracker

#pragma mark - Cleanup methods

- (void)dealloc {
  self.mediapipeGraph.delegate = nil;
  [self.mediapipeGraph cancel];
  // Ignore errors since we're cleaning up.
  [self.mediapipeGraph closeAllInputStreamsWithError:nil];
  [self.mediapipeGraph waitUntilDoneWithError:nil];
}

#pragma mark - MediaPipe graph methods

+ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
  // Load the graph config resource.
  NSLog(@"Load the graph config resource");
  NSError* configLoadError = nil;
  NSBundle* bundle = [NSBundle bundleForClass:[self class]];
  if (!resource || resource.length == 0) {
    return nil;
  }
  NSURL* graphURL = [bundle URLForResource:resource withExtension:@"binarypb"];
  NSData* data = [NSData dataWithContentsOfURL:graphURL options:0 error:&configLoadError];
  if (!data) {
    NSLog(@"Failed to load MediaPipe graph config: %@", configLoadError);
    return nil;
  }
  NSLog(@"Loaded mediapipe graph config");
  // Parse the graph config resource into mediapipe::CalculatorGraphConfig proto object.
  mediapipe::CalculatorGraphConfig config;
  config.ParseFromArray(data.bytes, data.length);
  // Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
  MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
  [newGraph addFrameOutputStream:kOutputStream outputPacketType:MPPPacketTypePixelBuffer];
  [newGraph addFrameOutputStream:kLandmarksOutputStream outputPacketType:MPPPacketTypeRaw];
  mediapipe::Packet _focal_length_side_packet = mediapipe::MakePacket<std::unique_ptr<float>>(absl::make_unique<float>(0.0));
  std::map<std::string, mediapipe::Packet> _input_side_packets = {
    {"focal_length_pixel", _focal_length_side_packet},
  };
  [newGraph addSidePackets:_input_side_packets];
  return newGraph;
}

- (instancetype)init
{
  self = [super init];
  NSLog(@"Initializing IrisTracker!!");
  if (self) {
    self.mediapipeGraph = [[self class] loadGraphFromResource:kGraphName];
    self.mediapipeGraph.delegate = self;
    // Set maxFramesInFlight to a small value to avoid memory contention for real-time processing.
    self.mediapipeGraph.maxFramesInFlight = 2;
  }
  return self;
}

- (void)startGraph {
  // Start running self.mediapipeGraph.
  NSError* error;
  if (![self.mediapipeGraph startWithError:&error]) {
    NSLog(@"Failed to start graph: %@", error);
  }
}

#pragma mark - MPPGraphDelegate methods

// Receives CVPixelBufferRef from the MediaPipe graph. Invoked on a MediaPipe worker thread.
- (void)mediapipeGraph:(MPPGraph*)graph
  didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
            fromStream:(const std::string&)streamName {
  if (streamName == kOutputStream) {
    [_delegate irisTracker:self didOutputPixelBuffer:pixelBuffer];
  }
}

// Receives a raw packet from the MediaPipe graph. Invoked on a MediaPipe worker thread.
- (void)mediapipeGraph:(MPPGraph*)graph
       didOutputPacket:(const ::mediapipe::Packet&)packet
            fromStream:(const std::string&)streamName {
  if (streamName == kLandmarksOutputStream) {
    if (packet.IsEmpty()) {
      NSLog(@"[TS:%lld] No iris landmarks", packet.Timestamp().Value());
      return;
    }
    const auto& landmarks = packet.Get<::mediapipe::NormalizedLandmarkList>();
    NSLog(@"[TS:%lld] Number of landmarks on iris: %d", packet.Timestamp().Value(),
          landmarks.landmark_size());
    NSMutableArray<Landmark*>* result = [NSMutableArray array];
    for (int i = 0; i < landmarks.landmark_size(); ++i) {
      NSLog(@"\tLandmark[%d]: (%f, %f, %f)", i, landmarks.landmark(i).x(),
            landmarks.landmark(i).y(), landmarks.landmark(i).z());
      Landmark* landmark = [[Landmark alloc] initWithX:landmarks.landmark(i).x()
                                                     y:landmarks.landmark(i).y()
                                                     z:landmarks.landmark(i).z()];
      [result addObject:landmark];
    }
    [_delegate irisTracker:self didOutputLandmarks:result];
  }
}

- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer {
  NSLog(@"Processing Video Frame; Iris tracker");
  [self.mediapipeGraph sendPixelBuffer:imageBuffer
                            intoStream:kInputStream
                            packetType:MPPPacketTypePixelBuffer];
}

@end

@implementation Landmark

- (instancetype)initWithX:(float)x y:(float)y z:(float)z
{
  self = [super init];
  if (self) {
    _x = x;
    _y = y;
    _z = z;
  }
  return self;
}

@end
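For completeness, frames typically reach processVideoFrame: from MPPCameraInputSource (which IrisTracker.mm already imports). A sketch, continuing the hypothetical TrackerViewController from the header sketch above; the view controller would additionally conform to MPPInputSourceDelegate, and the queue and property names are illustrative:

// Assumes: @property(nonatomic) MPPCameraInputSource* cameraSource;
- (void)startCamera {
  self.cameraSource = [[MPPCameraInputSource alloc] init];
  self.cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
  self.cameraSource.cameraPosition = AVCaptureDevicePositionFront;
  dispatch_queue_t videoQueue =
      dispatch_queue_create("com.example.videoQueue", DISPATCH_QUEUE_SERIAL);
  [self.cameraSource setDelegate:self queue:videoQueue];
  [self.cameraSource requestCameraAccessWithCompletionHandler:^(BOOL granted) {
    if (granted) [self.cameraSource start];
  }];
}

// MPPInputSourceDelegate callback: forward each captured frame into the graph.
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
                timestamp:(CMTime)timestamp
               fromSource:(MPPInputSource*)source {
  [self.tracker processVideoFrame:imageBuffer];
}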
ekam123 commented 3 years ago

Hi, I was wondering if there has been any progress on this issue, or a timeline for it?

robert-go commented 3 years ago

I have the same problem

alexdmiller commented 2 years ago

I have the same problem as well. I filed an issue on glog about this here: https://github.com/google/glog/issues/741#issuecomment-967556006

There is a repository that reproduces the issue here: https://github.com/swittk/react-native-mediapipe-facemesh

It's unclear to me whether this is a bug in glog or a bug in how MediaPipe uses glog.

My current hacky fix is to modify the glog source to comment out the abort() call.
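For reference, the abort() referred to here most likely lives in glog's pthread mutex wrapper. A paraphrase (the exact code varies by glog version, so treat this as an approximation):

// Paraphrased from glog's src/base/mutex.h; exact code differs by version.
// SAFE_PTHREAD wraps pthread mutex calls and aborts on any nonzero return,
// which is one way a mutex lock can surface as a SIGABRT.
#define SAFE_PTHREAD(fncall)                       \
  do {                                             \
    if (is_safe_ && fncall(&mutex_) != 0) abort(); \
  } while (0)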

sureshdagooglecom commented 2 years ago

Hi @ekam123, please refer to this: https://google.github.io/mediapipe/getting_started/android_archive_library.html

google-ml-butler[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you.

google-ml-butler[bot] commented 2 years ago

Closing as stale. Please reopen if you'd like to work on this further.
