patrick-fu / flutter_replay_kit_launcher

A Flutter plugin that provides a launcher to open the RPSystemBroadcastPickerView on iOS
MIT License
13 stars · 6 forks

How to get video path preview? #6

Open · karamatpkid opened this issue 1 year ago

karamatpkid commented 1 year ago

I want to get the path of the screen-recorded video. How can I get it? There are only two methods in the channel.

patrick-fu commented 1 year ago

This plugin is just a "launcher"; you need to handle the image buffer stream yourself (in the function -[handleSampleBuffer:withType:]).

Note that this function will be called frequently by iOS (the ReplayKit framework); the "CMSampleBufferRef" contains the image buffer of the screen.

If you want a "recorded video", you can save the image buffer stream to a local file, or use a third-party framework to broadcast the image stream.
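For reference, the demo wires this up with a ZGBroadcastManager helper (quoted later in this thread); a minimal sketch of the broadcast extension's SampleHandler that forwards every buffer to it:

// SampleHandler.m (broadcast upload extension target)
#import "SampleHandler.h"
#import "ZGBroadcastManager.h"

@implementation SampleHandler

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // Called once when the user starts the broadcast from the picker.
    [[ZGBroadcastManager sharedManager] startBroadcast:self];
}

- (void)broadcastFinished {
    [[ZGBroadcastManager sharedManager] stopBroadcast];
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    // ReplayKit calls this at high frequency with screen (and audio) buffers.
    [[ZGBroadcastManager sharedManager] handleSampleBuffer:sampleBuffer withType:sampleBufferType];
}

@end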

patrick-fu commented 1 year ago

In addition, if you want to broadcast the screen, maybe you can try this RTC SDK 😳

karamatpkid commented 1 year ago

Hello, I am interested in developing this project using the open-source code. However, I am new to Objective-C and am having trouble printing logs when running the code. Even when using the callback function as mentioned, I am not seeing any logs. Additionally, I am receiving the following errors:

(lldb) 2023-04-24 17:56:56.630287+0500 Runner[2036:582792] [SceneConfiguration] Info.plist contained no UIScene configuration dictionary (looking for configuration named "(no name)")
[SceneConfiguration] Info.plist contained no UIScene configuration dictionary (looking for configuration named "(no name)")
[VERBOSE-2:FlutterObservatoryPublisher.mm(97)] Failed to register observatory port with mDNS with error -65555.
[VERBOSE-2:FlutterObservatoryPublisher.mm(99)] On iOS 14+, local network broadcast in apps need to be declared in the app's Info.plist. Debug and profile Flutter apps and modules host VM services on the local network to support debugging features such as hot reload and DevTools. To make your Flutter app or module attachable and debuggable, add a '_dartobservatory._tcp' value to the 'NSBonjourServices' key in your Info.plist for the Debug/Profile configurations. For more information, see https://flutter.dev/docs/development/add-to-app/ios/project-setup#local-network-privacy-permissions
Debug service listening on ws://127.0.0.1:52219/tGz-rdxKU9c=/ws
Syncing files to device mac’s iPhone...
flutter: Launching ReplayKit screen recording UI...
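(The mDNS warnings above are unrelated to this plugin; as the message itself says, they go away once the Debug/Profile Info.plist of the Runner declares the Dart observatory service:)

<key>NSBonjourServices</key>
<array>
    <string>_dartobservatory._tcp</string>
</array>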
patrick-fu commented 1 year ago

@karamatpkid The reason your logs are not printed is that the broadcast extension does not run in the "Runner" main process; it runs in an independent process.

If you need to debug it (or check its logs in the console), you have to run the broadcast extension target in Xcode.

[screenshot: selecting and running the broadcast extension scheme in Xcode]
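As an alternative to attaching Xcode, the extension's output can be watched in the macOS Console app; a minimal sketch using os_log (the subsystem string below is a placeholder):

#import <ReplayKit/ReplayKit.h>
#import <os/log.h>

// os_log output from the extension process is visible in Console.app even
// when Xcode is not attached; filter by the placeholder subsystem below.
static void ZGLogBufferType(RPSampleBufferType type) {
    static os_log_t log;
    static dispatch_once_t once;
    dispatch_once(&once, ^{
        log = os_log_create("com.example.broadcast-extension", "capture");
    });
    os_log(log, "Received sample buffer of type %ld", (long)type);
}

Calling this from -[handleSampleBuffer:withType:] confirms whether the extension process is receiving frames at all.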
karamatpkid commented 1 year ago

Dear Patrick Fu,

I hope this message finds you well. I am currently working on a project using your flutter_replay_kit_launcher plugin and I am trying to get the path of a screen-recorded video. Unfortunately, I have been unable to find this information through the available methods in the channel.

I understand from your previous comments that I will need to handle the image buffer stream myself using the handleSampleBuffer function, and I can save the image buffer stream to a local file to obtain the recorded video. However, I am not sure how to implement this in Objective-C.

If you could provide me with any guidance or sample code to help me accomplish this, I would greatly appreciate it. Alternatively, if there is any additional documentation or resources you could point me towards, that would be very helpful as well.

Thank you very much for your time and assistance.

Best regards, Karamat Subhani

patrick-fu commented 1 year ago

@karamatpkid Why not just ask ChatGPT? 🤣 https://poe.com/s/lhOJuffXLnA0J6mMpz7X The answer looks good 👍

karamatpkid commented 1 year ago

Below is the code, as you mentioned. Unfortunately, I am still unable to view the recorded video in Photos or in the Files app. If you can suggest anything, I would be thankful to you.

//
//  ZGBroadcastManager.m
//  BroadcastDemoExtension
//
//  Created by Patrick Fu on 2020/11/5.
//

#import "ZGBroadcastManager.h"
#import <ReplayKit/ReplayKit.h>
#import <AVFoundation/AVFoundation.h>
#import <Photos/Photos.h>

#define ZG_NOTIFICATION_NAME @"ZGFinishBroadcastUploadExtensionProcessNotification"

static ZGBroadcastManager *_sharedManager = nil;

@interface ZGBroadcastManager ()

@property (nonatomic, weak) RPBroadcastSampleHandler *sampleHandler;
@property (nonatomic, strong) AVAssetWriter *assetWriter;
@property (nonatomic, strong) AVAssetWriterInput *assetWriterInput;
@end

@implementation ZGBroadcastManager

+ (instancetype)sharedManager {
    // Standard dispatch_once singleton; thread-safe without manual locking.
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        _sharedManager = [[self alloc] init];
    });
    return _sharedManager;
}

- (void)startBroadcast:(RPBroadcastSampleHandler *)sampleHandler {

    self.sampleHandler = sampleHandler;

    // Add an observer for stop broadcast notification
    CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(),
                                    (__bridge const void *)(self),
                                    onBroadcastFinish,
                                    (CFStringRef)ZG_NOTIFICATION_NAME,
                                    NULL,
                                    CFNotificationSuspensionBehaviorDeliverImmediately);

    // Do some business logic when starting screen capture here.

}

- (void)stopBroadcast {
    // Remove observer for stop broadcast notification
    CFNotificationCenterRemoveObserver(CFNotificationCenterGetDarwinNotifyCenter(),
                                       (__bridge const void *)(self),
                                       (CFStringRef)ZG_NOTIFICATION_NAME,
                                       NULL);

    // Do some business logic when finishing screen capture here.

    // Finish writing only if the writer actually started; note that
    // finishWriting is asynchronous, so ending the extension process
    // immediately afterwards can cut it short.
    if (self.assetWriter.status == AVAssetWriterStatusWriting) {
        [self.assetWriterInput markAsFinished];
        [self.assetWriter finishWritingWithCompletionHandler:^{
            if (self.assetWriter.status == AVAssetWriterStatusCompleted) {
                NSURL *videoURL = self.assetWriter.outputURL;
                // Saving to Photos from the extension requires an
                // NSPhotoLibraryAddUsageDescription entry in its Info.plist.
                [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
                    [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:videoURL];
                } completionHandler:^(BOOL success, NSError *error) {
                    if (!success) {
                        NSLog(@"Error saving video to Photos library: %@", error.localizedDescription);
                    }
                }];
            } else {
                NSLog(@"Error finishing asset writer: %@", self.assetWriter.error.localizedDescription);
            }
        }];
    }
}

- (void)handleSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {

    // Only write video buffers that are ready; audio buffers are ignored here.
    if (sampleBufferType != RPSampleBufferTypeVideo || !CMSampleBufferDataIsReady(sampleBuffer)) {
        return;
    }

    // Create the asset writer once, on the first video buffer. Re-creating it
    // for every buffer would discard all previously appended frames.
    if (!self.assetWriter) {
        NSString *outputPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/MyVideo.mp4"];
        NSLog(@"Asset writer output path: %@", outputPath);
        NSURL *outputURL = [NSURL fileURLWithPath:outputPath];

        // AVAssetWriter fails to start if the target file already exists.
        [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];

        NSError *error = nil;
        self.assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error];
        if (error) {
            NSLog(@"Error creating asset writer: %@", error.localizedDescription);
            self.assetWriter = nil;
            return;
        }

        // Encode at the native screen resolution.
        NSDictionary *outputSettings = @{
            AVVideoCodecKey: AVVideoCodecTypeH264,
            AVVideoWidthKey: @(UIScreen.mainScreen.bounds.size.width * UIScreen.mainScreen.scale),
            AVVideoHeightKey: @(UIScreen.mainScreen.bounds.size.height * UIScreen.mainScreen.scale)
        };
        self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
        self.assetWriterInput.expectsMediaDataInRealTime = YES;
        [self.assetWriter addInput:self.assetWriterInput];
    }

    // Start the session with the timestamp of the first buffer.
    if (self.assetWriter.status == AVAssetWriterStatusUnknown) {
        [self.assetWriter startWriting];
        [self.assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }

    // Append the screen frame; the sample buffer can be appended directly,
    // so no pixel buffer adaptor is needed.
    if (self.assetWriter.status == AVAssetWriterStatusWriting && self.assetWriterInput.isReadyForMoreMediaData) {
        [self.assetWriterInput appendSampleBuffer:sampleBuffer];
    }
}

#pragma mark - Finish broadcast function

// Handle stop broadcast notification from main app process
void onBroadcastFinish(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo) {
    NSString *message = @"Hello,44 world!";
    NSLog(@"My log message: %@", message);
    // Stop broadcast
    [[ZGBroadcastManager sharedManager] stopBroadcast];

    RPBroadcastSampleHandler *handler = [ZGBroadcastManager sharedManager].sampleHandler;
    if (handler) {
        // Finish broadcast extension process with no error
        #pragma clang diagnostic push
        #pragma clang diagnostic ignored "-Wnonnull"
        [handler finishBroadcastWithError:nil];
        #pragma clang diagnostic pop
    } else {
        NSLog(@"⚠️ RPBroadcastSampleHandler is null, can not stop broadcast upload extension process");
    }
}

@end
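A final practical note on the manager above: the output file is written into the broadcast extension's own sandbox, so Documents/MyVideo.mp4 ends up in the extension's container, not in the Runner app's Documents directory. The usual way to share the file with the main app is an App Group container, and stopping the broadcast from the Flutter side goes through the Darwin notification center. A minimal sketch, where the group identifier is a placeholder that must match the App Group capability enabled on both targets:

// Extension side: resolve the output URL inside the shared App Group
// container so the finished file is visible to the main app as well.
NSURL *container = [[NSFileManager defaultManager]
    containerURLForSecurityApplicationGroupIdentifier:@"group.com.example.screenshare"];
NSURL *outputURL = [container URLByAppendingPathComponent:@"MyVideo.mp4"];

// Main-app side: ask the extension to stop by posting the Darwin
// notification that ZGBroadcastManager registered for above.
CFNotificationCenterPostNotification(CFNotificationCenterGetDarwinNotifyCenter(),
                                     CFSTR("ZGFinishBroadcastUploadExtensionProcessNotification"),
                                     NULL, NULL, YES);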