rFlex / SCRecorder

iOS camera engine with Vine-like tap to record, animatable filters, slow motion, segments editing
Apache License 2.0

Live filter in the record scene, rather than selecting the filter after recording #182

Open fayhot opened 9 years ago

fayhot commented 9 years ago

I have another problem. I'm trying to show a live filter while recording in the record view, rather than applying a filter in the post-processing view. So I set the CIImageRenderer (as the comment says: /* If set, this renderer will receive every received frame as a CIImage. Can be useful for displaying a real time filter for example. */).

I set the CIImageRenderer the same way as the demo does; however, there's no effect.

Is there any help? Thanks.

anthonycastelli commented 9 years ago

If you figure this out, let me know! I've been trying to come up with a solution to this as well

fayhot commented 9 years ago

@anthonycastelli
I found the reason. Everything works except the last step.

While the camera is live, the delegate methods are triggered. However, in the last step, setNeedsDisplay fails to trigger SCImageView's - (void)drawRect:(CGRect)rect method. I tried triggering it by hand inside - (void)setImageBySampleBuffer:(CMSampleBufferRef)sampleBuffer, but there was still no response, which I didn't expect.

I think the cause must be somewhere in the lower-level API, but I have no idea where. So I tried another way, which is dirty but really works: never render the preview; instead, place a UIImageView with the same frame above the preview, and update that UIImageView whenever the preview buffer changes.

  1. Create a new object that implements CIImageRenderer (something like SCImageView is fine).
  2. Override the - (void)setImageBySampleBuffer:(CMSampleBufferRef)sampleBuffer method.
  3. Set a delegate.
  4. Whenever setImageBySampleBuffer is triggered, update the overlaid UIImageView through a delegate method.

The following is a demo capture, followed by a rough sketch of the idea in code. I'm new to Objective-C; I'll clean up the code and add some selectors.

[screenshot attachment]
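
A minimal sketch of that approach (FrameDelegate and ForwardingImageView are hypothetical names, not part of SCRecorder):

// The renderer conforms to CIImageRenderer via SCImageView, but instead of
// drawing itself it forwards each camera frame to a delegate, which updates
// the UIImageView placed above the preview.
@protocol FrameDelegate <NSObject>
- (void)didReceiveFilteredImage:(UIImage *)image;
@end

@interface ForwardingImageView : SCImageView
@property (weak, nonatomic) id<FrameDelegate> frameDelegate;
@end

@implementation ForwardingImageView
- (void)setImageBySampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Deliberately skip the GLKView rendering; convert the buffer and forward it.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // Apply a CIFilter to `frame` here if desired, then hand the result off
    // so the overlaid UIImageView can be updated on the main thread.
    UIImage *image = [UIImage imageWithCIImage:frame];
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.frameDelegate didReceiveFilteredImage:image];
    });
}
@end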

anthonycastelli commented 9 years ago

That might work, though I'm sure there are better ways to accomplish this. Snapchat does live filters as well as Geofilters, and I'm pretty sure they aren't using a UIImageView and setting a CIImage 30/60 times a second, or whatever the preview rate is. @rFlex Do you have any suggestions or ideas for this one?

rFlex commented 9 years ago

Setting a CIImageRenderer inside SCRecorder will not change the buffers that are actually recorded. It only makes the CIImageRenderer the receiver of the image buffers from the camera, so you can display a live filter. If you want the filters to actually be applied to the file, you will need to set the filter inside SCVideoConfiguration (which handles the output video configuration). Was that your question?
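
In code, the distinction looks roughly like this (a sketch using the names from this thread):

// Display-only: the renderer receives the camera frames so you can show
// a live filter, but the recorded file is unaffected.
recorder.CIImageRenderer = self.filterSwitcherView;

// Baked into the file: set the filter on the output video configuration.
recorder.videoConfiguration.filter = [SCFilter filterWithCIFilterName:@"CIPhotoEffectChrome"];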

fayhot commented 9 years ago

Oh, great. Thanks for your answer, that's it. By the way, regarding "you can display a live filter": is it possible to do that with just the library, without much extra code?

xportation commented 9 years ago

@fayhot I'm using an SCSwipeableFilterView widget. My setup is:

self.filterSwitcherView.refreshAutomaticallyWhenScrolling = FALSE;
self.filterSwitcherView.contentMode = UIViewContentModeScaleAspectFill;

self.filterSwitcherView.filters = @[
    [SCFilter emptyFilter],
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectChrome"],
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"],
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectTonal"]
];
recorder.CIImageRenderer = self.filterSwitcherView;

This is not working well; it's slow. There seems to be a delay while rendering. I don't know what's happening.

xezero commented 9 years ago

@fayhot Turns out you simply need to do _filterSwitcherView.CIImage = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0]]; (it can be any arbitrary color) in viewDidLoad of the view controller that has the SCSwipeableFilterView. This ultimately calls the private method _loadContext, which sets up the essential EAGLContext and CIContext for the internal GLKView to actually start rendering the filtered sample buffers.

fayhot commented 9 years ago

@xezero, you're right. I got the same result today. I actually spent three days reading all the source code.

You're great.

fayhot commented 9 years ago

@xportation

Thanks a lot. Your code makes sense.

I've read the SCRecorder source code, and there's no better way than yours.

xezero commented 9 years ago

@fayhot

Thanks! Glad it worked for you! :)

hkbenchan commented 9 years ago

@xezero @fayhot How did you resolve the problem?

I set up in viewDidLoad:

// @IBOutlet weak var previewView: SCImageView!

let _randomCIImage = CIImage(color: CIColor(red: 0, green: 0, blue: 0))

// CIImage has infinite extent, crop back to normal size
let cropFilter = CIFilter(name: "CICrop", withInputParameters: [
    "inputImage": _randomCIImage,
    "inputRectangle": CIVector(CGRect: CGRectMake(0, 0, UIScreen.mainScreen().bounds.size.width, UIScreen.mainScreen().bounds.size.height))
])

self.previewView.CIImage = cropFilter?.outputImage

and somewhere later (after viewDidAppear):

// var recorder: SCRecorder!
self.recorder.previewView = self.previewView

self.previewView.filter = self.processTheme?.filter()?.scFilter // some SCFilter which is not null

What I see is the correct camera view with no filter on it; drawRect is not called every frame, only on setFilter:(SCFilter *)filter.

any ideas?

I have tried assigning the SCImageView to recorder.CIImageRenderer, which results in a filtered view but > 1000 ms of lag in the camera feed (even on an iPhone 6).

However, the lag somehow does not happen on iOS 8; it appears only on iOS 9.

@rFlex Do you have any idea about this?

fayhot commented 9 years ago

About Q1: Something is probably wrong with your preview; it must implement CIImageRenderer. Use SCImageView, SCSwipeableFilterView, or any custom UIView that implements that protocol. As you say, I guess you made a test; however, your usage is wrong.

self.recorder.CIImageRenderer = id<CIImageRenderer>

Certainly, self.previewView.CIImage = cropFilter?.outputImage goes the wrong way.

Unless you want to make a realtime comparison, there's no need to set the recorder's preview. self.recorder.CIImageRenderer = id<CIImageRenderer> already sends the raw image data to the renderer view, which is redrawn in its GLKView. Once you have added the renderer view to the controller's root view, the filtered image will come out.

About Q2: It's a side effect of Q1. Since Core Image is slow, be careful which CIFilter you choose; a CIFilter costing under 20 ms is acceptable. You can also write a custom filter in OpenGL, though Apple imposes some rules and plenty of limits.

xezero commented 9 years ago

It's because the context is actually never initialized. It's not documented, and I had to dig through the code to realize it. Try creating a CIImage from an arbitrary CIColor and setting that to the swipeable filter view's CIImage property in viewDidLoad:

swipeableFilterView.CIImage = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0 alpha:1.0]];

drawRect should then be called once you've done that. You also don't want to set a preview view if you're using a CIImageRenderer.
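
Putting both pieces of advice together, viewDidLoad would look something like this (a sketch assuming the property names used above):

- (void)viewDidLoad {
    [super viewDidLoad];
    // Assigning any CIImage forces the private _loadContext path,
    // so drawRect starts firing for the incoming sample buffers.
    self.swipeableFilterView.CIImage = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0 alpha:1.0]];
    // Use the renderer instead of a preview view, not alongside one.
    self.recorder.CIImageRenderer = self.swipeableFilterView;
}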


hkbenchan commented 9 years ago

@fayhot

Like @xezero said, the reason I put the CIImage assignment in viewDidLoad is that the context is never initialized unless a solid CIImage is passed in.

Actually, I did not set the preview view when using the CIImageRenderer; I directly set SCRecorder's CIImageRenderer to my SCImageView.

What's weird is that the same code runs smoothly on iOS 8 but suddenly becomes very laggy on iOS 9.

On iOS 8, the CIImageRenderer receives the correct image frame every frame, while on iOS 9 it receives frames that are at least 1000 ms old. So I am wondering what makes the difference.

xezero commented 9 years ago

I'm noticing the same thing on iOS 9.0. Trying to figure out what's up... Will update if I do!


fayhot commented 9 years ago

@ustbenchan More details are needed: the platform and the exact system version. There are some terrible bugs on iOS 9; Core Image takes much more memory there. Some OpenGL filters I wrote, such as a bilateral shader, take more than 100 ms.

However, the basic filters Apple supplies in Core Image, wrapped as CIFilter, cost little, about 20 ms. There must be something wrong with your usage.
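
A quick way to check whether a given filter stays within that ~20 ms budget is to time a render directly; a minimal sketch (sampleImage stands for any test CIImage):

// Requires QuartzCore for CACurrentMediaTime().
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectChrome"];
[filter setValue:sampleImage forKey:kCIInputImageKey];

CFTimeInterval start = CACurrentMediaTime();
CGImageRef rendered = [context createCGImage:filter.outputImage
                                    fromRect:filter.outputImage.extent];
CFTimeInterval elapsed = CACurrentMediaTime() - start;
NSLog(@"Filter render took %.1f ms", elapsed * 1000.0);
CGImageRelease(rendered);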

jajhar commented 9 years ago

I'm experiencing the same issues listed above: a considerable amount of lag from setting the CIImage of the filter view. After removing these lines from viewDidLoad, the video records smoothly.

CIImage *randomImage = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0]];
CIFilter *cropFilter = [CIFilter filterWithName:@"CICrop" withInputParameters:@{
    @"inputImage": randomImage,
    @"inputRectangle": [CIVector vectorWithCGRect:self.view.bounds]
}];
_filterView.CIImage = cropFilter.outputImage;

Reproduced on an iPhone 6 (iOS 9) with Xcode 7.0.

hkbenchan commented 9 years ago

Let me prepare a simple project to demo this problem.

I'll finish wrapping it up in the next two or three days.

xezero commented 9 years ago

@jajhar Try just doing _filterView.CIImage = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0]];

@ustbenchan Try turning off video stabilization in SCRecorder.m, line 979, to see if that fixes your latency issues:

videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;
videoConnection.enablesVideoStabilizationWhenAvailable = NO;

jajhar commented 9 years ago

@xezero the second change worked for me. Setting these two:

videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;
videoConnection.enablesVideoStabilizationWhenAvailable = NO;

xezero commented 9 years ago

@jajhar Awesome! :)

hkbenchan commented 9 years ago

@xezero This works! Not sure why this latency issue appeared on iOS 9. I would prefer not to modify the library directly, since we import it as a CocoaPods dependency.

@rFlex Actually, is it possible to expose the videoConnection? Or at least give us a block to configure both the video and audio input before startRunning?

In case someone is looking for a temporary fix, here is a Swift version:


import Foundation
import AVFoundation

extension SCRecorder {

  // Finds the capture connection that carries the video stream.
  private func _videoConnection() -> AVCaptureConnection? {
    if let _outputs = self.captureSession?.outputs {
      for output in _outputs {
        if let _captureOutput = output as? AVCaptureVideoDataOutput {
          for connection in _captureOutput.connections {
            if let captureConnection = connection as? AVCaptureConnection {
              for port in captureConnection.inputPorts {
                if let _port = port as? AVCaptureInputPort where _port.mediaType == AVMediaTypeVideo {
                  return captureConnection
                }
              }
            }
          }
        }
      }
    }
    return nil
  }

  // Turns off stabilization on the video connection, if supported.
  func attemptTurnOffVideoStabilization() {
    self.beginConfiguration()
    if let connection = self._videoConnection() where connection.supportsVideoStabilization {
      connection.preferredVideoStabilizationMode = .Off
    }
    self.commitConfiguration()
  }
}

Just call

recorder.attemptTurnOffVideoStabilization()

after

recorder.startRunning()

xezero commented 9 years ago

@ustbenchan I like your extension! I did something similar to turn it off:

- (void)reconfigureVideoConnection {
    // We'll disable video stabilization for now so we don't get any latency
    // We also need to reconfigure this video connection every time the device is initialized or changed (i.e. front -> back -> front)
    for (AVCaptureConnection * connection in _recorder.videoOutput.connections) {
        for (AVCaptureInputPort * port in connection.inputPorts) {
            if ([port.mediaType isEqual:AVMediaTypeVideo]) {
                if (connection.isVideoStabilizationSupported) {
                    connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;

                    #pragma clang diagnostic push
                    #pragma clang diagnostic ignored "-Wdeprecated-declarations"
                    connection.enablesVideoStabilizationWhenAvailable = NO;
                    #pragma clang diagnostic pop
                }

                return;
            }
        }
    }
}

One thing to note is that if you were to switch camera devices (i.e. front to back) while the session is still active, you will need to reconfigure the video connection as it gets reset.
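
For instance, a camera-flip handler could reapply the configuration right after switching (a sketch; flipCamera is a hypothetical handler, and it assumes SCRecorder's device property for the capture position):

// Hypothetical flip handler: switching devices recreates the video
// connection, so the stabilization settings must be applied again.
- (void)flipCamera {
    _recorder.device = (_recorder.device == AVCaptureDevicePositionBack)
        ? AVCaptureDevicePositionFront
        : AVCaptureDevicePositionBack;
    [self reconfigureVideoConnection];
}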

xezero commented 9 years ago

Opened up an issue for the latency with video stabilization as well: https://github.com/rFlex/SCRecorder/issues/217

dimohamdy commented 8 years ago

What's the latest solution?

cerupcat commented 8 years ago

Does anyone have an example of displaying AND rendering/saving a live filter? I don't see anything in this thread or in the examples that actually demonstrates how to apply a live filter (one that you see while recording) and save that filter on export.

If I add a filter to the videoConfiguration, it doesn't display on the preview, and when the video is saved, it's just a black screen. Are live filters possible? The answer isn't clear.

renjithn commented 8 years ago

@cerupcat SCRecorder has an SCImageView property (it used to be CIImageRenderer), which is what you are looking for. Wherever you see CIImageRenderer in this thread, read it as SCImageView.

cerupcat commented 8 years ago

Thanks @renjithn. However, following the examples above, I still can't get any live filter to show up. I'm currently trying the following code, but it just shows the regular camera input, and the filter is also not applied to the output. I'm surprised there's no demo of live filters in the example project.

    SCFilter *blurFilter = [SCFilter filterWithCIFilterName:@"CIGaussianBlur"];
    [blurFilter.CIFilter setValue:[NSNumber numberWithFloat:100] forKey:kCIInputRadiusKey];

    self.filterSwitcherView = [[SCSwipeableFilterView alloc] initWithFrame:previewView.frame];
    self.filterSwitcherView.refreshAutomaticallyWhenScrolling = FALSE;
    self.filterSwitcherView.contentMode = UIViewContentModeScaleAspectFill;
    self.filterSwitcherView.filters = @[blurFilter];
    _recorder.SCImageView = self.filterSwitcherView;

renjithn commented 8 years ago

@cerupcat In the above I don't see self.filterSwitcherView being added to any view. Maybe that's what's missing, i.e. [self.view addSubview:self.filterSwitcherView]; or, better, you could have it in the storyboard.

The above will only "preview" the filters. For exporting, you will need to set the selected filter in SCVideoConfiguration so that it is reflected in the output.
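
Concretely, right before exporting you would copy the previewed filter into the output configuration, along these lines (a sketch; it assumes SCSwipeableFilterView exposes the current selection as selectedFilter):

// Bake the filter currently shown in the swipeable preview into the file.
_recorder.videoConfiguration.filter = self.filterSwitcherView.selectedFilter;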

cerupcat commented 8 years ago

Thanks @renjithn. If I add filterSwitcherView as a subview, it's just a black screen. If I add the filter to the SCVideoConfiguration, the output is also just a black screen. I've only tried the blur filter so far, so I'm not sure whether that's the issue.

renjithn commented 8 years ago

OK. I also see some issues with this latest version w.r.t. real time rendering that I'm not entirely sure how to solve. The setImageBySampleBuffer method in SCImageView doesn't do anything right now during rendering.

Update: Try adding [self loadContextIfNeeded]; before the line [self setNeedsDisplay] in the method setImageBySampleBuffer, because it looks like the context is never initialized and is nil.

cerupcat commented 8 years ago

The [self loadContextIfNeeded] call seemed to fix the video output. Unfortunately, the recorded video is still black, though. Looks like there are a few things that still have to be worked out.

For now, since I'm looking to do a blur filter and it seems it may be too slow for this (the frame rate is really bad), I might just put a UIVisualEffectView over the video (since it works at full frame rate) and render the blur on export.
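
The overlay idea is plain UIKit, roughly like this (a sketch; the real blur would then be set on videoConfiguration.filter only for export):

// Cheap live "preview" blur over the camera view; purely cosmetic,
// nothing here touches the recorded sample buffers.
UIBlurEffect *blur = [UIBlurEffect effectWithStyle:UIBlurEffectStyleLight];
UIVisualEffectView *blurView = [[UIVisualEffectView alloc] initWithEffect:blur];
blurView.frame = self.previewView.bounds;
blurView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
[self.previewView addSubview:blurView];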

dimohamdy commented 8 years ago

@cerupcat can you post the full code?

cerupcat commented 8 years ago

Here's the main code. The blur is applied to the preview (although it has a blackish border, which oddly doesn't happen when you blur on export), and the recorded output is black. In my case, I also think blurring live video like this isn't meant for realtime, as the frame rate drops significantly.

    _recorder = [SCRecorder recorder];
    _recorder.captureSessionPreset = [SCRecorderTools bestCaptureSessionPresetCompatibleWithAllDevices];
    _recorder.maxRecordDuration = CMTimeMake(30 * 15, 30);
    _recorder.frameRate = 30;
    _recorder.fastRecordMethodEnabled = NO;
    _recorder.delegate = self;
    _recorder.autoSetVideoOrientation = YES;

    // Get the video configuration object
    SCVideoConfiguration *video = _recorder.videoConfiguration;

    // Whether the video should be enabled or not
    video.enabled = YES;
    // The bitrate of the video
    video.bitrate = 2000000; // 2Mbit/s
    // Size of the video output
    video.size = CGSizeMake(640, 640);
    // Scaling mode used if the input aspect ratio differs from the output one
    video.scalingMode = AVVideoScalingModeResizeAspectFill;
    video.codec = AVVideoCodecH264;

    SCAudioConfiguration *audio = _recorder.audioConfiguration;
    audio.channelsCount = 1;

    SCFilter *blurFilter = [SCFilter filterWithCIFilterName:@"CIGaussianBlur"];
    [blurFilter.CIFilter setValue:[NSNumber numberWithFloat:100] forKey:kCIInputRadiusKey];

    video.filter = blurFilter;

    UIView *previewView = self.previewView;
    _recorder.previewView = previewView;

    previewView.frame = CGRectMake(0, 0, [UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.width);

    self.scimageview = [[SCFilterImageView alloc] initWithFrame:[UIScreen mainScreen].bounds];
    self.scimageview.filter = blurFilter;

    _recorder.SCImageView = self.scimageview;
    [previewView addSubview:self.scimageview];

renjithn commented 8 years ago

@cerupcat Understood on the frame rate. Black frames on export may be due to the fact that the CIContext that is supposed to render the output is never initialised in SCRecorder.m. You might have to dig deep here. Probably look at this line:

_context = [SCContext new].CIContext

and replace it with something like

_context = [SCContext contextWithType:SCContextTypeAuto options:nil].CIContext;

MMasterson commented 8 years ago

@renjithn this fixed my black frames issue

MMasterson commented 8 years ago

Has anyone figured out a solution for this?

I am using SCSwipeableFilterView and it's not reacting to any swipes, nor filling with any filters; it's just white (the view's base color).

I've read through this entire post with no luck...

Below is my code for setting up the recorder.

    // Create the recorder
    _recorder = [SCRecorder recorder];

    SCPhotoConfiguration *photo = [_recorder photoConfiguration];

    [photo setEnabled:YES];

    //[_recorder continuousFocusAtPoint:CGPointMake(.5f, .5f)];
    _recorder.captureSessionPreset = [SCRecorderTools bestCaptureSessionPresetCompatibleWithAllDevices];

    // Configuring the recorder
    SCVideoConfiguration *video = [_recorder videoConfiguration];

    // Whether the video should be enabled or not
    [video setEnabled:YES];
    //[video setPreset:AVCaptureSessionPresetPhoto];

    // The bitrate of the video
    [video setBitrate:880000]; // 0.8Mbit/s

    // Size of the video output
    [video setSize:CGSizeMake(480, 640)];

    //
    [video setMaxFrameRate:60];

    // Scaling mode used if the input aspect ratio differs from the output one
    [video setScalingMode:AVVideoScalingModeResizeAspectFill];

 //   [video setFilter:SCFIlter]
    // The timescale ratio to use. Higher than 1 makes a slow motion, between 0 and 1 makes a timelapse effect
    [video setTimeScale:1.0f];

    // Whether the output video size should be inferred so it creates a square video
    [video setSizeAsSquare:NO];

    // Get the audio configuration object
    SCAudioConfiguration *audio = [_recorder audioConfiguration];

    // Whether the audio should be enabled or not
    [audio setEnabled:YES];

    // the bitrate of the audio output
    [audio setBitrate:128000]; // 128kbit/s

    // Number of audio output channels
    [audio setChannelsCount:1]; // Mono output

    // The sample rate of the audio output
    [audio setSampleRate:0]; // Use same input

    // The format of the audio output
    [audio setFormat:kAudioFormatMPEG4AAC]; // AAC

    // Configuring the recorder
    [_recorder setMaxRecordDuration:CMTimeMake(15, 1)];
    [_recorder setDelegate:self];

    // Prepare the recorder for use

    self.FilterView.refreshAutomaticallyWhenScrolling = FALSE;
    self.FilterView.contentMode = UIViewContentModeScaleAspectFill;

    self.FilterView.filters = @[
                                [SCFilter filterWithCIFilterName:@"CIPhotoEffectChrome"],
                                [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"],
                                [SCFilter filterWithCIFilterName:@"CIPhotoEffectTonal"]
                                ];

    _recorder.SCImageView = self.FilterView;

    [_recorder setPreviewView:[self previewView]];

mitchellporter commented 8 years ago

@renjithn That solved my black frames issue, thank you!

mitchellporter commented 8 years ago

The black screen issue is fixed for me about 99% of the time on most devices. However, it still happens about 10-30% of the time on an iPhone 6 Plus.

It seems to happen more when you are running the app without Xcode/the debugger attached.

mqaiserbutt commented 8 years ago

@renjithn It solved my black frame issue too.

drogojinaru commented 7 years ago

Can somebody post a FULL code example with a filter preview view? I've been trying for 3 days and nothing works...

levantAJ commented 7 years ago

I just collected the information you guys posted above and put together a solution; it worked for me!

  1. Create your filter:

    fileprivate lazy var filter: SCFilter = SCFilter(ciFilterName: "CIPhotoEffectInstant")

  2. Create your filter view:

    fileprivate lazy var filterView: SCFilterImageView = SCFilterImageView(frame: CGRect(origin: .zero, size: .screen))

  3. Set the filter on the filter view:

    filterView.filter = filter
    add(subview: filterView)

    Note: if the image is mirrored in the wrong direction:

    filterView.preferredCIImageTransform = CGAffineTransform(scaleX: -1, y: 1)

  4. Set the filter on your recorder:

    recorder.videoConfiguration.filter = filter // Affects your output
    recorder.scImageView = filterView // Live filtering

  5. Update SCImageView.m: in the method - (void)setImageBySampleBuffer:(CMSampleBufferRef)sampleBuffer, add [self loadContextIfNeeded]; before [self setNeedsDisplay];

  6. Update SCRecorder.m: in the method - (id)init, replace

    _context = [SCContext new].CIContext

    with

    _context = [SCContext contextWithType:SCContextTypeAuto options:nil].CIContext;

  7. If you are using CocoaPods, please skip updating SCRecorder.

djaygit commented 7 years ago

@levantAJ the live filter works well, but the video recording returns a black screen. Any idea why that's happening?