BradLarson / GPUImage

An open source iOS framework for GPU-based image and video processing
http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework
BSD 3-Clause "New" or "Revised" License

GPUImage with Chroma key filter to Live stream #2467

Closed rajatkj closed 7 years ago

rajatkj commented 7 years ago

I am building an app in which we can add a green-screen effect to a video chat on both sides (sender and receiver). I am using OpenTok for video chat and conferencing.

The problem I am facing right now is that the method `func willOutputSampleBuffer(_ sampleBuffer: CMSampleBuffer!)` gives me the sample buffer, from which I create an image using:

extension CMSampleBuffer {
    /// Converts this sample buffer's pixel buffer into a UIImage.
    func image(orientation: UIImageOrientation = .up, scale: CGFloat = 1.0) -> UIImage? {
        // Bail out if the sample buffer carries no image data.
        guard let buffer = CMSampleBufferGetImageBuffer(self) else { return nil }

        let ciImage = CIImage(cvPixelBuffer: buffer)

        return UIImage(ciImage: ciImage, scale: scale, orientation: orientation)
    }
}

The image generated by this method does not have the chroma key filter applied. Is there any way to apply the chroma key filter to the buffer?

BradLarson commented 7 years ago

The sample buffer you get from the camera is just that: the sample buffer at the time the camera captured it. If you want the results of any GPU-side processing, you'll need to extract them after the point where the processing has occurred.

The most effective means of doing this would be to use the raw data output and have it extract data after the point at which your chroma keying has occurred.
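A minimal sketch of that arrangement, assuming GPUImage 1.x called from Swift (the session preset, frame size, key color, and background image name are all illustrative, not taken from the thread):

```swift
import UIKit
import AVFoundation
import GPUImage

// Sketch only: wire the raw data output *after* the chroma key filter,
// so the bytes it sees are the processed (keyed) frame.
let camera = GPUImageVideoCamera(sessionPreset: AVCaptureSessionPreset640x480,
                                 cameraPosition: .front)
let chromaKey = GPUImageChromaKeyBlendFilter()
chromaKey.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0) // key out pure green

// Hypothetical replacement background; the blend filter needs a second input.
let background = GPUImagePicture(image: UIImage(named: "background"))

let rawOutput = GPUImageRawDataOutput(imageSize: CGSize(width: 640, height: 480),
                                      resultsInBGRAFormat: true)

camera.addTarget(chromaKey)      // first input: live camera (foreground)
background?.addTarget(chromaKey) // second input: image replacing the keyed color
chromaKey.addTarget(rawOutput)   // extract bytes here, after the keying

background?.processImage()
camera.startCapture()
```

The important design point is the position of `rawOutput` in the chain: anything attached before the filter sees the original camera frames, which is why reading the `CMSampleBuffer` directly shows no keying.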

rajatkj commented 7 years ago

I am having difficulty understanding "use the raw data output and have it extract data after the point at which your chroma keying has occurred."

How/when will I know that GPUImageChromaKeyBlendFilter has finished working, and from where can I extract the raw data after GPU processing?

Or, on the other hand, if you can suggest:

Is it possible to apply GPUImageChromaKeyBlendFilter to a CMSampleBuffer / its YUV planes directly?

BradLarson commented 7 years ago

The raw data output provides a callback to let you know when new bytes are available. This is due to the asynchronous nature of image processing (in the same way that you get a callback for new camera frames). You just need to provide code in that callback that takes the bytes of a processed frame and does something with them.
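Concretely, in GPUImage 1.x that callback is the raw data output's `newFrameAvailableBlock`. A hedged sketch, assuming `rawOutput` is a `GPUImageRawDataOutput` attached as a target of the chroma key filter and configured for 640x480 BGRA (the size and the hand-off to the video-chat SDK are assumptions, not from the thread):

```swift
import CoreVideo
import GPUImage

// Sketch only: fires once per processed frame, on GPUImage's processing queue.
rawOutput.newFrameAvailableBlock = { [weak rawOutput] in
    guard let rawOutput = rawOutput else { return }

    rawOutput.lockFramebufferForReading()
    let bytesPerRow = rawOutput.bytesPerRowInOutput()
    let bgraBytes = rawOutput.rawBytesForImage // bytes of the keyed frame

    // Wrap the bytes in a CVPixelBuffer for the conferencing SDK.
    // Note: CVPixelBufferCreateWithBytes does not copy the data, so either
    // copy it first or keep the framebuffer locked until the consumer is done.
    var pixelBuffer: CVPixelBuffer?
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, 640, 480,
                                 kCVPixelFormatType_32BGRA,
                                 bgraBytes, Int(bytesPerRow),
                                 nil, nil, nil, &pixelBuffer)

    // ... hand pixelBuffer to the custom video capturer (e.g. OpenTok) here ...

    rawOutput.unlockFramebufferAfterReading()
}
```

So "how will I know the filter has run" is answered by the block firing, and "where do I extract the data" is `rawBytesForImage` inside that block, between the lock/unlock pair.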