Closed — rajatkj closed this issue 7 years ago
The sample buffer you get from the camera is just that: the sample buffer at the time the camera captured it. If you want the results of any GPU-side processing, you'll need to extract them after the point where the processing has occurred.
The most effective means of doing this would be to use the raw data output and have it extract data after the point at which your chroma keying has occurred.
I am having difficulty understanding how to "use the raw data output and have it extract data after the point at which your chroma keying has occurred."
How/when will I know that GPUImageChromaKeyBlendFilter is working, and from where can I extract the raw data after the GPU processing?
Or, alternatively: is it possible to apply GPUImageChromaKeyBlendFilter to the CMSampleBuffer/YUV planes directly?
The raw data output provides a callback to let you know when new bytes are available. This is due to the asynchronous nature of image processing (in the same way that you get a callback for new camera frames). You just need to provide your code in the callback that will take these bytes on a processed frame and do something with them.
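As a rough sketch of what that looks like (assuming the Objective-C GPUImage framework bridged into Swift; the camera preset, output size, key color, and the `consumeProcessedFrame` handler are all illustrative assumptions, and the single-input `GPUImageChromaKeyFilter` is used here for simplicity rather than the two-input blend variant):

```swift
import GPUImage
import AVFoundation

// Camera -> chroma-key filter -> raw data output.
let camera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.vga640x480.rawValue,
                                 cameraPosition: .front)
let chromaKey = GPUImageChromaKeyFilter()
chromaKey.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0) // key out green

let outputSize = CGSize(width: 640, height: 480)
let rawOutput = GPUImageRawDataOutput(imageSize: outputSize, resultsInBGRAFormat: true)

camera.addTarget(chromaKey)
chromaKey.addTarget(rawOutput)

// This block fires once per processed frame, after the GPU-side chroma
// keying has run, so the bytes here already have the filter applied.
rawOutput.newFrameAvailableBlock = { [weak rawOutput] in
    guard let rawOutput = rawOutput else { return }
    rawOutput.lockFramebufferForReading()
    let bytesPerRow = Int(rawOutput.bytesPerRowInOutput())
    if let bytes = rawOutput.rawBytesForImage {
        // Hand the filtered BGRA bytes to whatever consumes the frame
        // (consumeProcessedFrame is a hypothetical placeholder).
        consumeProcessedFrame(bytes, bytesPerRow: bytesPerRow, size: outputSize)
    }
    rawOutput.unlockFramebufferAfterReading()
}

camera.startCapture()
```

The lock/unlock pair around the byte access matters: it keeps GPUImage from recycling the framebuffer while the bytes are being read.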
I am building an app that adds a green-screen effect to a video chat on both sides (sender and receiver). I am using OpenTok for the video chat and conferencing.
The problem I am facing right now is that the method
func willOutputSampleBuffer(_ sampleBuffer: CMSampleBuffer!)
gives me the sample buffer from which I create an image, but the image generated this way does not have the green-screen filter applied. Is there any way to apply the chroma-key filter to the buffer?
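For completeness, once the filtered BGRA bytes are available from the raw data output, they can be wrapped back into a `CVPixelBuffer` for a custom video capturer. A minimal sketch, assuming the caller keeps the byte buffer alive for the lifetime of the pixel buffer (the function name is an assumption; only the CoreVideo call is real API):

```swift
import CoreVideo

// Wrap filtered BGRA bytes in a CVPixelBuffer without copying.
// Because no release callback is passed, the caller must keep `bytes`
// alive for as long as the returned pixel buffer is in use.
func makePixelBuffer(from bytes: UnsafeMutableRawPointer,
                     bytesPerRow: Int,
                     width: Int,
                     height: Int) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                              width,
                                              height,
                                              kCVPixelFormatType_32BGRA,
                                              bytes,
                                              bytesPerRow,
                                              nil,  // release callback
                                              nil,  // release refCon
                                              nil,  // buffer attributes
                                              &pixelBuffer)
    return status == kCVReturnSuccess ? pixelBuffer : nil
}
```

The key point stands either way: the unfiltered `CMSampleBuffer` from `willOutputSampleBuffer` is pre-processing by design, so the filtered frame has to come from after the GPU pipeline, not from that delegate callback.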