BradLarson / GPUImage2

GPUImage 2 is a Swift framework for GPU-accelerated video and image processing, released under the BSD 3-Clause ("New"/"Revised") license.

Is it possible to get processed frames from RenderView as a CVPixelBuffer (CVBuffer)? #218

Closed cagkanciloglu closed 6 years ago

cagkanciloglu commented 6 years ago

Hey, I am trying to use the processed frames shown on a RenderView in Core ML, which requires a CVPixelBuffer. How can this be achieved? By the way, CVPixelBuffer is actually a typealias of CVImageBuffer, which is itself a typealias of CVBuffer, so those types would work as well.

I also checked RawDataOutput, but I want to show these processed frames on screen as well.

BradLarson commented 6 years ago

Yes, you can use a raw data output to capture the bytes at the step before they are displayed by the RenderView, and then create a pixel buffer from those.
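A minimal sketch of that second step, assuming the raw data output delivers tightly packed BGRA bytes at a known frame size (the pixel format, buffer attributes, and helper name here are assumptions, not part of the framework):

```swift
import Foundation
import CoreVideo

// Sketch: turn a [UInt8] of BGRA bytes (as delivered by a raw data
// output callback) into a CVPixelBuffer for Core ML. Width and height
// are assumed to match the filter's output size.
func makePixelBuffer(from bytes: [UInt8], width: Int, height: Int) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_32BGRA,
                                     attrs as CFDictionary, &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Copy row by row, since the pixel buffer's bytes-per-row may be
    // padded beyond width * 4.
    let destBytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    let srcBytesPerRow = width * 4
    guard let dest = CVPixelBufferGetBaseAddress(buffer) else { return nil }
    bytes.withUnsafeBytes { src in
        for row in 0..<height {
            memcpy(dest + row * destBytesPerRow,
                   src.baseAddress! + row * srcBytesPerRow,
                   srcBytesPerRow)
        }
    }
    return buffer
}
```

The row-by-row copy matters: `CVPixelBufferCreate` is free to pad each row for alignment, so a single `memcpy` of the whole buffer can skew the image.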

cagkanciloglu commented 6 years ago

Thanks for the reply @BradLarson. The thing I don't understand is that RawDataOutput is an ImageConsumer, so I can't chain it like this, for example: camera --> filter --> rawdataoutput --> renderview

The only way I can use a RenderView is like this: camera --> filter --> renderview

Where can I add the RawDataOutput in this setup?

BradLarson commented 6 years ago

You can add it in parallel:

camera --> filter --> renderview
filter --> rawdataoutput

and the filter will output to both the raw data output and the RenderView.
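In GPUImage2's pipeline syntax, the parallel wiring looks roughly like this (a sketch, not a definitive setup: the exact initializer signatures and the choice of filter are assumptions, and `renderView` is assumed to come from a storyboard):

```swift
import GPUImage

// Sketch of the parallel pipeline: one filter feeding two consumers.
// The filter's texture is rendered once and handed to both targets.
let camera = try Camera(sessionPreset: .vga640x480)
let filter = SaturationAdjustment()
let rawOutput = RawDataOutput()

camera --> filter --> renderView   // on-screen display
filter --> rawOutput               // byte extraction in parallel

rawOutput.dataAvailableCallback = { bytes in
    // bytes: [UInt8] of the processed frame; build a CVPixelBuffer
    // from these for Core ML.
}

camera.startCapture()
```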

cagkanciloglu commented 6 years ago

Hmm, but will this affect performance? Isn't this like processing the frames twice?

BradLarson commented 6 years ago

No, the only thing that will impact performance is the actual extraction of the bytes. The filter only runs once and provides the same texture to both the RenderView and the raw data output. You're going to need to get the bytes out some way, and the raw data output is how you do that.

I don't believe I yet have my fast path for grabbing these bytes from the raw data output in this version of the framework, but if you need that you can examine what I did in the Objective-C one.
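For context, the fast path in the Objective-C framework avoided the expensive GPU-to-CPU readback by rendering into a texture that is backed by a CVPixelBuffer through a CoreVideo texture cache. A rough sketch of that idea in Swift (the surrounding render-target plumbing is omitted; the calls are the public CoreVideo/OpenGLES ones, but treat the overall wiring as an assumption):

```swift
import CoreVideo
import OpenGLES

// Sketch: create an IOSurface-backed CVPixelBuffer, wrap it in an
// OpenGL ES texture via a texture cache, and render the filter output
// into that texture. The processed bytes then land in the pixel buffer
// directly, with no glReadPixels copy.
var textureCache: CVOpenGLESTextureCache?
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil,
                             EAGLContext.current()!, nil, &textureCache)

var pixelBuffer: CVPixelBuffer?
let attrs: [CFString: Any] =
    [kCVPixelBufferIOSurfacePropertiesKey: [:] as [String: Any]]
CVPixelBufferCreate(kCFAllocatorDefault, 640, 480,
                    kCVPixelFormatType_32BGRA,
                    attrs as CFDictionary, &pixelBuffer)

var texture: CVOpenGLESTexture?
CVOpenGLESTextureCacheCreateTextureFromImage(
    kCFAllocatorDefault, textureCache!, pixelBuffer!, nil,
    GLenum(GL_TEXTURE_2D), GL_RGBA, 640, 480,
    GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE), 0, &texture)

// Attach CVOpenGLESTextureGetName(texture!) as the framebuffer's color
// attachment, render the filter pass, then hand pixelBuffer to Core ML.
```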

cagkanciloglu commented 6 years ago

Will do that for sure. Thank you so much for your help @BradLarson

Briahas commented 6 years ago

Hi @BradLarson, I'm trying to connect GPUImage with GVRKit, and for that I need a CVPixelBuffer as output from GPUImage to serve as input to GVRKit. Right now I'm doing it straightforwardly (as in the comment above): filter --> rawoutput, then dataAvailableCallback, then Data.withUnsafeBytes, then CVPixelBufferCreateWithBytes.

But as soon as I add the rawoutput as an additional output, it drastically reduces performance: from 20 fps to 15 fps on an iPhone 6 Plus. This happens regardless of what the dataAvailableCallback does.

Is it possible to avoid this behavior? Thanks.