Yes, you can use a raw data output to capture the bytes at the step before they are displayed by the RenderView, and then create a pixel buffer from those.
Thanks for the reply @BradLarson. The thing I don't understand is that RawDataOutput is an ImageConsumer, so I can't do something like this, for example: camera --> filter --> rawdataoutput --> renderview
The only way I can use the RenderView is like this: camera --> filter --> renderview
Where can I add the rawdataoutput in this setup?
You can add it in parallel:
camera --> filter --> renderview
filter --> rawdataoutput
and the filter will output to both the raw data output and the RenderView.
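For reference, here's a rough sketch of that parallel wiring in the Swift framework. The filter class and session preset are just placeholders, and exact initializer signatures may vary between framework versions:

```swift
import UIKit
import GPUImage

do {
    let camera = try Camera(sessionPreset: .vga640x480)
    let filter = SaturationAdjustment()   // any filter works here
    let renderView = RenderView(frame: CGRect(x: 0, y: 0, width: 480, height: 640))
    let rawDataOutput = RawDataOutput()

    // One filter, two targets: the filter renders once and the same
    // processed texture is handed to both consumers.
    camera --> filter --> renderView
    filter --> rawDataOutput

    rawDataOutput.dataAvailableCallback = { bytes in
        // bytes holds the BGRA pixel data for each processed frame
    }

    // Remember to add renderView to your view hierarchy before starting capture.
    camera.startCapture()
} catch {
    print("Could not initialize camera: \(error)")
}
```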
Hmm, but will this affect performance? Isn't this like processing frames twice, or not?
No, the only thing that will impact performance is the actual extraction of the bytes. The filter only runs once and provides the same texture to both the RenderView and the raw data output. You're going to need to get the bytes out some way, and the raw data output is how you do that.
I don't believe I yet have my fast path for grabbing these bytes from the raw data output in this version of the framework, but if you need that you can examine what I did in the Objective-C one.
Will do that for sure. Thank you so much for your help @BradLarson
Hi @BradLarson, I'm trying to connect GPUImage with GVRKit, and for that I need a CVPixelBuffer as output from GPUImage to use as the input to GVRKit. Right now I'm doing it the straightforward way (from comment ): filter --> rawoutput, then dataAvailableCallback, then Data.withUnsafeBytes, then CVPixelBufferCreateWithBytes.
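In case it helps, this is roughly what that conversion looks like. This is only a sketch: the width, height, and tightly-packed BGRA layout are assumptions that have to match what the raw data output actually delivers, and CVPixelBufferCreateWithBytes wraps the bytes without copying them, so in practice copying into a buffer you own is safer:

```swift
import CoreVideo

let frameWidth = 1280    // hypothetical: must match the processed frame size
let frameHeight = 720

// rawDataOutput is the RawDataOutput attached to the filter, as above.
rawDataOutput.dataAvailableCallback = { bytes in
    var frameBytes = bytes
    var pixelBuffer: CVPixelBuffer?

    frameBytes.withUnsafeMutableBytes { rawBuffer in
        guard let baseAddress = rawBuffer.baseAddress else { return }
        let status = CVPixelBufferCreateWithBytes(
            kCFAllocatorDefault,
            frameWidth,
            frameHeight,
            kCVPixelFormatType_32BGRA,
            baseAddress,
            frameWidth * 4,    // bytes per row, assuming tightly packed BGRA
            nil,               // release callback
            nil,               // release context
            nil,               // pixel buffer attributes
            &pixelBuffer)
        if status != kCVReturnSuccess {
            print("CVPixelBufferCreateWithBytes failed: \(status)")
        }
    }

    // pixelBuffer can now be handed to GVRKit (or Core ML) while this
    // frame's bytes are still alive.
}
```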
But as soon as I add the rawoutput as an additional output, it drastically reduces performance: from 20 fps to 15 fps on an iPhone 6 Plus. It doesn't depend on the dataAvailableCallback operations.
Is it possible to avoid this behavior? Thanks.
Hey, I am trying to use the processed frames shown on the RenderView in Core ML. For that a CVPixelBuffer is required. How can this be achieved? By the way, CVPixelBuffer is actually a typealias of CVImageBuffer, which is itself a typealias of CVBuffer, so those types work as well.
I also checked RawDataOutput, but I want to show these processed frames on screen as well.