BradLarson / GPUImage2

GPUImage 2 is a BSD-licensed Swift framework for GPU-accelerated video and image processing.
BSD 3-Clause "New" or "Revised" License

Array buffered frame buffer as input for the next filter. #211

Open CVdim opened 6 years ago

CVdim commented 6 years ago

Hi! First off, this library is simple and clever!

I’m porting an algorithm from MATLAB/CUDA to Swift/GPUImage and trying to build a filter chain where the first filter buffers ten frames from the camera and passes them on to the next filter for data analysis.

camera --> buffer --> myFilter -->...

I don't understand how to implement the "buffer" filter. The ImageBuffer class can build up an array of buffered frames, but it delivers only one frame at a time to its target.

doudouperrin commented 6 years ago

Hi, I'm not sure I understand your need (especially the 10-frame buffer). Sorry if I'm off topic! ;)

If you are comfortable with OpenGL shaders, you can write your own filter by extending BasicOperation and initializing it with a shader string or file. You can then use it like this: Camera --> customShaderFilter --> myFilter --> renderView. The shader program will process your video frame by frame; I'm not sure that is what you are looking for.
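To make that concrete, here is a minimal sketch of a custom shader filter, assuming GPUImage2's `BasicOperation(fragmentShader:numberOfInputs:)` initializer and its shader naming conventions; `myFilter` and `renderView` are placeholders for the rest of your chain, and the invert shader is just a stand-in for your algorithm.

```swift
import GPUImage

// A single-input fragment shader (GLSL ES). The varying/uniform names
// (textureCoordinate, inputImageTexture) follow GPUImage2's conventions.
let fragmentShader = """
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;

void main() {
    highp vec4 color = texture2D(inputImageTexture, textureCoordinate);
    gl_FragColor = vec4(1.0 - color.rgb, color.a); // e.g. invert as a placeholder
}
"""

// Wrap the shader string in a BasicOperation to get a reusable filter.
let customShaderFilter = BasicOperation(fragmentShader: fragmentShader,
                                        numberOfInputs: 1)

// Wire up the chain exactly as above:
// Camera --> customShaderFilter --> myFilter --> renderView
let camera = try Camera(sessionPreset: .hd1280x720)
camera --> customShaderFilter --> myFilter --> renderView
camera.startCapture()
```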

You may also use the camera delegate method, which lets you process a CMSampleBuffer (one frame) directly before it enters the GPUImage pipeline. From there, you can store each frame in an array and, using a counter, run your algorithm once the array holds 10 frames. But you may run into issues with buffers being deallocated... (I'm not an expert ^^) Note that the delegate method runs on the CPU, not the GPU, so it can slow things down.
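The accumulate-then-process pattern described above can be sketched in plain Swift; here the CMSampleBuffer type is replaced by a generic `Frame` placeholder (a hypothetical helper, not part of GPUImage) so the batching logic stands on its own.

```swift
// Collects frames one at a time and hands them off in fixed-size batches,
// mimicking what a camera delegate callback would do per incoming frame.
final class FrameAccumulator<Frame> {
    private var frames: [Frame] = []
    private let batchSize: Int
    private let process: ([Frame]) -> Void

    init(batchSize: Int, process: @escaping ([Frame]) -> Void) {
        self.batchSize = batchSize
        self.process = process
    }

    // Call this from the camera delegate for every incoming frame.
    func append(_ frame: Frame) {
        frames.append(frame)
        if frames.count == batchSize {
            process(frames)       // hand the full batch to the algorithm
            frames.removeAll()    // start collecting the next batch
        }
    }
}

// Usage: collect ten frames, then run the analysis once per full batch.
var batches: [[Int]] = []
let accumulator = FrameAccumulator<Int>(batchSize: 10) { batches.append($0) }
for i in 0..<25 { accumulator.append(i) }
// batches now holds two full batches; the last five frames are still pending
```

One caveat, matching the deallocation worry above: if the real frames are CMSampleBuffers, holding ten of them in an array keeps their underlying capture buffers alive, which can starve the capture session's buffer pool.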

CVdim commented 6 years ago

Hi, thanks for the answer :) I need to buffer 10 frames and send them to my custom shader. ImageBuffer keeps an array of Framebuffer objects, but its updateTargetsWithFramebuffer() method sends only one frame to the target. I want to send all the buffered frames from the [Framebuffer] array to the next filter simultaneously.
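One hedged direction for getting several frames into the shader at once, assuming GPUImage2's BasicOperation supports multiple inputs (inputImageTexture, inputImageTexture2, ...) and that ImageBuffer's bufferSize property delays frames by that count: branch the camera through differently sized ImageBuffers so each input of one multi-input operation sees a different frame in time. Averaging three frames here is only a stand-in for the real analysis, and `camera` is assumed to exist.

```swift
import GPUImage

// Shader reading three inputs at once; names follow GPUImage2's convention.
let multiFrameShader = """
varying highp vec2 textureCoordinate;
varying highp vec2 textureCoordinate2;
varying highp vec2 textureCoordinate3;
uniform sampler2D inputImageTexture;
uniform sampler2D inputImageTexture2;
uniform sampler2D inputImageTexture3;

void main() {
    highp vec4 f0 = texture2D(inputImageTexture,  textureCoordinate);
    highp vec4 f1 = texture2D(inputImageTexture2, textureCoordinate2);
    highp vec4 f2 = texture2D(inputImageTexture3, textureCoordinate3);
    gl_FragColor = (f0 + f1 + f2) / 3.0; // placeholder: average three frames
}
"""

let multiFrameFilter = BasicOperation(fragmentShader: multiFrameShader,
                                      numberOfInputs: 3)

// Delay lines: each ImageBuffer holds frames back by bufferSize steps.
let delayOne = ImageBuffer()
delayOne.bufferSize = 1
let delayTwo = ImageBuffer()
delayTwo.bufferSize = 2

// Each branch should land on the next free input of the operation,
// so the shader sees the current frame plus two delayed copies.
camera --> multiFrameFilter
camera --> delayOne --> multiFrameFilter
camera --> delayTwo --> multiFrameFilter
```

Scaling this to ten inputs gets unwieldy, so packing frames into a texture or processing batches on the CPU side (as suggested earlier in the thread) may be more practical.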