Closed artoq closed 12 years ago
What you're looking for is the GPUImageLuminosity class I added a few days ago. When you set the luminosityProcessingFinishedBlock like the following:
[(GPUImageLuminosity *)filter setLuminosityProcessingFinishedBlock:^(CGFloat luminosity, CMTime frameTime) {
// Do something with the luminosity here
}];
every image or video frame passed into the GPUImageLuminosity output will be processed, and the block will be called with the average luminance. The frameTime is the timestamp from the camera, but it may be safer to log the current time at the moment you read the value.
One thing to watch out for is that with automatic exposure on, the iOS cameras adjust gain and shutter speed to achieve a constant average luminance. Therefore, you'll see this value hover around 0.5 unless you lock the exposure at some point.
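To make the above concrete, here is a rough sketch of wiring a camera into the luminosity filter and locking exposure afterward. This is an assumption-laden example, not code from this thread: it assumes `GPUImage.h` is imported, that it runs on a device with a back camera, and that `inputCamera` exposes the underlying AVCaptureDevice (as it does in current GPUImage).

```objc
// Hypothetical setup: camera -> GPUImageLuminosity, logging each reading.
GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
GPUImageLuminosity *luminosityFilter = [[GPUImageLuminosity alloc] init];
[luminosityFilter setLuminosityProcessingFinishedBlock:^(CGFloat luminosity, CMTime frameTime) {
    // Each processed frame lands here with its average luminance.
    NSLog(@"Average luminance: %f at t=%f", luminosity, CMTimeGetSeconds(frameTime));
}];
[videoCamera addTarget:luminosityFilter];

// Lock exposure so auto-gain doesn't pull every frame back toward ~0.5.
AVCaptureDevice *device = videoCamera.inputCamera;
if ([device lockForConfiguration:nil]) {
    if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {
        device.exposureMode = AVCaptureExposureModeLocked;
    }
    [device unlockForConfiguration];
}

[videoCamera startCameraCapture];
```

You'd typically wait until exposure has settled on your scene before locking it, otherwise the locked settings may not suit the lighting you actually want to measure.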
Oh, okay, found it... Nice mind-reading skills!
Thanks a lot for the response...
Hi Brad, may I know how to get the luminosity from GPUImageLuminosity in Swift? Thanks!
Hi there,
I'm trying to use the (awesome) GPUImage to get some useful data from the camera. So: is there any way to use a filtered frame besides displaying it on the screen, e.g. to save it or use it for further calculations?
For example: what I'm trying to do is grab a frame from live video, calculate the average luminance, and save that value along with its timestamp into a table.
Well, I guess the monochrome filter is a start, but I hope I made my point: where do I go from here? Can I tap into the filter's frame output and use it the way I described above?
Thanks for helping a "lost in opengles" guy ;)