canpoyrazoglu opened this issue 10 years ago
I've found that the problem is the setSessionPreset: call made just before taking the photo. I was calling setSessionPreset: right before capture so that I could preview in real time at low resolution and high framerate, then capture at high resolution. It turns out the recommended practice changed in iOS 7: http://stackoverflow.com/questions/19579962/unable-to-set-session-preset-while-capture-session-running-in-ios-7.
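The gist of the linked answer is to wrap the preset change in a begin/commit configuration pair rather than setting it on a bare running session. A minimal sketch of that idea, assuming a plain AVCaptureSession (the function name and the low-resolution preset choice are placeholders, not GPUImage API):

```swift
import AVFoundation

// Hypothetical helper: switch to the high-resolution photo preset just before
// capture, inside a beginConfiguration/commitConfiguration block so the running
// session applies the change atomically (the iOS 7+ approach from the SO answer).
func switchToPhotoPreset(on session: AVCaptureSession) {
    session.beginConfiguration()
    if session.canSetSessionPreset(.photo) {
        session.sessionPreset = .photo  // high-resolution still capture
    }
    session.commitConfiguration()
}

// After the photo is taken, switch back the same way to restore the fast,
// low-resolution live preview (preset choice is an example):
func switchBackToPreviewPreset(on session: AVCaptureSession) {
    session.beginConfiguration()
    if session.canSetSessionPreset(.vga640x480) {
        session.sessionPreset = .vga640x480
    }
    session.commitConfiguration()
}
```

This can't be verified without a device camera, so treat it as a sketch of the pattern rather than a drop-in fix.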
I will try the new approach now.
Having the same problem, but the linked answer didn't help (though I may not have implemented it correctly). Were you able to solve it?
I am creating an iOS camera app that can take both front-facing and back-facing shots. There is no problem with back-facing shots. However, when I try to take an image with the front-facing camera, the live preview (GPUImageView) displays perfectly:
When I take the image, the resulting image is extremely dark, almost black:
I am alpha-blending the raw camera input with the result of a camera->lookup filter chain. I've seen this post: https://github.com/BradLarson/GPUImage/issues/907, but trying the suggested answer didn't make any difference for me. The camera, including the front camera, used to work perfectly. I haven't worked on the project for a while, and now this is what I get. What could be the reason?
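For reference, the blend pipeline described above can be sketched roughly like this with GPUImage 1.x called from Swift. The lookup image name, the mix value, and the preview-view variable are placeholders I'm assuming, not details from the issue:

```swift
import GPUImage

// Camera source (front-facing, photo preset). GPUImageStillCamera is the
// GPUImage class for still capture; arguments here are illustrative.
let camera = GPUImageStillCamera(sessionPreset: AVCaptureSession.Preset.photo.rawValue,
                                 cameraPosition: .front)!
camera.outputImageOrientation = .portrait

// Lookup filter driven by a LUT image ("lookup.png" is a placeholder name).
let lookupSource = GPUImagePicture(image: UIImage(named: "lookup.png"))!
let lookupFilter = GPUImageLookupFilter()

// Alpha blend of the raw camera frames with the lookup-filtered frames.
let blend = GPUImageAlphaBlendFilter()
blend.mix = 0.5  // example blend strength

camera.addTarget(blend)                                      // raw frames -> blend input 0
camera.addTarget(lookupFilter)                               // raw frames -> lookup filter
lookupSource.addTarget(lookupFilter, atTextureLocation: 1)   // LUT texture -> slot 1
lookupFilter.addTarget(blend)                                // filtered frames -> blend input 1
lookupSource.processImage()

let previewView = GPUImageView()  // in practice, a view from the storyboard
blend.addTarget(previewView)
camera.startCapture()
```

If the dark-capture problem is specific to this kind of two-input chain, one thing worth checking is whether both blend inputs actually receive a frame at capture time; a blend filter with a missing second input can render as mostly black.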