Closed vanderlin closed 6 years ago
I don't know if it's helpful, but I've always read back at a smaller size to help with frame rate -- drawing the camera image into a smaller FBO and reading the pixels back from that...
I think this might actually be an issue with how openFrameworks deals with copying pixels out of a texture rather than anything in ofxARKit. I think you might need to draw the camera image into an FBO before reading it back to pixels. See here:
https://forum.openframeworks.cc/t/offbo-in-ios-readtopixels-issue/7392/6
And also here... https://github.com/openframeworks/openFrameworks/issues/753
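For what it's worth, the smaller-FBO approach boils down to: draw the full-resolution camera texture into a reduced-size render target, then read only that target back. In openFrameworks that would be `fbo.allocate(w, h, GL_RGBA)`, then `fbo.begin()` / draw / `fbo.end()`, then `fbo.readToPixels(pixels)`. Here's a framework-free sketch of just the data flow, where a CPU-side nearest-neighbor loop stands in for the GPU draw (all names are illustrative, not ofxARKit API):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Downsample an RGBA8 buffer (srcW x srcH) into a smaller dstW x dstH buffer.
// This is the CPU analogue of drawing a large texture into a small FBO and
// then reading that FBO's pixels instead of the full-resolution source.
std::vector<uint8_t> downsampleRGBA(const std::vector<uint8_t>& src,
                                    int srcW, int srcH, int dstW, int dstH) {
    std::vector<uint8_t> dst(static_cast<size_t>(dstW) * dstH * 4);
    for (int y = 0; y < dstH; ++y) {
        for (int x = 0; x < dstW; ++x) {
            int sx = x * srcW / dstW; // nearest source column
            int sy = y * srcH / dstH; // nearest source row
            const uint8_t* s = &src[(static_cast<size_t>(sy) * srcW + sx) * 4];
            uint8_t* d = &dst[(static_cast<size_t>(y) * dstW + x) * 4];
            for (int c = 0; c < 4; ++c) d[c] = s[c];
        }
    }
    return dst;
}
```

The win is that `readToPixels` on a 640x480 target moves a fraction of the bytes that a 4000x4000 readback would, which is why the frame rate improves.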
Hey @vanderlin !
It sounds like Adam found a possible root cause. I would start there.
Also, apologies are in order on my part - you've stumbled onto an oversight I hadn't thought of. In case you hadn't already noticed in the code, getting a 4000x4000 texture from the FBO is not unexpected. This was done to support rendering the camera image on as many devices as possible, in all possible orientations, as well as to try to fix another issue.
I'm working on a refactor of the Camera class at the moment in my spare time - I'll keep this issue in mind as I work and see if I can make things a little less confusing. I should probably just get rid of the getter hah.
In any case, like Adam and Zach suggested, the way around this is to draw the camera image into your own FBO; I've tried this myself in the past with success.
If you wanna roll your own method and manage the pixel buffer, it hopefully shouldn't be too difficult to understand. I'd take a look at `ARCam::buildCameraFrame` and `ARCam::createTextureFromPixelBuffer` (there's a tiny mistake I'm aware of, which I've fixed in another branch as part of the refactor), as well as the shaders in `ARShaders.h`.
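The reason shaders are involved at all is that ARKit hands the camera frame over as a biplanar YCbCr `CVPixelBufferRef` (a luma plane plus an interleaved chroma plane) rather than RGB, so the planes get converted to RGB when the camera image is drawn. As a CPU-side sketch of that per-pixel conversion, using the common video-range BT.601 coefficients (the exact matrix in ofxARKit's shader may differ, e.g. BT.709 or full-range):

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdint>

// Convert one video-range BT.601 YCbCr sample to 8-bit RGB.
// A camera-conversion fragment shader does the same math per fragment.
std::array<uint8_t, 3> ycbcrToRgb(uint8_t Y, uint8_t Cb, uint8_t Cr) {
    double y  = Y  - 16.0;   // luma offset for video range
    double cb = Cb - 128.0;  // chroma planes are centered at 128
    double cr = Cr - 128.0;
    auto clamp8 = [](double v) {
        return static_cast<uint8_t>(std::clamp(std::lround(v), 0L, 255L));
    };
    return {
        clamp8(1.164 * y + 1.596 * cr),               // R
        clamp8(1.164 * y - 0.392 * cb - 0.813 * cr),  // G
        clamp8(1.164 * y + 2.017 * cb),               // B
    };
}
```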
Closing due to lack of activity to keep things tidy. If there's still an issue, please feel free to re-open!
Hello! I've been trying to read pixels from the camera. First I thought I could read them from `getCameraTexture()`, but that doesn't seem to work: I'm getting a texture size of 4000x4000 as well. Without going down the road of trying to read the `CVPixelBufferRef` from the ARFrame, is there something I'm missing?