Consti10 opened this issue 4 years ago
If you have a well tessellated world, it could be possible to distort using the vertex shader. Bear in mind that this implies changing the current pipeline (e.g., CardboardDistortionRenderer_renderEyeToDisplay() should not be called anymore).
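To picture the vertex-shader approach: instead of rendering to a texture and warping it in a post-process pass, you apply the radial distortion polynomial to every vertex after projection, which is why the world needs to be well tessellated (the warp is only evaluated at vertices). A minimal sketch in Python; the `k1`/`k2` coefficients are made up for illustration, not taken from any real lens profile:

```python
def distort_vertex(x, y, k1=0.34, k2=0.55):
    """Apply a Cardboard-style radial distortion polynomial to a vertex
    already projected into lens-centered coordinates. In a real vertex
    shader this would run per-vertex on gl_Position; k1/k2 here are
    illustrative, not from a device profile."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Vertices farther from the lens center are displaced more,
# approximating the barrel warp normally done as a post-process.
center = distort_vertex(0.0, 0.0)
edge = distort_vertex(0.5, 0.0)
```

Note this is the forward polynomial; depending on which direction your lens profile models, the shader may need the inverse instead, which is typically approximated numerically or with a fitted inverse polynomial.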
The GVR library only supported the same post-process distortion that Cardboard is doing now.
How does gvr handle async reprojection then?
As mentioned here, async reprojection involves adjusting the position of the rendered frame just before it is seen by the user. This requires a special hook from a phone's display driver, ideally per-scanline. On Daydream ready phones, Google VR Services is able to subscribe to this hook; all other phones and other apps do not have access to this hook.
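The core idea of async reprojection can be sketched independently of the driver hook: the frame was rendered for one head orientation, and just before scanout the image is shifted to account for how the head has rotated since. A toy Python sketch of the yaw-only case under a pinhole camera model (real implementations warp with the full rotation, ideally per-scanline):

```python
import math

def reprojection_offset(yaw_at_render, yaw_at_scanout, fov_rad=math.radians(90)):
    """Approximate async reprojection as a horizontal UV shift of the
    already-rendered frame: a small yaw change between render time and
    scanout maps to an image-space offset under a pinhole projection.
    A sketch of the idea only; the 90-degree FOV is an assumption."""
    delta = yaw_at_scanout - yaw_at_render
    # tan(delta) gives the shift on the image plane; normalize by the
    # half-width of the view frustum to get a UV-space offset.
    return math.tan(delta) / (2.0 * math.tan(fov_rad / 2.0))
```

With zero head motion the offset is zero and the frame is displayed unchanged; the special display-driver hook is only needed to know *when* scanout happens so `yaw_at_scanout` can be sampled late enough.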
Hello, my question was more about how the gvr shaders do the 'position adjustment', since there is a lot of complex math involved. Unfortunately, this interesting part was not open sourced, even though you can obviously check out John Carmack's original work: repo
I managed to get half-screen warp running on select phones without the gvr service, but the main issue is that Android does not expose the exact timing information of the display, so I have to use the choreographer workaround.
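The choreographer workaround amounts to treating the `Choreographer` frame-callback timestamps as vsync-aligned and extrapolating forward, since Android never tells you the true scanout time. A minimal Python sketch of that estimation (the median-of-deltas filtering is my own assumption for rejecting jittery callbacks, not anything from the thread):

```python
def predict_next_vsync(timestamps_ns):
    """Predict the next vsync from a history of Choreographer
    frame-callback timestamps (nanoseconds). Assumes callbacks are
    phase-locked to vsync; uses the median of successive deltas as a
    jitter-resistant estimate of the refresh period."""
    deltas = sorted(b - a for a, b in zip(timestamps_ns, timestamps_ns[1:]))
    period = deltas[len(deltas) // 2]
    return timestamps_ns[-1] + period

# E.g. with clean 60 Hz callbacks the prediction lands one
# ~16.67 ms period after the latest timestamp.
ts = [0, 16_666_667, 33_333_334]
next_vsync = predict_next_vsync(ts)
```

This is exactly where the approach is fragile: any systematic offset between the callback time and actual scanout shows up as a warp-timing error you cannot measure from the app side.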
I am wondering: assuming you are only placing a 2D canvas (for example a UI element) in VR 3D space, is it possible to use projective texturing to map that canvas onto the distortion mesh from the view of the 'HeadSpaceFromStartSpace' matrix? Then you could create a simple shader that takes a 2D surface and a transform for the position of that surface in 3D space and does all the distortion correction in one render pass. I assume the gvr lib is doing something similar, but obviously it is not open source.
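The projective-texturing idea above boils down to: for each distortion-mesh vertex, project it through the canvas's view-projection matrix (built from the head pose and the canvas transform) and remap clip space to [0,1] UVs into the canvas texture. A hedged Python sketch of that per-vertex math; the matrix layout and names are assumptions, not the gvr implementation:

```python
def project_to_uv(vertex, view_proj):
    """Project a 3D mesh vertex through a 4x4 row-major view-projection
    matrix and remap NDC to [0,1] texture coordinates -- the core step
    of projective texturing. Returns None if the vertex is behind the
    projector (no valid texel)."""
    x, y, z = vertex
    v = (x, y, z, 1.0)
    clip = [sum(view_proj[r][c] * v[c] for c in range(4)) for r in range(4)]
    w = clip[3]
    if w <= 0.0:
        return None
    ndc_x, ndc_y = clip[0] / w, clip[1] / w
    return (ndc_x * 0.5 + 0.5, ndc_y * 0.5 + 0.5)

# With an identity matrix the origin lands at the texture center.
IDENTITY = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
uv = project_to_uv((0.0, 0.0, 0.0), IDENTITY)
```

In a shader this is the usual trick of passing the projector matrix as a uniform, emitting the clip-space position as a varying, and doing the perspective divide plus 0.5/0.5 remap in the fragment stage, which would indeed fold the distortion correction into a single pass over the mesh.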