sensics / OSVR-RenderManager

Apache License 2.0

VBO-only implementation for GLES2 #209

Closed JeroMiya closed 7 years ago

JeroMiya commented 7 years ago

The current OpenGL implementation makes use of the VAO extension, which may not be available on all OpenGL ES 2.0 platforms.

Two possible solutions:
1) Check for the VAO extension at runtime; use VAO if available, VBO-only if not.
2) Always use the VBO-only implementation when compiling for OpenGL ES 2.0.

VBO-only is easier to implement but may not be as fast on devices that DO support VAO.
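For option 1, the runtime check boils down to scanning the extension string for the VAO extension token. A minimal sketch (the function name and token matching are illustrative, not code from RenderManager; in real code the string would come from `glGetString(GL_EXTENSIONS)`, passed in here so the check itself is testable):

```cpp
#include <string>

// Returns true if the GLES2 VAO extension token appears in the
// space-separated extension string. Matches whole tokens only, so a
// hypothetical "GL_OES_vertex_array_object2" does not count.
bool hasVertexArrayObjectExtension(const char* extensions) {
    if (!extensions) return false;
    const std::string exts(extensions);
    const std::string needle("GL_OES_vertex_array_object");
    std::size_t pos = 0;
    while ((pos = exts.find(needle, pos)) != std::string::npos) {
        const bool startOk = (pos == 0) || (exts[pos - 1] == ' ');
        const std::size_t end = pos + needle.size();
        const bool endOk = (end == exts.size()) || (exts[end] == ' ');
        if (startOk && endOk) return true;
        pos = end;
    }
    return false;
}
```

If the check returns true, the code could call the `glGenVertexArraysOES`/`glBindVertexArrayOES` entry points; otherwise it would re-issue `glBindBuffer` and `glVertexAttribPointer` calls before each draw.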

Also, we are not restoring the VAO binding after a present call. See: https://github.com/sensics/OSVR-RenderManager/issues/208

russell-taylor commented 7 years ago

VAO restoration has been fixed in a pull request. Looking into VBO implementation early next week.

russell-taylor commented 7 years ago

Fixed in 1c9bab144f9010b030f25a21067f0f7a1d19c8b5

russell-taylor commented 7 years ago

Benchmarking at http://www.openglsuperbible.com/2013/12/09/vertex-array-performance/ shows that the VBO-only path takes less than half a microsecond per draw. VAO can be up to four times faster, but both costs are way down in the noise (function-call overhead, not fractions of a millisecond). I think the risk of code drift from maintaining multiple paths outweighs the relative cost of skipping VAO objects, so I plan to implement VBO-only on all code paths.

JeroMiya commented 7 years ago

If the platform has the VRAM for it (maybe have an option defaulted to off), couldn't we just bake the uv coordinates into a screen-sized texture once with the full mesh, and then just draw a single quad (6 vertices instead of almost 40,000) when rendering distortion/time-warp?
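The VRAM cost of such a lookup texture is easy to estimate. A back-of-envelope sketch (the format here, two 16-bit float channels per pixel with one texture per eye, is an assumption for illustration, not what RenderManager uses):

```cpp
#include <cstddef>

// Approximate VRAM needed to bake distortion UVs into a screen-sized
// lookup texture. Assumes two channels per pixel (u and v), 16-bit
// float per channel, one texture per eye. Illustrative only.
std::size_t distortionTextureBytes(std::size_t width, std::size_t height,
                                   std::size_t eyes = 2) {
    const std::size_t channels = 2;        // u and v
    const std::size_t bytesPerChannel = 2; // 16-bit float
    return width * height * channels * bytesPerChannel * eyes;
}
```

At 1920x1080 per eye this comes to roughly 16.6 MB, which is why it would likely need to be an opt-in feature.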

mdutton3 commented 7 years ago

You can find one comparison of the methods here: http://rifty-business.blogspot.com/2014/02/distortion-methods-in-rift-and-their.html?m=1


russell-taylor commented 7 years ago

We could, but then we have to do another texture lookup for every pixel. I expect this to take much longer than rendering the triangles and interpolating the coordinates.
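The trade-off can be sketched as per-pixel texture-fetch counts (these are illustrative counts for comparing the two approaches, not measurements from RenderManager):

```cpp
#include <cstddef>

// Mesh path: rasterize the distortion mesh; UVs are interpolated by
// the hardware, so each output pixel samples only the eye buffer.
std::size_t fetchesMeshPath(std::size_t pixels) { return pixels; }

// Lookup-texture path: each output pixel first samples the baked UV
// texture, then samples the eye buffer at the looked-up coordinate,
// doubling the fetch count.
std::size_t fetchesLookupPath(std::size_t pixels) { return 2 * pixels; }
```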

russell-taylor commented 7 years ago

The idea of using a lower-resolution texture had not occurred to me. That makes the texture-based method more appealing. If this turns out to be a limiting factor for some application, open a ticket for it and I can go through the rendering approach with a fine-toothed comb and figure out how hard it will be to change things over. We're doing some transformations on the mesh and some in other spaces for time warp, so I need to think about how to adjust things.