dsanjit opened this issue 6 years ago
Stereo rendering is not currently supported. It's something we'd like to have eventually.
@romainguy Is there anything to stop us from implementing this ourselves as two Views (one for each eye) into the same Scene?
That would work unless you need a custom post-process step (for foveated rendering, etc.). It also wouldn't be as efficient as if the engine supported VR natively.
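Roughly, that two-View approach looks like this (a minimal sketch using Filament's Kotlin bindings; the FOV, eye separation, and side-by-side viewport split are illustrative assumptions, not values from this thread):

```kotlin
import com.google.android.filament.*

// Create one View per eye, both observing the same Scene, each with its own
// Camera offset by roughly half an interpupillary distance (~64 mm total).
fun createEyeViews(engine: Engine, scene: Scene, width: Int, height: Int): Pair<View, View> {
    val halfWidth = width / 2
    val views = listOf(-0.032, 0.032).map { eyeOffset ->
        val camera = engine.createCamera(EntityManager.get().create())
        camera.setProjection(90.0, halfWidth.toDouble() / height, 0.1, 100.0, Camera.Fov.VERTICAL)
        camera.lookAt(eyeOffset, 0.0, 0.0, eyeOffset, 0.0, -1.0, 0.0, 1.0, 0.0)
        engine.createView().apply {
            this.scene = scene
            this.camera = camera
        }
    }
    // Side-by-side viewports on the same swap chain.
    views[0].viewport = Viewport(0, 0, halfWidth, height)
    views[1].viewport = Viewport(halfWidth, 0, halfWidth, height)
    return views[0] to views[1]
}
```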
@romainguy Any chance of a hint on the best approach to GoogleVR SDK integration? The lifecycle is based around GvrView, but the underlying Surface object it creates is not exposed, so I'm not sure how I can create a Filament SwapChain that targets the underlying buffers, or if that's in fact what I should be doing at all.
I read that we can't just retrofit Filament into our GLSurfaceView applications and carry on as before, so I'm wondering if the same is true of libraries like GoogleVR that manage the underlying surface lifecycle internally.
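For reference, the pattern the question assumes looks roughly like this (a hedged sketch: Filament can target any android.view.Surface you can obtain, which is exactly what GvrView does not hand out; the class name is hypothetical):

```kotlin
import android.view.SurfaceHolder
import com.google.android.filament.Engine
import com.google.android.filament.SwapChain

// Hypothetical helper: binds a Filament SwapChain to a SurfaceView's Surface.
class FilamentSurfaceBinder(private val engine: Engine) : SurfaceHolder.Callback {
    var swapChain: SwapChain? = null
        private set

    override fun surfaceCreated(holder: SurfaceHolder) {
        // The step GvrView blocks: it never exposes its underlying Surface.
        swapChain = engine.createSwapChain(holder.surface)
    }

    override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {
        // Recreating the swap chain on size changes is left out of this sketch.
    }

    override fun surfaceDestroyed(holder: SurfaceHolder) {
        swapChain?.let { engine.destroySwapChain(it) }
        swapChain = null
    }
}
```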
Interesting, the way GvrView works is not what I would have expected (and it's incompatible with Vulkan too). We'll take a look. Libraries should not manage the surface like this.
I just found a lower-level API that seems to be what you/we want: https://developers.google.com/vr/reference/android/com/google/vr/ndk/base/GvrLayout
That looks promising! I will report back (and thanks again)
I can get a trivial rendering working with GvrLayout.
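In outline, the wiring looks something like this (a minimal sketch assuming GvrLayout from com.google.vr.ndk.base and Filament's Java bindings; the activity structure and names are illustrative):

```kotlin
import android.app.Activity
import android.os.Bundle
import android.view.SurfaceView
import com.google.android.filament.Engine
import com.google.vr.ndk.base.GvrLayout

class VrActivity : Activity() {
    private lateinit var gvrLayout: GvrLayout
    private lateinit var engine: Engine

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        engine = Engine.create()

        // Unlike GvrView, GvrLayout wraps a presentation view we supply,
        // so we keep control of the Surface that Filament renders into.
        val surfaceView = SurfaceView(this)
        gvrLayout = GvrLayout(this)
        gvrLayout.setPresentationView(surfaceView)
        setContentView(gvrLayout)

        // Create the Filament SwapChain from surfaceView's Surface once
        // SurfaceHolder.Callback.surfaceCreated() fires (see the sketch above).
    }

    override fun onPause() { super.onPause(); gvrLayout.onPause() }
    override fun onResume() { super.onResume(); gvrLayout.onResume() }
    override fun onDestroy() {
        gvrLayout.shutdown()
        engine.destroy()
        super.onDestroy()
    }
}
```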
Obviously this is ignoring the stereo image requirements and the lens distortion correction post-processor, which might be challenging to integrate as it appears to be OpenGL-specific.
I'm going to mark this down as possible but complicated and circle back once better minds than mine have done the heavy lifting. I've got everything I need for integrating Filament with my own VR SDK now, so I just needed to confirm that 3rd-party VR SDK integration was going to be possible in the future.
Looks like we'll need to add support indeed. And it looks like it will be GL-specific and not work with Vulkan :/
I'm working on using the Filament engine with VR headsets like the Oculus Quest or HTC Vive, and I've found that GLSurfaceView is not supported. Can you give some methods or suggestions on how to run the Filament engine with a GLSurfaceView? Thanks very much ~
@oahc09 I haven't actually done it, but the Oculus mobile SDK appears to support integration with graphics libraries that need to own the EGL context as long as you respect various overlapping lifecycles.
@oahc09 You cannot use Filament with a GLSurfaceView as Filament needs to own the EGL context. What you'll have to do is integrate whatever VR SDK you need into Filament's backend (filament/backend in the source tree). It's something we want to do eventually.
Yes, I'm trying to integrate a VR SDK using the NDK by changing Filament's engine.cpp/platformegl.cpp code, but I'm just in the exploration stage because I'm not very familiar with the rendering framework. If you have any flow charts of the rendering framework, can you share them? They would be very helpful to me. Thanks very much.
Any progress on VR support please? Or maybe I missed something...
@dsanjit I don't believe there are any plans for specific VR support in Filament at the moment. I have managed to use Filament successfully with at least one VR SDK and I think the method would extend to the others. Happy to sketch out my current approach if it would help.
You can refer to this branch: https://github.com/NibiruDev/filament_V1.4.0_Nibiru. It makes Filament work on VR HMDs.
@rawnsley Thanks! Did you use GL_OVR_multiview or similar to do multiview rendering? If yes, some details would help.
@oahc09 Thanks, will check!
@dsanjit I don't think Filament supports multiview rendering. I use two Filament Views (one for each eye) that render to texture buffers that I then submit to the VR engine for display. Thread and EGL context management is a bit painful, but possible.
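That per-eye render-to-texture setup can be sketched like this (hedged: it uses Filament's RenderTarget API; the texture format, sizes, and helper name are illustrative, not rawnsley's actual code):

```kotlin
import com.google.android.filament.*

// Hypothetical helper: creates an offscreen color target one eye View renders into.
fun createEyeTarget(engine: Engine, width: Int, height: Int): RenderTarget {
    val color = Texture.Builder()
        .width(width).height(height).levels(1)
        .sampler(Texture.Sampler.SAMPLER_2D)
        .format(Texture.InternalFormat.RGBA8)
        .usage(Texture.Usage.COLOR_ATTACHMENT or Texture.Usage.SAMPLEABLE)
        .build(engine)
    return RenderTarget.Builder()
        .texture(RenderTarget.AttachmentPoint.COLOR, color)
        .build(engine)
}

// Usage: eyeView.setRenderTarget(createEyeTarget(engine, eyeWidth, eyeHeight))
```

If the VR SDK owns its swap-chain textures instead, Texture.Builder().importTexture(id) should let Filament render into an existing GL texture rather than allocating a new one.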
One problem is that you need to wait until Filament has finished rendering before sending the frame to the VR SDK, but the only option is currently flushAndWait. This hurts efficiency and limits the frame rate, but in most VR SDKs the problem is reduced by asynchronous timewarp (which decouples the head rotation from the frames being submitted). In 6DoF headsets or scenes with a large amount of movement (independent of the head rotation) the artefacts will be worse.
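Concretely, the frame loop described above ends up shaped like this (a sketch assuming a recent Filament where Renderer.beginFrame takes a frame time; the submit callback stands in for whatever hand-off the particular VR SDK requires):

```kotlin
import com.google.android.filament.*

fun renderStereoFrame(
    engine: Engine,
    renderer: Renderer,
    swapChain: SwapChain,
    leftEye: View,
    rightEye: View,
    frameTimeNanos: Long,
    submitToVrSdk: () -> Unit  // hypothetical hand-off to the VR compositor
) {
    if (renderer.beginFrame(swapChain, frameTimeNanos)) {
        renderer.render(leftEye)
        renderer.render(rightEye)
        renderer.endFrame()
    }
    // Block until the GPU is done so both eye textures are complete before
    // the VR SDK samples them. This is the stall described above.
    engine.flushAndWait()
    submitToVrSdk()
}
```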
Any updates on VR support for Filament?
Would like to use Filament in a Daydream / Oculus Go / Rift project. Is stereo supported along with multiview and other optimizations?