Closed: 311-code closed this issue 10 months ago
Sorry, I don't get what you mean by "3D side-by-side VR180 video support". Is it about 180-degree video, but with no depth information, so the program can add inferred depth to it? In that case, one could generate a (hemi-)sphere mesh and then displace the vertices.
I will try to explain better. Apologies for the wall of text.
The video will still have depth information from both the left and right sides. My guess is that after we load a 180-degree 3D SBS video into DepthViewer (which currently shows the left and right eyes separated in space), I could write a script that adds a checkbox to the GUI. The script would cut the mesh in half down the middle, then physically move the left part of the video (the left mesh and its depth map) and the right part (the right mesh and its depth map) to the same spot in Unity world space, overlaying them into one image. This might not work, though; it was just something I thought could be worth trying for VR.
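Just to make the idea concrete, here is a minimal sketch of the geometry in plain Python. This is only an illustration of the "split and overlay" step described above: the function names (`split_sbs`, `recenter`) are my own, and DepthViewer's actual mesh handling is Unity C#, so none of this is the project's real API.

```python
# Hypothetical sketch of the "split the SBS frame, overlay both halves" idea.
# Not DepthViewer code; just the underlying geometry in plain Python.

def split_sbs(width):
    """Return (left, right) horizontal pixel ranges of a side-by-side frame."""
    half = width // 2
    return (0, half), (half, width)

def recenter(vertices, target=(0.0, 0.0, 0.0)):
    """Translate a mesh so its centroid lands on `target`.

    `vertices` is a list of (x, y, z) tuples. Applying this to both the
    left-half mesh and the right-half mesh with the same `target` puts
    them in the same spot in world space, which is the overlay proposed
    above.
    """
    n = len(vertices)
    cx = sum(v[0] for v in vertices) / n
    cy = sum(v[1] for v in vertices) / n
    cz = sum(v[2] for v in vertices) / n
    return [(x - cx + target[0], y - cy + target[1], z - cz + target[2])
            for (x, y, z) in vertices]
```

In Unity terms, `recenter` would amount to setting both half-meshes' `transform.position` to the same point; whether the overlapping textures then render sensibly per eye is exactly the open question.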
I am using depth maps with BEiT, and I can see it creates a very decent 3D mesh for both sides of a 3D video as it is now.
With the addition of the 6DoF positional tracking option (now at the correct scale thanks to the OVR Advanced Settings workaround), my hope is that we will notice increased and more accurate 3D volume, both from the left and right meshes/depth maps being moved to the same position and from the extra depth cues of positional tracking, or simply from being able to lean around objects further before the 3D breaks.
The 6DoF setting actually does enable 6DoF for the VR headset for me, by the way (it should be on by default when VR mode is detected). I just couldn't tell before, because the object scale is too large, until I did the playspace offset fix and moved myself to the location of the scaled-down mesh.
I posted this in the other thread, but in case someone is reading and missed it: use OVR Advanced Settings and force the "playspace offset" setting (since there appears to be no true forward/backward mesh movement option in DepthViewer right now), or use spacegrab in OVR Advanced Settings: https://www.youtube.com/watch?v=hFSzvnffNww. The default VR camera in Unity is very far away (-150 units on the z axis) from the large scaled mesh object.
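For anyone following along, the offset you would dial in is just the difference between the camera position and the mesh position. This is a hypothetical illustration (the `playspace_offset` helper is my own, not part of DepthViewer or OVR Advanced Settings), assuming the camera sits around z = -150 and the mesh is near the origin, as described above.

```python
# Hypothetical illustration of the playspace-offset workaround: the Unity
# VR camera sits near z = -150 while the mesh is near the origin, so the
# playspace offset is simply mesh position minus camera position.

def playspace_offset(camera_pos, mesh_pos):
    """Offset to apply to the playspace so the viewer lands at the mesh."""
    return tuple(m - c for c, m in zip(camera_pos, mesh_pos))

print(playspace_offset((0.0, 0.0, -150.0), (0.0, 0.0, 0.0)))
```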
Just trying to further enhance the 3D quality and 3D volume here by perhaps figuring out a way to force the left and right meshes to combine.
But to get to the point: I was going to try to adjust all of this in Unity myself and submit it to you, but I cannot get past the ONNX DLL errors. :(
Hopefully this very confusing drawing will make sense, haha. This idea is either really bad or really good, and I'm hoping it would add more detail. I tried to break it down in this image so it's easier to understand.
The "current view" part of the image is basically what you see in VR. (DepthViewer doesn't combine the left and right into one image for VR with 3D SBS content.)
The bottom shows what I am trying to achieve, where it is combined (in Unity space, I mean, not by messing with any camera frustum or offsets). The only side effect I can imagine is that the left and right eye textures would be combined, since they would occupy the same place in the Unity coordinate space, and it might not look right. I can't really picture it until I try it.
If a texture from only the left eye were overlaid on both the left and right eyes' depth-map meshes, I wonder if that would increase the 3D volume too, since there would be more depth-map mesh volume. Or it could be that the left texture doesn't map well onto the right depth-map mesh and would look bad.
Of course, none of this would apply to 3D glasses users or people using the left/right SBS hack with a 3D monitor like threedeejay; these are just some ideas I wanted to try out, but I can't at the moment due to the project's DLL errors.
It's an interesting approach, though I still don't fully get it 😅 Sorry. But it's about 6DoF, right? Here are some related points:
Yes, it's for enhancing 6DoF. I have indeed used a sphere mesh in programs like DEO VR player, but I'm thinking more about adding positional tracking to 3D SBS VR videos without NeRF/Gaussian splatting and single-frame approaches. I think I will focus on fixing the Unity project for now for VR, so I don't bug you with these ideas, but these are all good points you have here to consider.
While I was using this app, I was thinking about how well the monoscopic-to-VR video conversion works, even with just MiDaS; BEiT and Marigold are mind-blowing.
If we were to add 3D side-by-side VR180 video support with the realtime depth maps (and Marigold, if it ever magically gets optimized for realtime), would this allow for VR180 videos with better 3D and positional tracking?
I just loaded a video from my VR180 camera; the left and right sides appear and both have depth maps, they just need to be combined into one image somehow. This could make for a much more comfortable and immersive viewing experience, since the 3D would be better with the side-by-side image providing more views.
I've realized this app actually does have 6DoF positional tracking for the headset; the problem is that the objects are so large and far away that it feels like 3DoF. (Edit: the VR camera is very far away by default.) Is there no way to physically move forward/backward in the space?