Aupuma opened this issue 1 year ago
Since I know nothing about either URP or VR, and I don't have any VR devices around, I have no idea. If someone figures it out, let me know!
I might take a look at this at some stage as I am particularly interested in Gaussian Splats for VR (seems like the perfect use-case)
In the meantime there is also: https://github.com/clarte53/GaussianSplattingVRViewerUnity
Those debug modes seem to render better in URP+VR (multi-pass), compared to splats,
so I'm guessing it's something to do with RenderGaussianSplats.shader,
float4 centerClipPos = view.pos; // what is view.pos about?
Seems like there is an issue with the camera projection matrix on URP with stereo rendering. I'm not exactly sure, but it looks like the matrix returned at this line is wrong, or doesn't take both eyes into account.
Managed to fix it by using Unity's built-in transformation matrices, but the result is not exactly the same: the splat scale looks slightly smaller. At least it fixed the rendering issue, though. Here is what SplatUtilities.compute looks like for me now.
Added include for Unity built-in stuff:
#include "UnityCG.cginc"
And changed these two lines:
line 81: float4 centerClipPos = mul(UNITY_MATRIX_VP, float4(centerWorldPos, 1));
line 95: float3 cov2d = CalcCovariance2D(splat.pos, cov3d0, cov3d1, _MatrixMV, UNITY_MATRIX_P, _VecScreenParams);
Hi, congrats on the effort you've put into the project! I've been testing it with a VR headset, and I was able to see splats using OpenXR in Multi-Pass mode (Single Pass Instanced does not work). However, when I switched to URP to get some performance benefits, the rendering is reversed: the left image renders on the right eye and vice versa. Any ideas on how to fix it?
Hi, could you please share some hints about how to test it with OpenXR?
I have been chasing what I think could be an issue related to #52. In my case, I am trying to get rendering working on a LookingGlass light field display. The issue, I believe, is that if a camera's render target is set to something custom, GaussianSplatRenderer changes it to BuiltinRenderTextureType.CameraTarget, ignoring the custom target.
Would someone who has this issue be willing to test after commenting out line 190 in GaussianSplatRenderer.cs? This causes SortAndRenderSplats to render directly to the original render target. The hack skips the compositing step of rendering, though, so the image will be washed out. https://github.com/aras-p/UnityGaussianSplatting/blob/fded9c09d041e862502acc57d1bbc4c084ae4e6f/package/Runtime/GaussianSplatRenderer.cs#L190C31-L190C31
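To make the idea concrete, here is a rough paraphrase of the hack, not the exact repository code (the command buffer name and surrounding context are assumptions):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Rough sketch of the hack described above -- paraphrased, not the repo's actual code.
var cmb = new CommandBuffer { name = "RenderGaussianSplats" };
// cmb.SetRenderTarget(BuiltinRenderTextureType.CameraTarget); // <- the line-190 re-bind, commented out
// Without the re-bind, the subsequent splat draw commands go into whatever target
// the camera already had set (e.g. a custom RenderTexture for the LookingGlass),
// at the cost of skipping the compositing step, hence the washed-out image.
```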
It's a real long shot, but could this theoretically work on stand-alone VR platforms like Quest? It works amazingly in the Editor, but several parts of the compute shaders fall over when you change the platform to Android.
I wonder if the required changes are minor? I had a look at the source, but it's black magic to me! It would be life-changing to have splats in stand-alone VR.
For those looking for a VR implementation with the Built-In Render Pipeline (BIRP), I have just put the code for my project online here: https://github.com/ptc-lexvandersluijs/Unity3DGS_VR . This is made with multi-pass rendering, Unity 2023.1.14f and tested with OpenXR / HTC Vive.
The hack to make URP VR render the left eye on the left and the right eye on the right, mentioned in:
With the current 08a270a codebase it looks more like this:
diff --git a/package/Shaders/SplatUtilities.compute b/package/Shaders/SplatUtilities.compute
index a23a282..718d63d 100644
--- a/package/Shaders/SplatUtilities.compute
+++ b/package/Shaders/SplatUtilities.compute
@@ -30,6 +30,7 @@
#include "DeviceRadixSort.hlsl"
#include "GaussianSplatting.hlsl"
+#include "UnityCG.cginc"
float4x4 _MatrixObjectToWorld;
float4x4 _MatrixWorldToObject;
@@ -193,7 +194,8 @@ void CSCalcViewData (uint3 id : SV_DispatchThreadID)
SplatViewData view = (SplatViewData)0;
float3 centerWorldPos = mul(_MatrixObjectToWorld, float4(splat.pos,1)).xyz;
- float4 centerClipPos = mul(_MatrixVP, float4(centerWorldPos, 1));
+ //float4 centerClipPos = mul(_MatrixVP, float4(centerWorldPos, 1));
+ float4 centerClipPos = mul(UNITY_MATRIX_VP, float4(centerWorldPos, 1));
half opacityScale = _SplatOpacityScale;
float splatScale = _SplatScale;
@@ -229,7 +231,8 @@ void CSCalcViewData (uint3 id : SV_DispatchThreadID)
float splatScale2 = splatScale * splatScale;
cov3d0 *= splatScale2;
cov3d1 *= splatScale2;
- float3 cov2d = CalcCovariance2D(splat.pos, cov3d0, cov3d1, _MatrixMV, _MatrixP, _VecScreenParams);
+ float3 cov2d = CalcCovariance2D(splat.pos, cov3d0, cov3d1, _MatrixMV, UNITY_MATRIX_P, _VecScreenParams);
+ //float3 cov2d = CalcCovariance2D(splat.pos, cov3d0, cov3d1, _MatrixMV, _MatrixP, _VecScreenParams);
DecomposeCovariance(cov2d, view.axis1, view.axis2);
Another thing that seems necessary is setting the Unity editor Game view resolution to match the VR single-eye resolution.
Reversed eyes are a fun new way to get a headache.
Hello, sorry to bother, but I was trying to get the current repo working on a Quest 3, and for some reason the rendered models follow the headset around instead of remaining fixed in place. This seems to be an issue with how the XR package sets up the projection or view matrices (I'm not sure), but I was wondering whether you ran into any issues when porting the project to the HTC Vive; I couldn't quite figure out what you changed from the original code. Could you please tell me exactly what you changed to make it work?
If the models follow you around, the first thing that comes to mind is that the 6DOF tracking of the headset isn't working at all, or isn't coming through to the rendering software. It has been a while since I last worked on anything VR-related, so I'm afraid I can't give very specific advice. There is an elaborate controller framework in Unity that maps the various input device types and manufacturers to specific control operations, so that's something you could look into.
Although the very first thing I would do in your situation is try to find a 'known good' Unity example that works with your hardware, and, if that works, then try to find the differences between the two implementations.
Your implementation works on my Quest, but I can't seem to find anything different from the base version of the repository. Do you remember which Unity settings you changed to make it work?
When I try to set it up with the base version of the repository, I get very weird behavior where the model follows the head movement (it doesn't stay fixed in place in the scene). This seems to be caused by incorrect matrices being sent to the shader. I tried to fix this by getting the view and projection matrices for each eye in multi-pass mode and setting them directly on the shader, but nothing changed.
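For reference, the per-eye attempt described here can be sketched with Unity's stereo camera API; this is a hypothetical helper, not code from the repository, and the property names `_MatrixVP` / `_MatrixP` are taken from SplatUtilities.compute:

```csharp
using UnityEngine;

public static class StereoSplatMatrices
{
    // Hypothetical helper: push a given eye's view/projection matrices to the
    // splat compute shader before that eye's pass in multi-pass stereo.
    public static void SetPerEyeMatrices(Camera cam, Camera.StereoscopicEye eye, ComputeShader cs)
    {
        Matrix4x4 view = cam.GetStereoViewMatrix(eye);
        // Convert to the platform-specific GPU projection convention.
        Matrix4x4 proj = GL.GetGPUProjectionMatrix(cam.GetStereoProjectionMatrix(eye), true);
        cs.SetMatrix("_MatrixVP", proj * view); // property names assumed from SplatUtilities.compute
        cs.SetMatrix("_MatrixP", proj);
    }
}
```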
Thank you for your previous response. Having it rendered on my headset is already very nice.
Thanks for this awesome package; for us researchers it's a great step forward for psychological research!
This might be a very specific question for the VR-related people but I will try my luck:
I am using the Varjo Aero (Varjo Base version 4.3.0.14) with Unity (version 2022.3.12f1) to implement some examples. However, I've encountered an issue where the scene only renders correctly in multi-pass mode. In this mode, though, I see unwanted "reflections" of my splats on the left and right edges of the display (see Picture 1). I'm not using any of the URP or HDRP render modes; I'm running the simple GaussianExample scene.
I suspect this may be due to Varjo's focus rendering in multi-pass mode, as explained in the documentation:
“VarjoStereoRenderingMode.Multipass — The scene is rendered in four separate passes: one for each view (left context, right context, left focus, right focus).”
To work around this, I attempted to use other rendering modes, ideally the stereo mode where focus views aren't rendered. Unfortunately, this approach leads to errors (see Picture 2), and I haven't been able to run the scene in the two-pass or stereo rendering modes.
Has anyone encountered similar issues with non-multi-pass modes, or does anyone know why this might be happening?
Thank you very much in advance for any advice!
I have worked on VR support for Gaussian Splatting for a recent publication. Feel free to use the code from here: https://github.com/ninjamode/Unity-VR-Gaussian-Splatting
Tested and working on Vive / Varjo / Quest headsets. I changed a bunch of things for better VR support, including not-strictly-needed things like sorting only once, for the center point between both eyes. It's a little messy; I probably also broke some stuff while working on it. There is also an .apk file to download and try on Quest headsets.
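The "sort once for both eyes" idea can be illustrated like this; the helper below is a sketch of the concept, not code from the linked repository:

```csharp
using UnityEngine;

public static class StereoSortOrigin
{
    // Sketch of the idea mentioned above: instead of sorting splats separately
    // per eye, use the midpoint between the two eye positions as a single sort
    // origin, so the depth sort runs only once per frame.
    public static Vector3 GetSortOrigin(Camera cam)
    {
        // The eye position is the translation column of the inverse view matrix.
        Vector3 leftEye  = cam.GetStereoViewMatrix(Camera.StereoscopicEye.Left).inverse.GetColumn(3);
        Vector3 rightEye = cam.GetStereoViewMatrix(Camera.StereoscopicEye.Right).inverse.GetColumn(3);
        return (leftEye + rightEye) * 0.5f;
    }
}
```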
@aras-p I could create a merge request if you are still interested in this repository to bring at least the basic functionality over to make vive and quest work?
@ninjamode yes that would be most excellent!