pupitetris / cesium-vr

Plugin for Cesium web-based virtual globe software to support the Oculus VR headset

Works on Meta Quest 2, but not on Meta Quest Pro #1

Open rumicuna opened 10 months ago

rumicuna commented 10 months ago

The demo works amazingly on a Meta Quest 2 (except for a small issue where the right eye's visuals shake a bit), but when using a Meta Quest Pro there are two issues (listed below). Note that these issues can't be reproduced with the Chrome 'Immersive Web Emulator' developed by Meta by choosing its 'Meta Quest Pro' option; one has to use an actual Meta Quest Pro headset to reproduce them.

The issues are:

  1. What the left eye and the right eye are pointing at is way off, so the stereo illusion does not work at all.
  2. When moving one's head, the warping and FOV are wrong, so the world warps in a way that does not match the rotation of one's head, breaking the illusion of floating above the globe.

I've set up a live WebXR instance of the cesium-vr demo so that anyone can reproduce (and confirm) this issue; you can run it here: https://rumicuna.github.io/cesium-vr/htdocs/

Note that I have successfully run all the WebXR demos below on my Quest Pro, so the issues do not seem to be related to my hardware: https://immersive-web.github.io/webxr-samples/

@pupitetris, you mentioned that you own a Meta Quest 2 headset. Even though you probably can't reproduce the issues yourself without a Meta Quest Pro, I'm hoping you have some insight into how to fix them: maybe there are flags we need to add to make it work on the Quest Pro, or you have ideas about what code I would need to change. The FOV is different between the Quest 2 and the Quest Pro, and I did find some code you wrote where the FOV is defined, but changing the values there did not have the desired effect. I have both a Quest 2 and a Quest Pro, and the immersive-web demos above all work well on both headsets, so there seems to be some code they use to detect the headset and adjust the warping appropriately that maybe you are missing?

You also mentioned performance issues when running your Cesium code on a Quest 2, but I can report that on a Quest Pro (thanks to the roughly doubled performance of the headset hardware) your code runs amazingly well and is perfectly usable. By closing each eye in turn to avoid seeing the issues above, I can tell that the performance is there to run Cesium smoothly.

Thank you for all your extremely hard work. Other than these small, solvable issues, you have pretty much solved the problem of running Cesium on VR headsets, and all my colleagues and I are very thankful, as I'm sure are many, many other people!

rumicuna commented 10 months ago

Here's a visual of what I'm seeing on the Quest 2 vs the Quest Pro. On the Pro, even though I can tell that each eye is pointing at roughly the same area when I rotate my head, the views are still far enough apart that visually fusing them in the headset is impossible. (Note that there's also the FOV/warping mismatch, which I can't show in the graphics below and which seems to be a separate issue.)

(two screenshots comparing the per-eye views on the Quest 2 and the Quest Pro)

pupitetris commented 10 months ago

IPD (inter-pupillary distance) and the projection transform matrices are provided by WebXR to the application. If I can't reproduce the problem by specifying a Quest Pro headset in the emulator, then maybe providing those parameters to me here could help in understanding why the projections are so far off.

The relevant location is VRPose.js:153

This function is called once per eye to calculate the camera frustum and the delta transformation that shifts the view to the left and to the right before each eye's render. This is where the input matrices can be obtained to see what is going on. Breaking at this location, please provide dumps for each eye of:

  xrView.eye
  xrView.transform.inverse.matrix
  xrView.projectionMatrix
  xrPose.transform.inverse.matrix

I'm guessing these are calculated differently on the Quest Pro for some reason.
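
For reference, a minimal logging sketch of that kind of dump (the surrounding function name is hypothetical; xrView and xrPose are the standard WebXR XRView and XRViewerPose that reach the per-eye calculation):

  // Hypothetical hook around VRPose.js:153 -- dump the per-eye WebXR inputs.
  // Array.from() turns the Float32Arrays into plain arrays for cleaner logs.
  function dumpEyeParameters(xrView, xrPose) {
    console.log('xrView.eye:', xrView.eye);
    console.log('xrView.transform.inverse.matrix:', Array.from(xrView.transform.inverse.matrix));
    console.log('xrView.projectionMatrix:', Array.from(xrView.projectionMatrix));
    console.log('xrPose.transform.inverse.matrix:', Array.from(xrPose.transform.inverse.matrix));
  }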

If the app never breaks at VRPose.js:153, then for some reason the WebXR calculation path is not being taken and the camera parameters are being calculated through the "legacy" path (non-WebXR, Cardboard hardware and so on). In that case the problem is about correctly detecting that the Quest Pro does provide a WebXR implementation.
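
As a quick sanity check of that detection, WebXR availability can be probed with the standard WebXR Device API (nothing cesium-vr specific here):

  // If this logs true in the headset's browser, WebXR is exposed and the
  // WebXR path -- not the legacy one -- should be the one taken.
  if (navigator.xr) {
    navigator.xr.isSessionSupported('immersive-vr').then((supported) => {
      console.log('immersive-vr supported:', supported);
    });
  } else {
    console.log('navigator.xr is unavailable; only the legacy path can work');
  }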

rumicuna commented 10 months ago

Unfortunately it seems you will not be able to reproduce the problem by specifying a Quest Pro headset in the emulator. I added console.log messages at VRPose.js:153 to dump the variables you specified, and changing which headset the emulator uses has no effect: the values are always the same, even after restarting the debugging session and the WebXR simulator and clearing the cache. I suspected this might be the case before I tested, because changing the headset in the simulator had no effect on the visuals in the browser even though the Quest 2 and the Quest Pro have different FOVs and should look different; printing out the variables confirmed it.

Next up, I'm installing the Meta Quest Developer Hub so that I may get the variables from the actual Quest 2 and Quest Pro headsets.

The info below is probably not useful, but I'm including what I get when selecting a Quest 2 or a Quest Pro in the WebXR simulator (the output is identical for both):

xrView.eye: left

xrView.transform.inverse.matrix:
  [ 0.9996212124824524, 0.0068848105147480965, -0.026646755635738373, 0,
    -8.421133235181344e-10, 0.968204915523529, 0.25015830993652344, -0,
    0.027521813288331032, -0.25006353855133057, 0.9678382277488708, 0,
    0.020000001415610313, -0.12355826050043106, -0.031924132257699966, 1 ]

xrView.projectionMatrix:
  [ 0.6010186672210693, 0, 0, 0,
    0, 1, 0, 0,
    0, 0, -1.0002000331878662, -1,
    0, 0, -0.20002000033855438, 0 ]

xrPose.transform.inverse.matrix:
  [ 0.9996212124824524, 0.0068848105147480965, -0.026646757498383522, 0,
    -3.763336864359701e-10, 0.968204915523529, 0.25015830993652344, -0,
    0.02752181515097618, -0.25006353855133057, 0.9678382277488708, 0,
    4.802612821319663e-11, -0.12355825304985046, -0.03192415460944176, 1 ]

xrView.eye: right

xrView.transform.inverse.matrix:
  [ 0.9996212124824524, 0.0068848105147480965, -0.026646755635738373, 0,
    -8.421133235181344e-10, 0.968204915523529, 0.25015830993652344, -0,
    0.027521813288331032, -0.25006353855133057, 0.9678382277488708, 0,
    -0.019999999552965164, -0.12355826050043106, -0.031924132257699966, 1 ]

xrView.projectionMatrix: identical to the left eye's

xrPose.transform.inverse.matrix: identical to the left eye's
pupitetris commented 10 months ago

OK, I would have to revisit the topic, but it may actually be a bug in WebXR, Meta's web browser, etc. As far as I can recall, the demos use two cameras, configured directly from the WebXR parameters. To plug the multi-view into the Cesium pipeline, I did what the legacy code does: take the monoscopic camera of the scene, translate it for the left eye, render, then translate it for the right eye and render again. The legacy code translates by semi-arbitrary amounts along the camera's "local" x-axis (a URL describing the technique is provided in the code).

What I do for WebXR is take a delta between the transformation of the xrPose and the transformation of the current (left or right eye) xrView, and apply that delta to the local transform of the Cesium camera (before transforming to the actual location/angles). This way I use the information that comes from the headset (including the projection matrix, from which the frustum is derived), following WebXR's recommended parameters. The code also supports an arbitrary number of xrViews, not just left/right, so it can render, for example, for a spectator camera.
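
To make that concrete, here is a minimal sketch (not the actual VRPose.js code) of computing such a per-eye delta from the standard WebXR matrices with Cesium's Matrix4; the function name and the exact multiplication order are illustrative:

  // Sketch only: the eye's pose expressed relative to the viewer (head) pose,
  // i.e. delta = pose^-1 * view, built from the column-major WebXR matrices.
  // Assumes the global Cesium namespace, as on the demo page.
  function computeEyeDelta(xrView, xrPose) {
    const eyeToReference = Cesium.Matrix4.fromColumnMajorArray(xrView.transform.matrix);
    const referenceToHead = Cesium.Matrix4.fromColumnMajorArray(xrPose.transform.inverse.matrix);
    return Cesium.Matrix4.multiply(referenceToHead, eyeToReference, new Cesium.Matrix4());
  }

The resulting matrix is the kind of delta that would then be applied to the Cesium camera's local transform before rendering that eye, as described above.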

What changes in the Quest Pro? Well, just like the Quest 3, the Quest Pro has two independent displays, one per eye. That could be making a difference.

So one option could be to manually force the use of the legacy camera translation and see what happens, just to check whether the Quest 2 and the Pro look the same when both use the legacy camera translation.

Since the transformations seem to be the same with both headsets (this still has to be checked with the actual HMD hardware, not the emulator), no change seems needed in the interpretation of the matrices on Cesium's side, which is why I suspect a bug in WebXR or below it. I just don't see evidence that the code needs to change, unless it's as a workaround. It is weird that the projection matrix is the same, because the Quest Pro has a different FOV than the Quest 2...
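
For what it's worth, the per-eye field of view can be decoded directly from a projection matrix, which would make a headset difference visible in those dumps. A small debugging helper (standard off-axis perspective algebra; m is the flat, column-major array that WebXR returns as xrView.projectionMatrix):

  // Decode the half-angle tangents of an off-axis perspective projection
  // matrix and convert them to degrees. Column-major indexing as in WebXR.
  function projectionToFov(m) {
    const deg = (t) => Math.atan(t) * 180 / Math.PI;
    return {
      left:  deg((1 - m[8]) / m[0]),
      right: deg((1 + m[8]) / m[0]),
      up:    deg((1 + m[9]) / m[5]),
      down:  deg((1 - m[9]) / m[5]),
    };
  }

On the emulator dump above (m[0] ≈ 0.601, m[5] = 1, m[8] = m[9] = 0) this yields roughly 59° left/right and 45° up/down per eye; the real headsets would presumably report their own, differing values.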

rumicuna commented 10 months ago

So that I can test what you suggest on the Quest 2 and Pro, what is the best way to turn the legacy camera translation on? It sounds like you already have this working on the Quest 2, so I'm sure I'm just not enabling that option correctly.

I tried setting (in index.js line 100)

scene.useWebXR = false;

In which case I get just a black screen on the Quest 2. The Chrome simulator still seems to work ok, the only difference is that it only displays one image instead of two side by side (one for each eye) even with the "stereo" button selected.

I also tried, without luck, commenting out these lines so that the legacy function is called:

  //if (pose.isWebXR) {
  //  return applyPoseParamsToCameraXR(pose, params, camera);
  //}
  // Plain old WebVR.
  return applyPoseParamsToCameraLegacy(pose, params, camera);
rumicuna commented 10 months ago

Question: is there a way to set the camera on both eyes to be exactly the same (not just the same position but also the same pointing angle)? I would lose the stereo effect, but maybe this would make the Quest Pro usable. I assume the warping effect would still be wrong on the Pro, but I'd like to see if it works. I tried setting the interpupillary distance scalar to zero (CesiumVR.js line 72):

    this.IPDScale = 0;

Related to this, something strange I noticed with the Quest Pro is that the problem doesn't seem to be the eye distance: the eye positions look fine, but the eyes are pointing in different directions (it looks just like the real world does when I cross my eyes), maybe around 30 degrees apart. This is why I'm asking above about setting the angle in addition to the position (since the position might not be the issue).
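
In case it helps the experiment, a crude way to test this in the WebXR path would be to force the per-eye delta to the identity, so both eyes render the viewer pose (illustrative only, reusing the hypothetical computeEyeDelta name from the sketch earlier in this thread):

  // Illustrative only: collapse the stereo pair into a single monoscopic view
  // by ignoring the per-eye offset entirely.
  function computeEyeDelta(/* xrView, xrPose */) {
    return Cesium.Matrix4.clone(Cesium.Matrix4.IDENTITY);
  }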

pupitetris commented 10 months ago

This setting is for the legacy code only; you would have to force the legacy camera calculations first. Too bad I am really busy right now and don't have the time to change the code so that the different features are easy to turn on and off.

pupitetris commented 10 months ago

> So that I can test what you suggest on the Quest 2 and Pro, what is the best way to turn the legacy camera translation on? It sounds like you already have this working on the Quest 2, so I'm sure I'm just not enabling that option correctly.
>
> I tried setting (in index.js line 100)
>
>     scene.useWebXR = false;
>
> In which case I get just a black screen on the Quest 2. The Chrome simulator still seems to work ok, the only difference is that it only displays one image instead of two side by side (one for each eye) even with the "stereo" button selected.
>
> I also tried, without luck, commenting out these lines so that the legacy function is called:
>
>       //if (pose.isWebXR) {
>       //  return applyPoseParamsToCameraXR(pose, params, camera);
>       //}
>       // Plain old WebVR.
>       return applyPoseParamsToCameraLegacy(pose, params, camera);

The screen goes black because useWebXR needs to be set for the Cesium buffer to be copied to the HMD's hardware buffer.

So yes, force the legacy camera calculations while making sure that isWebXR is still true so that the bitblit onto the HMD buffer is done. The bitblit takes place here: Scene.js:2951. Make sure the preconditions are met so that this step takes place.

Also force the legacy pose-parameter preparation, here: VRPose.js:422
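
For reference, a sketch of the camera-dispatch part of that suggestion (FORCE_LEGACY_CAMERA is a hypothetical flag; the function names come from the snippet quoted above):

  // Hypothetical switch: force the legacy per-eye translation while leaving
  // pose.isWebXR untouched, so the Scene.js:2951 blit onto the HMD buffer still runs.
  const FORCE_LEGACY_CAMERA = true;

  if (pose.isWebXR && !FORCE_LEGACY_CAMERA) {
    return applyPoseParamsToCameraXR(pose, params, camera);
  }
  // Plain old WebVR / forced legacy path.
  return applyPoseParamsToCameraLegacy(pose, params, camera);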