Closed: MackDoesDev closed this issue 2 years ago
Thanks for reporting it.
Unfortunately, I don't have a Vive Pro Eye or any other stereo-camera Vive to test with. I suspect the issue stems from it using a vertical frame layout; the OpenVR API doesn't have any documentation on how the frames are arranged.
If possible, could you capture an image of the raw camera output, for example through the Camera application in Windows? It would help a lot in determining the frame transformation that's needed.
Sure, below you'll find the camera output from the SteamVR camera test. The upper image is from the right camera.
Okay, after some experimentation, I found the right changes to make it work correctly.
First, in FSteamVRPassthroughRenderer::DrawPostProcessMatPassthrough_RenderThread (see the sketch below):
ln 472: PassParameters->FrameUVOffset = FVector2D(0, 0.5);
ln 486: PassParameters->FrameUVOffset = FVector2D(0, 0);
Second, in FSteamVRPassthroughRenderer::GetTrackedCameraUVTransform, use
TransformToCamera = CameraProjectionInv * FrameCameraToTrackingPose * MVP;
regardless of the CameraId.
I know you can't just change this outright, since it breaks the code for other headsets, but maybe it helps with solving this issue. Unfortunately, I'm not experienced enough with all this yet to come up with a generalized solution.
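To make the first change a bit more concrete, this is how I understand the sub-image placement in the shared frame. It's only a sketch: the helper name GetFrameUVOffset and the layout enum are made up, and I'm assuming the UV origin is at the top of the texture with the right camera's image in the upper half, as in my screenshot.

// Hypothetical helper (not from the plugin): where each camera's sub-image
// sits inside the shared frame texture, assuming V = 0 at the top.
#include "CoreMinimal.h"

enum class EStereoFrameLayout : uint8
{
    HorizontalSideBySide, // left camera in the left half, right camera in the right half
    VerticalStacked       // right camera in the top half, left camera in the bottom half (Vive Pro Eye)
};

// CameraIndex 0 = left camera, 1 = right camera.
FVector2D GetFrameUVOffset(EStereoFrameLayout Layout, int32 CameraIndex)
{
    if (Layout == EStereoFrameLayout::HorizontalSideBySide)
    {
        // Each camera spans half of the texture in U.
        return FVector2D(CameraIndex == 0 ? 0.0f : 0.5f, 0.0f);
    }

    // Vertical layout: each camera spans half of the texture in V,
    // with the left camera's image starting at V = 0.5.
    return FVector2D(0.0f, CameraIndex == 0 ? 0.5f : 0.0f);
}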
Thanks, I'll see about implementing the fix when I get the energy.
I would have thought the first fix would be enough. With a horizontal frame layout, FrameCameraToTrackingPose returns the position for the left camera, and CameraLeftToRightPose provides an additional transform to the right one. The only way I could see the second fix working is if the transform is baked into the camera projection matrix. You might want to verify that the passthrough view lines up with the SteamVR room view.
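To illustrate what I mean, here is a simplified sketch of the composition. The variable names match the ones we've been discussing, but where exactly the left-to-right offset gets multiplied in is an approximation of the intent, not a copy of the actual source.

// Simplified sketch of the UV transform composition, using UE's FMatrix.
#include "CoreMinimal.h"

FMatrix GetTransformToCamera(
    const FMatrix& CameraProjectionInv,
    const FMatrix& FrameCameraToTrackingPose,
    const FMatrix& CameraLeftToRightPose,
    const FMatrix& MVP,
    int32 CameraId)
{
    if (CameraId == 1)
    {
        // Right camera: start from the left camera's pose and apply the
        // additional left-to-right transform before projecting. The exact
        // order (and whether an inverse is needed) may differ in the real code.
        return CameraProjectionInv * CameraLeftToRightPose * FrameCameraToTrackingPose * MVP;
    }

    // Left camera, and also what the proposed workaround uses for both cameras:
    return CameraProjectionInv * FrameCameraToTrackingPose * MVP;
}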
Thanks for the hint. I just checked, and the Room View kind of lines up. "Kind of", because it only uses the left camera image on both eyes. But I did a rough check comparing the image for each eye against the real world, and they line up just fine.
I'm not sure what exactly is wrong with CameraLeftToRightPose, but it's the reason the image on the right eye was mirrored. So I assume it contains a negative x-value somewhere. If I find the time tonight, I will log out the pose data and post it here.
It's calculated from the result of FSteamVRPassthroughRenderer::GetTrackedCameraEyePoses(), which queries the Prop_CameraToHeadTransforms_Matrix34_Array property from SteamVR. It's possible the Vive driver doesn't support the property and returns invalid matrices. A fallback to get it working might be to query the singular Prop_CameraToHeadTransform_Matrix34 property instead and mirror it for the second camera.
The function should log a warning with "GetTrackedCameraEyePoses error" if SteamVR doesn't have the property. It might fail without any error though, if the driver only populates the left matrix.
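Roughly what I have in mind for the fallback, sketched with the raw OpenVR calls. The plugin wraps these differently, and mirroring the single transform for the second camera is a guess rather than documented behavior.

// Sketch: query the per-camera head transforms, falling back to the single
// Prop_CameraToHeadTransform_Matrix34 if the array property is unavailable.
// Error handling is reduced to the bare minimum for illustration.
#include <openvr.h>

bool GetCameraToHeadTransforms(vr::HmdMatrix34_t& OutLeft, vr::HmdMatrix34_t& OutRight)
{
    vr::IVRSystem* System = vr::VRSystem();
    if (!System)
    {
        return false;
    }

    vr::ETrackedPropertyError Error = vr::TrackedProp_Success;
    vr::HmdMatrix34_t Transforms[2] = {};

    uint32_t BytesRead = System->GetArrayTrackedDeviceProperty(
        vr::k_unTrackedDeviceIndex_Hmd,
        vr::Prop_CameraToHeadTransforms_Matrix34_Array,
        vr::k_unHmdMatrix34PropertyTag,
        Transforms,
        sizeof(Transforms),
        &Error);

    if (Error == vr::TrackedProp_Success && BytesRead == sizeof(Transforms))
    {
        OutLeft = Transforms[0];
        OutRight = Transforms[1];
        return true;
    }

    // Fallback: the single camera-to-head transform.
    OutLeft = System->GetMatrix34TrackedDeviceProperty(
        vr::k_unTrackedDeviceIndex_Hmd,
        vr::Prop_CameraToHeadTransform_Matrix34,
        &Error);

    if (Error != vr::TrackedProp_Success)
    {
        return false;
    }

    // Mirror the X translation for the second camera. This assumes the two
    // cameras sit symmetrically around the HMD center, which is untested.
    OutRight = OutLeft;
    OutRight.m[0][3] = -OutLeft.m[0][3];
    return true;
}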
I updated the code with the fixes, and added some extra error checks and fallbacks for retrieving the matrices.
If the views still don't line up, check if the plugin prints any warnings or errors to the console. If you can, debug and set a breakpoint after the call to GetTrackedCameraEyePoses() and post the values of the matrices in LeftCameraPose and RightCameraPose.
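If setting a breakpoint is inconvenient, something like this after the GetTrackedCameraEyePoses() call would dump the values to the output log. It assumes the poses end up in FMatrix variables; adjust the indexing if they are raw vr::HmdMatrix34_t values.

// Helper for dumping a pose matrix row by row to the UE output log.
#include "CoreMinimal.h"

static void LogCameraPose(const TCHAR* Name, const FMatrix& Pose)
{
    for (int32 Row = 0; Row < 4; ++Row)
    {
        UE_LOG(LogTemp, Log, TEXT("%s[%d]: %f %f %f %f"), Name, Row,
            Pose.M[Row][0], Pose.M[Row][1], Pose.M[Row][2], Pose.M[Row][3]);
    }
}

// Usage:
// LogCameraPose(TEXT("LeftCameraPose"), LeftCameraPose);
// LogCameraPose(TEXT("RightCameraPose"), RightCameraPose);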
Hey, sorry I didn't react earlier. Got a bit busy over the last few days. But thanks for the fix. I will check it out tonight and report back.
Alright, I just checked it, and the images are now assigned to the correct eyes. However, the image on the right side is still mirrored. Here are the values of the different matrices.
LeftCameraPose = {
{0.999981999, 0.00235483469, -0.00551767414, 0},
{-0.00230989303, 0.999964237, 0.00813729688, 0},
{0.00553663867, -0.00812440459, 0.99995166, 0},
{0.0325223804, 0, -0.0500000007, 1}
}
RightCameraPose = {
{0.999815344, 0.00579668395, -0.0183229316, 0},
{0.00558570167, -0.999923408, -0.011544968, 0},
{-0.0183883477, 0.0114404894, -0.999771118, 0},
{-0.0323868692, -0.00038680865, -0.0499831513, 1}
}
And for good measure ...
CameraLeftToHMDPose = {
{0, 0, 0, 0},
{0, 0, 0, 0},
{0, 0, 0, 0},
{0, 0, 0, 0}
}
CameraLeftToRightPose = {
{0, 0, 0, 0},
{0, 0, 0, 0},
{0, 0, 0, 0},
{0, 0, 0, 0}
}
So the issue with the mirrored image probably stems from the multiplication by the all-zero CameraLeftToRightPose matrix.
The latter two matrices are calculated from the former two after the function call, and the former look like they would be valid sources for them. I don't think that would be the issue.
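For clarity, this is roughly the derivation I mean, sketched with UE matrices. It assumes the per-camera poses map camera space to HMD space and UE's row-vector convention, so the multiplication order in the actual plugin code may differ.

// Sketch: derive the two dependent matrices from the per-camera poses.
#include "CoreMinimal.h"

void DeriveCameraPoses(
    const FMatrix& LeftCameraPose,
    const FMatrix& RightCameraPose,
    FMatrix& OutCameraLeftToHMDPose,
    FMatrix& OutCameraLeftToRightPose)
{
    // The left camera's pose relative to the HMD is just the left pose.
    OutCameraLeftToHMDPose = LeftCameraPose;

    // Left-camera space -> HMD space -> right-camera space.
    OutCameraLeftToRightPose = LeftCameraPose * RightCameraPose.Inverse();
}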
There are two issues I can see in the matrices. The first is that the x-axis translation is inverted from what it should be. This likely means that the left and right eye matrices are swapped, which would at least mean it's consistent with the frame order. The second issue is that the left (formerly right) matrix has negative scale factors on two of the axes.
I updated the plugin code, swapping the matrices and removing the negative scale factors. I don't know if the latter will affect the rotation of the image, though.
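Conceptually the fixup does something like this. It's a hypothetical sketch rather than the exact code I committed, and whether flipping the axes is the right correction for the rotation is still the open question.

// Hypothetical fixup: swap the poses and flip any axis that comes in with a
// negative scale factor.
#include "CoreMinimal.h"

void FixupCameraPoses(FMatrix& LeftCameraPose, FMatrix& RightCameraPose)
{
    // The X translations suggest the driver reports the poses in the opposite
    // order from what the plugin expects, so swap them.
    Swap(LeftCameraPose, RightCameraPose);

    for (FMatrix* Pose : { &LeftCameraPose, &RightCameraPose })
    {
        // Negate any basis axis whose diagonal entry is negative, i.e.
        // "remove the negative scale factors".
        for (int32 Axis = 0; Axis < 3; ++Axis)
        {
            if (Pose->M[Axis][Axis] < 0.0f)
            {
                for (int32 Col = 0; Col < 3; ++Col)
                {
                    Pose->M[Axis][Col] *= -1.0f;
                }
            }
        }
    }
}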
I will check it out later, and if it doesn't work right, I will try to learn a bit about the matrix calculations. Maybe I can actually figure out what's going on there. If I can't figure it out myself, we could also do a pairing session at some point, if you are interested, so you could have direct access to the headset.
Alright, I just checked it out. It works perfectly now. Thanks for the persistence.
I ran your example project with my Vive Pro Eye, and it seems the camera assignments are swapped: I see the image of the right camera on the left eye and vice versa. Also, the image and movement on the right eye are mirrored.