EpicGamesExt / PixelStreamingInfrastructure

The official Pixel Streaming servers and frontend.

Pixel Streaming HMD -- Convergence is not working as expected #7

Open DenisTensorWorks opened 4 months ago

DenisTensorWorks commented 4 months ago

Originally opened by @brycelynch

When enabling the Pixel Streaming plugin in 5.2 and launching a stream with HMD enabled, the stereo convergence rendered in the headset is really distorted and not resolving correctly. The streamer infrastructure is working great (nice job!) and really easy to set up, but I'm wondering if there is something not configured correctly for the stereo view, or if I've missed something. Tested using a Quest Pro.

*On the Quest 2 the image converges, but the FOV is incorrect.


Moved from https://github.com/EpicGames/PixelStreamingInfrastructure/issues/242

DenisTensorWorks commented 4 months ago

Comment by @lukehb

We would be willing to accept/review a PR engine side or frontend side to speed the issue along, but we have been unable to get any internal engineering allocation on this so far due to other things being higher priority. Personally, I would love to get this solved, but we just haven't had the time/approval we need to do so.

For those interested in jumping in to solve this, I believe the issue is likely on the engine side, in the PixelStreamingHMD module, which has many hardcoded values and needs to render an image that correctly matches the VR headset connected on the browser side. It will probably involve sending a custom data channel payload from the browser to UE to configure the IPD and similar values.
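A minimal sketch of what that payload handling could look like engine side, assuming a hypothetical JSON message agreed with the frontend; the field names, FHmdConfigFromBrowser, and ParseHmdConfig below are illustrative only and not part of the existing Pixel Streaming API:

#include "CoreMinimal.h"
#include "Dom/JsonObject.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"

// Hypothetical per-headset values the browser could report once its WebXR session
// starts, e.g. {"ipd":0.063,"halfFovRadians":0.8,"projectionCenterOffset":0.09}
struct FHmdConfigFromBrowser
{
	float Ipd = 0.064f;                    // interpupillary distance in metres
	float HalfFovRadians = 1.6f / 2.f;     // per-eye half FOV
	float ProjectionCenterOffset = 0.09f;  // horizontal projection offset per eye
};

// Parse such a payload after it arrives over the data channel. Missing fields
// keep the defaults above.
static bool ParseHmdConfig(const FString& Json, FHmdConfigFromBrowser& OutConfig)
{
	TSharedPtr<FJsonObject> JsonObject;
	const TSharedRef<TJsonReader<>> Reader = TJsonReaderFactory<>::Create(Json);
	if (!FJsonSerializer::Deserialize(Reader, JsonObject) || !JsonObject.IsValid())
	{
		return false;
	}
	double Value = 0.0;
	if (JsonObject->TryGetNumberField(TEXT("ipd"), Value))
	{
		OutConfig.Ipd = static_cast<float>(Value);
	}
	if (JsonObject->TryGetNumberField(TEXT("halfFovRadians"), Value))
	{
		OutConfig.HalfFovRadians = static_cast<float>(Value);
	}
	if (JsonObject->TryGetNumberField(TEXT("projectionCenterOffset"), Value))
	{
		OutConfig.ProjectionCenterOffset = static_cast<float>(Value);
	}
	return true;
}

The HMD module's projection and eye-offset code could then read these values instead of its current hardcoded constants.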

DenisTensorWorks commented 4 months ago

Comment by owu-1

Better hardcoded values in the engine module for the Quest 2 are:

const float ProjectionCenterOffset = 0.09f;
const float HalfFov = 1.6f / 2.f;
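For intuition (assuming the Quest 2's per-eye horizontal FOV is roughly 90 degrees, a figure not measured in this thread): the original full FOV of 2.19686294 rad works out to about 126 degrees, while 1.6 rad is about 92 degrees, which is far closer to what the headset actually displays.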

[Image] OpenXR vs default Pixel Streaming HMD: OpenXR in grayscale, default Pixel Streaming HMD in colour.

[Image] OpenXR vs the values below: OpenXR in grayscale, Pixel Streaming HMD with the better hardcoded values in colour.

const float ProjectionCenterOffset = 0.09f;
const float HalfFov = 1.5f / 2.f;  // 1.6f is better but I only have an image for 1.5f

You can play with whatever determines ViewportSize.X to try and make the image wider.

diff --git a/Engine/Plugins/Media/PixelStreaming/Source/PixelStreamingHMD/Private/PixelStreamingHMD.cpp b/Engine/Plugins/Media/PixelStreaming/Source/PixelStreamingHMD/Private/PixelStreamingHMD.cpp
index 3735afe..57279ef 100644
--- a/Engine/Plugins/Media/PixelStreaming/Source/PixelStreamingHMD/Private/PixelStreamingHMD.cpp
+++ b/Engine/Plugins/Media/PixelStreaming/Source/PixelStreamingHMD/Private/PixelStreamingHMD.cpp
@@ -110,7 +110,7 @@ void FPixelStreamingHMD::DrawDistortionMesh_RenderThread(struct FHeadMountedDisp
        FRHICommandListImmediate& RHICmdList = Context.RHICmdList;
        const FSceneViewFamily& ViewFamily = *(View.Family);
        FIntPoint ViewportSize = ViewFamily.RenderTarget->GetSizeXY();
-       RHICmdList.SetViewport(0, 0, 0.0f, ViewportSize.X, ViewportSize.Y, 1.0f);
+       RHICmdList.SetViewport(0, 0, 0.0f, ViewportSize.X + 100, ViewportSize.Y, 1.0f);  // Increase amount of picture for eyes. Though modifying this here offsets the right eye so it doesn't do any good

        static const uint32 NumVerts = 4;
        static const uint32 NumTris = 2;
@@ -181,10 +181,10 @@ void FPixelStreamingHMD::CalculateStereoViewOffset(const int32 ViewIndex, FRotat

 FMatrix FPixelStreamingHMD::GetStereoProjectionMatrix(const int32 ViewIndex) const
 {
-       const float ProjectionCenterOffset = 0.151976421f;
+       const float ProjectionCenterOffset = 0.09f;  // Fix eyes converging too far. A straight line is now straight
        const float PassProjectionOffset = (ViewIndex == EStereoscopicEye::eSSE_LEFT_EYE) ? ProjectionCenterOffset : -ProjectionCenterOffset;

-       const float HalfFov = 2.19686294f / 2.f;
+       const float HalfFov = 1.6f / 2.f;  // Fix fov
        TSharedPtr<SWindow> TargetWindow = GEngine->GameViewport->GetWindow();
        FVector2f SizeInScreen = TargetWindow->GetSizeInScreen();
        const float InWidth = SizeInScreen.X / 2.f;

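For context on where those two constants land, here is a sketch of how a SimpleHMD-style GetStereoProjectionMatrix typically assembles the per-eye projection from them; this is an assumption about the general pattern, not the plugin's exact code beyond the lines shown in the diff:

#include "CoreMinimal.h"

// Symmetric frustum built from HalfFov, then shifted horizontally per eye by
// PassProjectionOffset. A smaller HalfFov narrows the rendered field of view;
// a smaller ProjectionCenterOffset shifts each eye's image less, reducing the
// amount of convergence. Pass GNearClippingPlane for NearZ, and
// SizeInScreen.X / 2 and SizeInScreen.Y for the width/height, as in the diff above.
static FMatrix MakeSymmetricStereoProjection(float HalfFov, float PassProjectionOffset,
                                             float HalfScreenWidth, float ScreenHeight,
                                             float NearZ)
{
	const float XS = 1.0f / FMath::Tan(HalfFov);
	const float YS = HalfScreenWidth / FMath::Tan(HalfFov) / ScreenHeight;

	// Reversed-Z, infinite-far-plane layout commonly returned by UE HMD plugins.
	return FMatrix(
	           FPlane(XS, 0.0f, 0.0f, 0.0f),
	           FPlane(0.0f, YS, 0.0f, 0.0f),
	           FPlane(0.0f, 0.0f, 0.0f, 1.0f),
	           FPlane(0.0f, 0.0f, NearZ, 0.0f))
	       * FTranslationMatrix(FVector(PassProjectionOffset, 0.0, 0.0));
}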
The workflow to achieve the updated values being:

DenisTensorWorks commented 4 months ago

The issue is reported to be present on the Apple Vision Pro as well (see the issue linked above).

On Apple Vision Pro, the left and right display images are rendered fine on their own, but when viewed through the AVP with both eyes, the resulting image is distorted.

Environment: Apple Vision Pro / Safari / Ubuntu

PhDittmann commented 3 days ago

The XREyeView message was introduced in UE 5.4.2, but it is not sent by PixelStreamingInfrastructure.
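For anyone picking this up, the point of an XREyeView-style message is that the browser's WebXR session can report the real per-eye view (asymmetric FOV angles and eye transforms), so the engine no longer needs a single hardcoded symmetric HalfFov. A rough sketch of turning the four reported FOV half-angles into a per-eye projection matrix is below, using OpenXR/WebXR conventions (angles in radians, left and down negative); this is an assumption about the general approach, not the actual UE 5.4.2 implementation:

#include "CoreMinimal.h"

// Asymmetric per-eye projection from four FOV half-angles, in the reversed-Z,
// infinite-far-plane layout UE HMD plugins commonly return from
// GetStereoProjectionMatrix.
static FMatrix MakeProjectionFromFovAngles(float AngleLeft, float AngleRight,
                                           float AngleUp, float AngleDown, float NearZ)
{
	const float TanLeft  = FMath::Tan(AngleLeft);
	const float TanRight = FMath::Tan(AngleRight);
	const float TanUp    = FMath::Tan(AngleUp);
	const float TanDown  = FMath::Tan(AngleDown);

	const float InvRL = 1.0f / (TanRight - TanLeft); // inverse horizontal tangent extent
	const float InvTB = 1.0f / (TanUp - TanDown);    // inverse vertical tangent extent

	return FMatrix(
	    FPlane(2.0f * InvRL, 0.0f, 0.0f, 0.0f),
	    FPlane(0.0f, 2.0f * InvTB, 0.0f, 0.0f),
	    FPlane(-(TanRight + TanLeft) * InvRL, -(TanUp + TanDown) * InvTB, 0.0f, 1.0f),
	    FPlane(0.0f, 0.0f, NearZ, 0.0f));
}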

PhDittmann commented 3 days ago

I made a draft, which works at least with the Quest 3, but I have a few questions:

lukehb commented 9 hours ago

I am doing some testing with various community members right now, but I think I have a fix in hand. If you'd like to try it all out:

PSInfra

UE

Headset

It would be really good if you could try this out on the MQ3 and AVP.

PhDittmann commented 4 hours ago

Hi @lukehb, thank you for all the work you've done so far!