EpicGames / PixelStreamingInfrastructure

Moved to: https://github.com/EpicGamesExt/PixelStreamingInfrastructure
MIT License

Add experimental support for WebXR based experiences. #73

Closed Belchy06 closed 1 year ago

Belchy06 commented 1 year ago

As of this commit, Pixel Streaming supports VR streamed experiences.

However, as of 7th of February 2023, the latest reference Epic Games implementation doesn't support these experiences.

Adding support for these experiences will require:

Known limitations:

Belchy06 commented 1 year ago

Re-opening as we need to utilise multiple streamers to improve the experience

SaibotC commented 1 year ago

@Belchy06 Super excited to see this! I can report that it's working well for me on the Quest 2, but on the Quest Pro the HMD's IPD seems to be wrong, or it somehow looks very cross-eyed. Any suggestions for settings I can try to fix it? Sorry if I should have started a separate thread for this.

Edit: also, on the Quest 2, the FOV in the WebXR viewer seems much wider (a bit unnatural feeling) compared to the FOV of a natively compiled APK running on the headset.

Belchy06 commented 1 year ago

@SaibotC The IPD is currently a hard-coded value in the PixelStreamingHMD module of the PixelStreaming plugin.

From my understanding, the WebXR API doesn't provide information regarding the device's IPD, so for systems with a variable IPD (like the HTC Vive) it may be difficult to properly propagate this information back to the application.

For systems with a fixed IPD, I'm sure we could modify the C++ code in the plugin to check for the active system and modify the IPD accordingly.

Edit: Regarding your edit, the FOV is also hard-coded. Something that should definitely be sent back to the application.
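
Although the WebXR spec doesn't expose the IPD as a named field, an approximate value can be derived on the frontend from the per-eye view transforms reported each frame. Below is a minimal sketch using only standard WebXR API calls; it is not part of the actual frontend library, just an illustration of where the number could come from.

```typescript
// Minimal sketch: estimate the headset IPD from the distance between the two
// eye poses WebXR reports each frame. Assumes an active immersive-vr
// XRSession and an XRReferenceSpace already exist in the frontend.
function estimateIpdMetres(frame: XRFrame, refSpace: XRReferenceSpace): number | null {
    const pose = frame.getViewerPose(refSpace);
    if (!pose) {
        return null;
    }
    const left = pose.views.find((v) => v.eye === 'left');
    const right = pose.views.find((v) => v.eye === 'right');
    if (!left || !right) {
        return null; // Not rendering a stereo view configuration this frame.
    }
    const dx = left.transform.position.x - right.transform.position.x;
    const dy = left.transform.position.y - right.transform.position.y;
    const dz = left.transform.position.z - right.transform.position.z;
    return Math.sqrt(dx * dx + dy * dy + dz * dz); // Eye separation in metres.
}
```

A value like this could then be forwarded to UE over a streamer message (the frontend-to-streamer plumbing is not shown here) so the plugin could replace its hard-coded EyeOffset with the device's actual eye separation; note that WebXR reports metres while the plugin works in Unreal units, so a unit conversion (and possibly a half-IPD convention) would also be needed.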

Tokix commented 1 year ago

@Belchy06 Testing this version, I'm currently getting the following message in my browser log when the browser is in WebXR mode:

"Msg: Attempted to send a message to the streamer with message type: XRHMDTransform, but the frontend hasn't been configured to send such a message. Check you've added the message type in your cpp"

I suppose I need to implement a message listener in my Unreal project. Do you have any hints or an example of how to do this?

Mouse and keyboard inputs work fine.

Thanks a lot in advance.

Belchy06 commented 1 year ago

@Tokix What version of the unreal engine are you using?

You'll need to be using a source build of the 5.2 branch from here.

You'll also need to be launching your application with '-PixelStreamingEnableHMD' as well as your other relevant launch args.

Tokix commented 1 year ago

> @Tokix What version of the unreal engine are you using?
>
> You'll need to be using a source build of the 5.2 branch from here.
>
> You'll also need to be launching your application with '-PixelStreamingEnableHMD' as well as your other relevant launch args.

Thanks for the fast reply. OK, that's it: we used 5.1. We'll try building from source and adding your parameter. Thanks a lot for your help :)

Tokix commented 1 year ago

We tried to build Unreal 5.2 from source on Windows, but received the following error:

Error MSB3073: The command "..\..\Build\BatchFiles\Build.bat Unsync Win64 Development -WaitMutex -FromMsBuild" exited with code 6. Unsync C:\Program Files\Microsoft Visual Studio\2022\Enterprise\MSBuild\Microsoft\VC\v170\Microsoft.MakeFile.Targets 44

Furthermore, there was a DirectX error, which could be fixed by providing the correct DLL path.

It is the same error as described here: https://forums.unrealengine.com/t/errors-building-ue5-2-and-5-2-preview-1/775605/3. We tracked the issue down a bit further and added our findings to the forum thread for further analysis.

We are setting up a Linux build to see if we can build successfully there.

-> The issue is now fixed, and the engine builds fine with VS2022 when applying the fixes mentioned in https://forums.unrealengine.com/t/errors-building-ue5-2-and-5-2-preview-1/775605/3

A merge request is in progress.

lukehb commented 1 year ago

Okay, let us know how you go with the VR feature now that you have a working build.

ffreality commented 1 year ago

But what about the required hardware? If we're talking about streaming from a desktop PC, only one user will be able to control each app instance, so it would essentially be a less network-hungry version of Quest Link.

But what about NVIDIA Tesla hardware? If we rent Azure or AWS GPU instances, can we run Pixel Streaming HMD applications without CloudXR licenses?

If the answer is yes, this will be a game changer.

Belchy06 commented 1 year ago

The idea with this feature is that you'll be able to use standard GPU-enabled instances (e.g. g4dn.xlarge or similar) to stream out your XR experiences without the need for CloudXR.

Belchy06 commented 1 year ago

For those who have tested this so far: you may have noticed that if you tilted your head side-to-side (roll), the image had some odd distortion.

This has since been fixed.

lukehb commented 1 year ago

See https://github.com/EpicGames/UnrealEngine/commit/5112ae5414c15097185812ed49e68c25418aa8d4

Belchy06 commented 1 year ago

Closing this issue as multi-streamer support has landed in 5.2.

See https://youtu.be/OHZ7rZc90ig for how you can set up multiple cameras.

If you're looking to stream the spectator camera from the VR template, you want to follow everything Alex is doing, but use the SpectatorCamera instead of adding a new SceneCapture2D.

Tokix commented 1 year ago

Hi @Belchy06 and @lukehb, now that the 5.2 prerelease is available I tested the feature again with that release against the Pixel Streaming tag UE5.2-0.6.0 on an Oculus Quest 2. Streaming and activating XR mode work, and moving with the controllers does as well.

However, the movement of the headset and the controller positions are not transmitted with that combination. Do you have any hints for getting this working again?

My "Additional Launch Parameters" argument for local testing currently looks like this:

-vr -AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888 -PixelStreamingEnableHMD -AllowPixelStreamingCommands=true

I'm also building the current ue5-main branch again and will test there with a debugger. Engine/UE5/Plugins/Media/PixelStreaming/PixelStreamingHMD is the starting point, I suppose.

In any case, thank you both for the feature.

Belchy06 commented 1 year ago

If you're going to be debugging, I'd suggest chucking a breakpoint in this line https://github.com/EpicGames/UnrealEngine/blob/ue5-main/Engine/Plugins/Media/PixelStreaming/Source/PixelStreamingInput/Private/PixelStreamingInputHandler.cpp#L874

If your breakpoint hits, then you can be confident that the browser is transmitting the HMD position and orientation to UE. If the breakpoint doesn't hit, let me know as it may end up being an issue on the frontend.
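
On the browser side, a quick sanity check is to log the viewer pose every XR frame to confirm pose data is being produced at all. This is a minimal sketch using only standard WebXR calls; how you obtain the active XRSession and reference space depends on your frontend setup, so treat `session` and `refSpace` as placeholders.

```typescript
// Sketch: log the HMD position and orientation each XR frame so you can
// confirm the browser is generating pose data before it ever reaches UE.
function logHmdPose(session: XRSession, refSpace: XRReferenceSpace): void {
    const onFrame = (_time: DOMHighResTimeStamp, frame: XRFrame) => {
        const pose = frame.getViewerPose(refSpace);
        if (pose) {
            const p = pose.transform.position;
            const q = pose.transform.orientation;
            console.log(
                `HMD pos (${p.x.toFixed(3)}, ${p.y.toFixed(3)}, ${p.z.toFixed(3)}) ` +
                `rot (${q.x.toFixed(3)}, ${q.y.toFixed(3)}, ${q.z.toFixed(3)}, ${q.w.toFixed(3)})`
            );
        }
        session.requestAnimationFrame(onFrame);
    };
    session.requestAnimationFrame(onFrame);
}
```

If poses show up in the browser console but the breakpoint in PixelStreamingInputHandler.cpp never hits, the problem is more likely in how the frontend packages or sends the XRHMDTransform message than in WebXR itself.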

lombre33 commented 1 year ago

Hi guys!

Trying this new feature on the Meta Quest 2, I still have the issue described earlier by @Belchy06: in the WebXR experience the two images don't blend perfectly, giving the impression of squinting. Any idea what could be wrong?

lukehb commented 1 year ago

At a guess, the hard-coded IPD.


brycelynch commented 1 year ago

@lukehb @lombre33 @SaibotC :

Admittedly, I'm out of my depth on the math behind these IPD/FOV calculations in PixelStreamingHMD.cpp, but I can confirm that the way they are hard-coded allows the Quest 2 to resolve a stereoscopic image. However, the FOV is still rather extreme and not what you'd expect from a modern packaged VR experience or even from WebXR rendering. Overall the immersion is really swimmy and arguably worse than the good old DK2 days.

I don't think the problem is the IPD calculation (EyeOffset = 3.20000005f) based on this old post by Nick Whiting (https://forums.unrealengine.com/t/emulatestereo-command-not-working/307039/19).

For the Quest Pro, which doesn't work with these hard-coded numbers at all, I was able to get something close to the Quest 2's convergence (still not great) by doubling the value of ProjectionCenterOffset to 0.303952842. However, the FOV is still way too extreme, though reducing the HalfFov value by 2.5f makes it a little better.

Again, I have NO IDEA what I'm doing with the math behind these calculations, but it would be great if we could get someone who understands them to give us something that represents the Quest 2 and Quest Pro convergence and FOV a little better.
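
For reference, the quantities being hand-tuned here can, at least in principle, be read off the per-eye projectionMatrix that WebXR already reports for each XRView. The sketch below assumes the standard column-major, off-centre perspective layout WebXR uses; the returned field names are illustrative only, and whether centerOffsetX maps one-to-one onto the plugin's ProjectionCenterOffset is an assumption, not something confirmed here.

```typescript
// Sketch: extract per-eye frustum parameters from a WebXR XRView's
// projectionMatrix (a column-major Float32Array of 16 values).
// For an off-centre perspective projection:
//   m[0] = 2n / (r - l),  m[8] = (r + l) / (r - l)
//   m[5] = 2n / (t - b),  m[9] = (t + b) / (t - b)
// where l, r, b, t are the near-plane extents and n is the near distance.
interface EyeFrustum {
    tanLeft: number;       // tangent of the half-angle to the left edge
    tanRight: number;      // tangent of the half-angle to the right edge
    tanDown: number;
    tanUp: number;
    centerOffsetX: number; // normalised horizontal asymmetry of the frustum
}

function frustumFromProjection(view: XRView): EyeFrustum {
    const m = view.projectionMatrix;
    // Solve the relations above for the half-angle tangents (r/n, -l/n, t/n, -b/n).
    const tanRight = (m[8] + 1) / m[0];
    const tanLeft  = (1 - m[8]) / m[0];
    const tanUp    = (m[9] + 1) / m[5];
    const tanDown  = (1 - m[9]) / m[5];
    return { tanLeft, tanRight, tanDown, tanUp, centerOffsetX: m[8] };
}
```

Sending per-eye values like these (together with the eye transforms) from the frontend would let the plugin build its projection from the headset's real optics instead of hard-coded HalfFov and ProjectionCenterOffset values, which should help with both the Quest 2 FOV and the Quest Pro convergence.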


Starkium commented 1 year ago

Would there be a way to use this kind of technology to run the XR pixel stream from an Xbox to a Quest 2, for example? What would be needed to get over that hurdle? This could be pretty big for the XR space for consumers who don't have a strong PC and don't want to buy into the PlayStation ecosystem.

lukehb commented 1 year ago

@Starkium Please open new issue as a question instead of hijacking this one. Thank you.