Hi @serhan-gul,
Sorry for the late response.
You can get the latest eye pose every frame on the remote side, without any custom data channels, by using the API directly, similar to what this code snippet in SampleRemote does, with timestamp = now(): https://github.com/microsoft/MixedReality-HolographicRemoting-Samples/blob/cfc895f1ebd8ec46c5e82f91c4bbe2bd7c181184/remote/common/holographic/SpatialInputRenderer.cpp#L48-L61
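In C# (since you are on a Unity-based remote), the same pattern looks roughly like the sketch below; this is not from the sample, and `coordinateSystem` stands in for a valid SpatialCoordinateSystem, which is an assumption here:

```csharp
using System;
using Windows.Perception;
using Windows.Perception.People;
using Windows.Perception.Spatial;
using Windows.UI.Input.Spatial;

public static class EyePoseQuery
{
    // Returns the eyes pose for "now", or null if no eye tracking data is available.
    public static EyesPose TryGetLatestEyesPose(SpatialCoordinateSystem coordinateSystem)
    {
        // timestamp = now(): query the perception pipeline at the current time.
        PerceptionTimestamp timestamp =
            PerceptionTimestampHelper.FromHistoricalTargetTime(DateTimeOffset.Now);

        SpatialPointerPose pointerPose =
            SpatialPointerPose.TryGetAtTimestamp(coordinateSystem, timestamp);

        // Eyes is null when eye tracking is unavailable or permission wasn't granted.
        return pointerPose?.Eyes;
    }
}
```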
Hi @Hanaae-MSFT. Thanks for your response. I'm aware of SpatialPointerPose, but that only works for UWP, right? With a Unity-based remote (using the C# API) it doesn't work for me, either as a standalone Windows app or in Play mode remoting.
Hi, you would need to import the .NET WinRT projections into your Unity-based remote project. You can also check the following links for more information: https://docs.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/features/tools/holographic-remoting?view=mrtkunity-2021-05#import-dotnetwinrt-into-the-project https://docs.microsoft.com/en-us/windows/mixed-reality/develop/unity/using-the-windows-namespace-with-unity-apps-for-hololens
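Once the package is imported, the WinRT types become available under the projected Microsoft.Windows.* namespaces even in a standalone (non-UWP) player. A minimal sketch of the usual aliasing pattern, assuming the DOTNETWINRT_PRESENT scripting define described in the MRTK docs linked above:

```csharp
// Resolve SpatialPointerPose from the right namespace depending on how the app is built:
// the DotNetWinRT projection for standalone players, the Windows SDK for UWP builds.
#if DOTNETWINRT_PRESENT
using SpatialPointerPose = Microsoft.Windows.UI.Input.Spatial.SpatialPointerPose;
#elif WINDOWS_UWP
using SpatialPointerPose = Windows.UI.Input.Spatial.SpatialPointerPose;
#endif
```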
Thanks for the information. Is there a way to get a SpatialCoordinateSystem on the remote side in this case? I'd need that to get a SpatialPointerPose by calling SpatialPointerPose::TryGetAtTimestamp.
Tried the answer here, which uses the OpenXR plugin. It works when the app is deployed on the HoloLens but returns null when remoting:
currentSpatialCoordinateSystem = PerceptionInterop.GetSceneCoordinateSystem(UnityEngine.Pose.identity) as SpatialCoordinateSystem;
Also, MRTK's WindowsMixedRealityUtilities.SpatialCoordinateSystem doesn't work.
EDIT: Found some info in the docs here, but that is for native apps; I'm trying to do this in Unity/C#.
Hi, you need to decide whether you want to use the Windows Mixed Reality API or the OpenXR API with remoting for your app. And yes, there might be some interoperability mechanisms between OpenXR and Windows Mixed Reality objects, especially the ones used for spatial locatability when running natively on a HoloLens, but we don't support this via remoting: Holographic Remoting doesn't support mixing different runtimes.
As for the Windows Mixed Reality API, what you could try from a Unity C# script is to get the SpatialCoordinateSystem that corresponds to the actual coordinate system Unity has created, something like:
UnityEngine.XR.WindowsMR.WindowsMREnvironment.OriginSpatialCoordinateSystem
You can also check out the documentation of the Unity Windows XR plugin for more information.
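Putting the two suggestions together, a rough sketch (assuming both the Unity Windows XR plugin and the DotNetWinRT projections are imported; FromNativePtr is the helper the DotNetWinRT package provides for wrapping native pointers, and the define is the same DOTNETWINRT_PRESENT as above):

```csharp
#if DOTNETWINRT_PRESENT
using Microsoft.Windows.Perception;
using Microsoft.Windows.Perception.Spatial;
using Microsoft.Windows.UI.Input.Spatial;
#endif
using UnityEngine;

public class RemoteEyeGazeSampler : MonoBehaviour
{
#if DOTNETWINRT_PRESENT
    private SpatialCoordinateSystem coordinateSystem;

    private void Start()
    {
        // OriginSpatialCoordinateSystem is exposed as an IntPtr, so wrap the
        // native pointer in a projected WinRT object.
        coordinateSystem = SpatialCoordinateSystem.FromNativePtr(
            UnityEngine.XR.WindowsMR.WindowsMREnvironment.OriginSpatialCoordinateSystem);
    }

    private void Update()
    {
        // timestamp = now(): ask for the freshest eye pose each frame.
        var timestamp =
            PerceptionTimestampHelper.FromHistoricalTargetTime(System.DateTimeOffset.Now);
        var gaze = SpatialPointerPose.TryGetAtTimestamp(coordinateSystem, timestamp)?.Eyes?.Gaze;
        if (gaze.HasValue)
        {
            Debug.Log($"Gaze origin: {gaze.Value.Origin}, direction: {gaze.Value.Direction}");
        }
    }
#endif
}
```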
Hi Remoting team. I was wondering whether it is possible/makes sense to send the eye tracking data from the HL2 using a data channel in the custom player. The problem when I try to do this on the remote side (Unity) is that MRTK interfaces like EyeGazeProvider seem to be tied to Unity's rendering frame rate, which causes many gaze samples to be missed depending on the fps variation. So I thought that using the UWP API Input.Spatial.SpatialPointerPose.Eyes on the client side could enable obtaining the data at a fixed rate of 30 Hz regardless of the rendering load of the remote.
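To make the idea concrete, here is a hypothetical sketch of the player-side polling I have in mind (`coordinateSystem` and the forwarding step are placeholders, not from the samples):

```csharp
// Hypothetical sketch: sample SpatialPointerPose.Eyes at ~30 Hz on a timer,
// decoupled from the render loop. Assumes a UWP player with the GazeInput
// capability granted and a valid `coordinateSystem`.
using System;
using System.Threading;
using Windows.Perception;
using Windows.Perception.Spatial;
using Windows.UI.Input.Spatial;

public sealed class FixedRateEyeSampler : IDisposable
{
    private readonly SpatialCoordinateSystem coordinateSystem;
    private readonly Timer timer;

    public FixedRateEyeSampler(SpatialCoordinateSystem coordinateSystem)
    {
        this.coordinateSystem = coordinateSystem;
        // Fire roughly every 33 ms (~30 Hz), independent of the rendering frame rate.
        timer = new Timer(Sample, null, TimeSpan.Zero, TimeSpan.FromMilliseconds(33));
    }

    private void Sample(object state)
    {
        var timestamp = PerceptionTimestampHelper.FromHistoricalTargetTime(DateTimeOffset.Now);
        var eyes = SpatialPointerPose.TryGetAtTimestamp(coordinateSystem, timestamp)?.Eyes;
        if (eyes?.Gaze != null)
        {
            // Placeholder: forward the gaze sample, e.g. over a custom data channel.
        }
    }

    public void Dispose() => timer.Dispose();
}
```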