microsoft / MixedReality-HolographicRemoting-Samples

Code samples for using Microsoft's Holographic Remoting library.

Coordinate System Synchronization with Holographic Remoting App in Unity #65

Closed SWAGamaz closed 8 months ago

SWAGamaz commented 2 years ago

Good day. I have a remote Unity app (OpenXR) and a player that recognizes a marker in space.

When I pass the marker's position relative to the HoloLens 2 main camera, it is projected into Unity with some offset, as if an offset relative to the eyes were being added to the main camera.

It turns out that the position of the main camera in the player is not equal to the position of the camera in Unity.

I tried using the built-in QR recognition function and it worked fine, with an exact overlay in the remote app, but I can't see its implementation.

Here is described how to synchronize the coordinate system between the player and the remote application, but it is not clear how to do this in Unity. Is it even possible?

yl-msft commented 2 years ago

@SWAGamaz If you want to use a QR marker to sync location, you might want to use the object-to-marker transform and ignore the marker-to-main-camera transform.
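Roughly, the idea looks like this in Unity (a minimal sketch, not code from this repo; the helper names are placeholders):

```csharp
using UnityEngine;

// Hypothetical helpers illustrating the object-to-marker idea above.
public static class MarkerRelativePose
{
    // Express an object's pose in the marker's local frame (sender side).
    public static Pose GetObjectPoseRelativeToMarker(Transform marker, Transform obj) =>
        new Pose(
            marker.InverseTransformPoint(obj.position),
            Quaternion.Inverse(marker.rotation) * obj.rotation);

    // Re-apply that relative pose against the locally located marker (receiver side),
    // so the marker-to-camera transform never enters the calculation.
    public static void ApplyRelativePose(Transform marker, Transform obj, Pose relative) =>
        obj.SetPositionAndRotation(
            marker.TransformPoint(relative.position),
            marker.rotation * relative.rotation);
}
```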

The QR sample you pointed to above locates the QR marker in the remote Unity app. If you have the QR marker code in your custom player app as well, you can use that QR marker to correlate the remote app with your player app. You do need to be mindful of the coordinate system convention, since the remote app follows Unity's convention (left-handed, Y-up), while your custom player app is likely a WinRT app (right-handed).
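For illustration only, here is a rough sketch of that handedness flip, assuming the player-side pose arrives as right-handed System.Numerics values and must be re-expressed in Unity's left-handed, Y-up convention (the `CoordinateConvention` class name is just a placeholder):

```csharp
using UnityEngine;

// Sketch of the right-handed (WinRT / System.Numerics) to left-handed (Unity) conversion.
public static class CoordinateConvention
{
    public static Vector3 ToUnityPosition(System.Numerics.Vector3 p) =>
        new Vector3(p.X, p.Y, -p.Z);            // flip Z

    public static Quaternion ToUnityRotation(System.Numerics.Quaternion q) =>
        new Quaternion(-q.X, -q.Y, q.Z, q.W);   // matching quaternion mirror
}
```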

However, if you just want to synchronize the coordinate system between the remote and player apps, the Coordinate System Synchronization you linked is the right way to do it, and you don't even need a QR code for that sync.

This coordinate system sync is a new API in the Mixed Reality OpenXR plugin. The work is in progress and planned for our next API release, 1.5, in the coming weeks.

SWAGamaz commented 2 years ago

@yl-msft Thank you very much for the advice. I will try the proposed options and look forward to the API update.

And please tell me: if I create a SpatialGraphNode on the player side and bind it to a coordinate system with `SpatialCoordinateSystem qrCoordinateSystem = SpatialGraphInteropPreview::CreateCoordinateSystemForNode(code.SpatialGraphNodeId());` as in the example here, and then pass the Guid of that node, can I retrieve it on the Unity side with `var node = SpatialGraphNode.FromStaticNodeId(Id)` and then get the pose using `node.TryLocate(FrameTime.OnUpdate, out Pose pose)`?
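Roughly, I mean something like this on the Unity side (a minimal sketch; `MarkerLocator` and `OnMarkerIdReceived` are just placeholder names, and the Guid is assumed to arrive from the player over my own networking channel):

```csharp
using Microsoft.MixedReality.OpenXR;
using UnityEngine;

// Hypothetical component: locate a player-created static node by its Guid.
public class MarkerLocator : MonoBehaviour
{
    private SpatialGraphNode node;

    // Called with the Guid received from the player app.
    public void OnMarkerIdReceived(System.Guid markerNodeId)
    {
        node = SpatialGraphNode.FromStaticNodeId(markerNodeId);
    }

    private void Update()
    {
        if (node != null && node.TryLocate(FrameTime.OnUpdate, out Pose pose))
        {
            // Pose is expressed relative to the remote app's Unity scene origin.
            transform.SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}
```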

SquareRoundCurly commented 2 years ago

Hi! I'm having a similar issue. I have a Unity remoting app and a custom remoting player on the HL2. I retrieve sensor data on the player using MediaCapture as well as Research Mode, and I would like to do image processing (computer vision with OpenCV) on the remote side.

When in a non-remoting context, I can access both the Unity spatial coordinate system and the sensor coordinate system. However, in remoting, I need to transfer one to the other.

While I managed to pass coordinate systems around using the spatial anchor transfer manager, I cannot use that in an in-editor remoting setting (where WinRT types and methods are not available).

The same goes for the coordinate system synchronization: all the methods provided are only available with WinRT.

How would one approach coordinate system synchronization/transfer with Unity editor remoting in mind? I'm wondering how the QR service can reason about player-detected markers in remote Unity space.

ccrop commented 1 year ago

Hi! I have the same problem.

1. A custom player on the HL2: I use the [MixedReality-HolographicRemoting-Samples] player with `#define ENABLE_USER_COORDINATE_SYSTEM_SAMPLE` uncommented in `SamplePlayerMain.h`.
2. Unity 2021.3.11 with the Mixed Reality OpenXR Plugin 1.8.1:

```csharp
UnityEngine.Pose pose = UnityEngine.Pose.identity;
Microsoft.MixedReality.OpenXR.Remoting.ConnectionState connectionState;
Microsoft.MixedReality.OpenXR.Remoting.DisconnectReason disconnectReason;
Microsoft.MixedReality.OpenXR.Remoting.AppRemoting.TryGetConnectionState(out connectionState, out disconnectReason);
if (connectionState == Microsoft.MixedReality.OpenXR.Remoting.ConnectionState.Connected)
{
    Debug.Log("###connected");
    bool b = Microsoft.MixedReality.OpenXR.Remoting.AppRemoting.TryLocateUserReferenceSpace(Microsoft.MixedReality.OpenXR.FrameTime.OnUpdate, out pose);

    if (b) { text.text = "OK use space:" + pose.position.ToString(); }
    else
    {
        text.text = "no use space:" + pose.position.ToString(); // no use space
    }
    Debug.Log("######" + pose.position.ToString());
}
```

In Unity I get "no use space: 0,0,0". How do I make it work?

ccrop commented 1 year ago

@yl-msft Hi!

chairobl commented 8 months ago

Closing the issue as there hasn't been activity on it in a while and the original issue seems to have been resolved. @SquareRoundCurly, @ccrop if you are still encountering issues, feel free to open new tickets. For questions specific to the Unity plugin, consider opening the ticket in the OpenXR Unity Plugin Samples repository :)