oculus-samples / Unity-Movement

Body, Eye and Face Tracking code sample.

How to set up Movement SDK for multiplayer? #71

Open ramkeshcse084 opened 2 months ago

ramkeshcse084 commented 2 months ago

I am currently using Fusion 2 for multiplayer functionality in our project. The OVR rig is enabled for the local player and disabled for remote players. However, I encountered an issue where, upon instantiating a new player, the Retargeting Processor was empty. To address this, I saved the Retargeting Processor in the asset folder, which resolved the issue and made it visible.

However, a new problem arose when another player joined the session: my player's view started displaying in 2D, as if it were not a VR application. After investigating, I believe the solution may involve enabling the movement SDK components locally and disabling them for remote players.

Could you please provide guidance on how to enable and disable movement SDK components on avatars? Additionally, do we need to perform any specific setup for multiplayer functionality?

Your assistance in resolving this matter and ensuring smooth multiplayer functionality would be greatly appreciated.

Thank you

andkim-meta commented 1 month ago

Hello, we can provide some advice for setting up Movement SDK for multiplayer. We suggest networking the final pose after retargeting and constraints, and optimizing the serialized pose before sending it. Additionally, consider implementing interpolation when applying the received pose to the receiving client. You can remove all other components except for the one responsible for deserializing the data on the player's side and simply update the character with the received pose. This would mean disabling the following components:

and the Rig (game object) in the hierarchy.
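
A minimal sketch of what that setup could look like in Unity. Everything here (the class name, field names, and the `Configure` call) is hypothetical; the actual list of components to disable depends on your rig, and you would call `Configure` from your Fusion spawn callback:

```csharp
using UnityEngine;

// Hypothetical helper: toggles tracking-driven components depending on
// whether this avatar belongs to the local player or a remote one.
public class AvatarOwnershipSetup : MonoBehaviour
{
    [Tooltip("Components that drive tracking/retargeting on this avatar " +
             "(retargeting, constraints, etc.), assigned in the Inspector.")]
    [SerializeField] private MonoBehaviour[] trackingComponents;

    [Tooltip("The Rig GameObject in the hierarchy.")]
    [SerializeField] private GameObject rigObject;

    // Call this after the networked avatar is spawned.
    public void Configure(bool isLocalPlayer)
    {
        foreach (var component in trackingComponents)
        {
            component.enabled = isLocalPlayer;
        }
        if (rigObject != null)
        {
            rigObject.SetActive(isLocalPlayer);
        }
    }
}
```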

ramkeshcse084 commented 1 month ago

https://github.com/oculus-samples/Unity-Movement/assets/25951438/7c017241-9d61-42e9-980a-62157427be7a

Hello, thank you for the reply. As I mentioned at the start of the post, the view was rendering in 2D. I disabled and deactivated those components, and the view now looks correct. I will focus only on the objects you mentioned and check their behavior. However, I'm currently facing a challenge with synchronizing the Meta Movement tracking across the network. For reference, you can check the video above, where the animation syncs correctly. I want to sync both locomotion and body tracking over the network. In your previous response, you mentioned deserializing data on the player's side and updating the character, but I'm not entirely clear on how to implement this. How can I manage locomotion over the network as well? The locomotion is controlled by the PlayerController, which contains the Movement SDK Locomotion script and several child objects that manage locomotion.

Please guide us; we haven't found much about this on the internet.

Thank You

andkim-meta commented 1 month ago

Hello, if you want to synchronize animation and locomotion across a network, you can still follow the approach previously described. To do this, you would network the final pose after retargeting, constraints, locomotion, and animation. You would then send this pose so that it can be applied to the receiving character. It's recommended that the character on other clients shouldn't have any logic beyond deserializing the final pose data. For serializing and deserializing the pose data, you would compress the pose information and transmit it over the network; afterwards, on the receiving client, you would decompress the data and apply the pose (including position and rotation for each skeletal bone).
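
On the receiving client, applying the decompressed pose with interpolation might look like this sketch. All names here are hypothetical, and `OnPoseReceived` would be wired to whatever callback your networking layer invokes when a snapshot arrives:

```csharp
using UnityEngine;

// Hypothetical client-side applier: blends each bone toward the latest
// pose snapshot received over the network, smoothing out jitter.
public class RemotePoseApplier : MonoBehaviour
{
    [SerializeField] private Transform[] bones; // same order as on the sender

    [Range(0f, 1f)]
    [SerializeField] private float blendFactor = 0.5f;

    private Vector3[] targetPositions;
    private Quaternion[] targetRotations;

    // Called by your networking layer with the decompressed snapshot.
    public void OnPoseReceived(Vector3[] positions, Quaternion[] rotations)
    {
        targetPositions = positions;
        targetRotations = rotations;
    }

    private void LateUpdate()
    {
        if (targetPositions == null) return;
        for (int i = 0; i < bones.Length; i++)
        {
            bones[i].localPosition = Vector3.Lerp(
                bones[i].localPosition, targetPositions[i], blendFactor);
            bones[i].localRotation = Quaternion.Slerp(
                bones[i].localRotation, targetRotations[i], blendFactor);
        }
    }
}
```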

ramkeshcse084 commented 1 month ago

Hello,

Thank you for your guidance. I followed that approach: I get all the skeleton bones' positions and rotations, serialize them, send them over the network, then receive and deserialize them before applying. However, when a second player joins, the screen goes black and the FPS drops significantly on Meta Quest 3. I think this might be due to a lot of data being transferred and received in a single tick.

What do you think about it? Am I missing something?

Thank You again !

ramkeshcse084 commented 1 month ago

https://github.com/oculus-samples/Unity-Movement/assets/25951438/c5c67d9c-2696-4221-b37e-2094e59fc661

I am sharing a video and screenshot for reference. Could you please provide information about the PlayerController that handles locomotion? I have added a NetworkTransform, and for both local and remote players it shows animations and movement as depicted in the video. However, an issue arises when joining as a client: the local and remote players' movements and rotations overlap each other. When I join as a host, only the rotation is applied to the other player in the scene with me.

Additionally, I need clarification on whether we have to sync the bones over the network. Initially, I tried doing this through code, but it had no effect and just caused a drop in FPS. I have now attached a NetworkTransform to the skeleton's head, left hand, and right hand. We are using Fusion 2. Kindly reply as soon as possible. Thank you

ramkeshcse084 commented 1 month ago

Please reply.

ramkeshcse084 commented 1 month ago

When you say the bones' final pose, do you mean syncing the armature's position and rotation over the network? I wrote a script to do this, but there was no effect; even changing the position and rotation manually in the Unity editor didn't work. When I removed the Animator, manual changes worked, but the animation stopped working on the remote player.

ramkeshcse084 commented 1 month ago

Please reply ASAP; we are on a deadline.

ramkeshcse084 commented 1 month ago

Will this script provide the bones' final pose?

andkim-meta commented 1 month ago

Hello, as mentioned previously, only the host character should have logic to drive the character. The client character should not contain any logic other than applying the pose it receives over the network, which means that components that drive locomotion (such as the PlayerController) shouldn't be present on the client character. The pose sent from the host should be the pose after all updates to the skeleton are done. To get the final pose, take the position and rotation of all of the bone transforms in the skeleton (i.e. Hips, Spine, Chest, UpperChest, etc.) after all modifications to the skeleton are done (end of LateUpdate).
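
Capturing the final pose at the end of LateUpdate could be sketched as follows. The names are hypothetical, and the high `DefaultExecutionOrder` value is one way to run this script after the other scripts that modify the skeleton; verify the ordering against your rig's actual execution order:

```csharp
using UnityEngine;

// Hypothetical host-side capture: reads every bone's pose in LateUpdate,
// after retargeting, constraints, and animation have run. The high
// execution order pushes this script behind the others.
[DefaultExecutionOrder(10000)]
public class FinalPoseCapture : MonoBehaviour
{
    [SerializeField] private Transform[] bones; // Hips, Spine, Chest, etc.

    public Vector3[] Positions { get; private set; }
    public Quaternion[] Rotations { get; private set; }

    private void Awake()
    {
        Positions = new Vector3[bones.Length];
        Rotations = new Quaternion[bones.Length];
    }

    private void LateUpdate()
    {
        for (int i = 0; i < bones.Length; i++)
        {
            Positions[i] = bones[i].localPosition;
            Rotations[i] = bones[i].localRotation;
        }
        // Hand Positions/Rotations to your networking layer here,
        // compressing them before sending.
    }
}
```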

The pose data should be compressed and sent at a reasonable update rate, using networking optimization techniques. An article that explains some techniques can be found here: https://gafferongames.com/post/snapshot_compression/.
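
As an illustration, the "smallest three" technique from that article packs a rotation into 32 bits instead of four floats: drop the quaternion's largest component (it can be reconstructed from the other three), quantize the rest. This sketch uses Unity's `Mathf` and hypothetical naming:

```csharp
using UnityEngine;

public static class QuaternionCompression
{
    // Packs a unit quaternion into 32 bits: a 2-bit index of the largest
    // component, plus the other three components quantized to 10 bits each.
    public static uint Compress(float x, float y, float z, float w)
    {
        float[] q = { x, y, z, w };

        // Find the component with the largest magnitude.
        int largest = 0;
        for (int i = 1; i < 4; i++)
            if (Mathf.Abs(q[i]) > Mathf.Abs(q[largest])) largest = i;

        // q and -q represent the same rotation, so flip signs to make
        // the dropped component positive and recoverable.
        if (q[largest] < 0f)
            for (int i = 0; i < 4; i++) q[i] = -q[i];

        uint packed = (uint)largest << 30;
        int shift = 20;
        const float range = 0.7071068f; // remaining values lie in [-1/sqrt(2), 1/sqrt(2)]
        for (int i = 0; i < 4; i++)
        {
            if (i == largest) continue;
            int v = Mathf.RoundToInt((q[i] / range * 0.5f + 0.5f) * 1023f);
            packed |= (uint)Mathf.Clamp(v, 0, 1023) << shift;
            shift -= 10;
        }
        return packed;
    }
}
```

Positions can be quantized similarly by bounding them to the play area and storing fixed-point values; the article covers both, plus delta encoding between snapshots.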

To summarize, the host performs all logic to make the character move, while the client has only the skinned mesh that is driven by the result of the host's work.

ramkeshcse084 commented 2 weeks ago

https://github.com/oculus-samples/Unity-Movement/assets/25951438/0b0d26e8-ee11-4a10-a316-7d9f20391a75

Hello,

Thank you for your guidance. With your help, I have successfully synchronized the bones using Fusion 2. I am currently using the standard method for data transfer in Fusion 2 and sending the final pose from LateUpdate. I am now planning to explore further optimizations using the snapshot compression technique. According to the Fusion 2 documentation, some level of optimization is already handled.

As demonstrated in the attached video, there are some issues during locomotion. Do you have any advice on how to address this?

Additionally, I received a reply regarding other issues that have been resolved in the latest version of the Meta Movement SDK. How can I update to this new version?

Thank you again for your support.

andkim-meta commented 2 weeks ago

Hello, if the issues with networking are resolved, we suggest closing this issue and opening a new one for the locomotion problems. In regards to the bug where the animation does not match the motion of movement, we addressed this in #dev as mentioned in https://github.com/oculus-samples/Unity-Movement/issues/75; however, the fix applies to the embodied character in the sample. To get the latest fixes, please update the package in the Unity Package Manager. If the current release on main doesn't contain the commits with the fixes in the release notes, please grab the package from the #dev branch described in the README using the following git URL: https://github.com/oculus-samples/Unity-Movement.git#dev.

sohailshafiiWk commented 2 weeks ago

> In regards to the bug where the animation does not match the motion of movement, we addressed this in #dev as mentioned in https://github.com/oculus-samples/Unity-Movement/issues/75, however the fix applies to the embodied character in the sample.

Can you see if it works for you or not?

ramkeshcse084 commented 2 weeks ago

When I updated from the URL https://github.com/oculus-samples/Unity-Movement.git#dev, it gave me an older version, as shown in the screenshots. I am now going to install the package the normal way, as I had before.

ramkeshcse084 commented 2 weeks ago

I updated normally to the latest version, but the Locomotion script shows some compile errors. I checked in a fresh project with the Locomotion sample scene; locomotion is not working. We can continue this discussion in the Locomotion thread.
