Hi @tejaswiniiitm
I have a script in the scene that, when the user presses the right controller's B button, takes their current height and scales the avatar's height accordingly. So you could ask users to press that button while standing. The motion should work for different users' heights; however, if the height deviates too much, there may be self-intersections, or, for instance, an adult locomotion sequence may not make sense when applied to children. For those cases, I would prepare different MotionMatchingData with different recorded animations and select them based on the user.
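For reference, a minimal sketch of that kind of calibration (this is not the actual script from the repo; the component name, fields, and the eye-to-head offset are assumptions):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch only: scale the avatar so its height matches the user's
// standing height when the right controller's B (secondary) button is pressed.
public class AvatarHeightCalibration : MonoBehaviour
{
    [Tooltip("Root transform of the avatar to scale.")]
    public Transform avatarRoot;
    [Tooltip("Height of the avatar model at scale 1, in meters (assumed known).")]
    public float avatarModelHeight = 1.8f;
    [Tooltip("Approximate offset from eye level to the top of the head, in meters.")]
    public float eyeToTopOfHead = 0.11f;

    private bool wasPressed;

    void Update()
    {
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (rightHand.TryGetFeatureValue(CommonUsages.secondaryButton, out bool pressed))
        {
            if (pressed && !wasPressed) Calibrate();
            wasPressed = pressed;
        }
    }

    void Calibrate()
    {
        // Headset height above the XR origin approximates the user's standing eye
        // height; add an offset to estimate the full body height.
        float userHeight = Camera.main.transform.localPosition.y + eyeToTopOfHead;
        avatarRoot.localScale = Vector3.one * (userHeight / avatarModelHeight);
    }
}
```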
You can find the creation of the trackers database for training in the following script: https://github.com/UPC-ViRVIG/MMVR/blob/main/MMVR/Assets/DirectionPrediction/TrackersDataset.cs. Line 32 defines the offsets (in the local space of the trackers).
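Purely as an illustration of what "offsets in the local space of the trackers" means (hypothetical names and values; the actual definitions are in TrackersDataset.cs at the line mentioned above):

```csharp
using UnityEngine;

// Hypothetical example, not the MMVR code: offsets expressed in each tracker's
// local space and applied on top of the raw tracker pose.
public static class TrackerOffsetsExample
{
    // Example local-space positional offsets for head and hand trackers.
    public static readonly Vector3 HeadOffset = new Vector3(0.0f, -0.05f, -0.10f);
    public static readonly Vector3 LeftHandOffset = new Vector3(0.02f, -0.03f, -0.08f);
    public static readonly Vector3 RightHandOffset = new Vector3(-0.02f, -0.03f, -0.08f);

    // Convert a local-space offset into a corrected world-space position.
    public static Vector3 ApplyOffset(Transform tracker, Vector3 localOffset)
    {
        return tracker.TransformPoint(localOffset);
    }
}
```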
You can find it in the VRCharacterController, under the property Max Distance Simulation Bone.
I guess you want to use a third-person character for multiplayer/collaboration. Yes, the best solution would be to store the bones and transforms as you said, and then send them to other PCs using some multiplayer library (like Netcode for GameObjects). I would apply them in LateUpdate(): in case you are using some of Unity's animation features, those would otherwise override bone transforms applied in Update().
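A rough sketch of the receiving side of that approach (component and method names are assumptions, not part of MMVR or Netcode): the networked pose is cached when it arrives and applied in LateUpdate() so it is not overwritten by Unity's animation system.

```csharp
using UnityEngine;

// Sketch: receive bone transforms from the network and apply them in
// LateUpdate, after Unity's animation update has run.
public class RemoteCharacterPose : MonoBehaviour
{
    [Tooltip("Bones of the remote third-person character, in a fixed order agreed with the sender.")]
    public Transform[] bones;

    // Latest pose received from the network (e.g., via Netcode for GameObjects RPCs).
    private Vector3 rootPosition;
    private Quaternion[] boneRotations;

    // Called by your networking layer whenever a new pose arrives.
    public void OnPoseReceived(Vector3 newRootPosition, Quaternion[] newBoneRotations)
    {
        rootPosition = newRootPosition;
        boneRotations = newBoneRotations;
    }

    void LateUpdate()
    {
        if (boneRotations == null || boneRotations.Length != bones.Length) return;

        transform.position = rootPosition;
        for (int i = 0; i < bones.Length; i++)
        {
            bones[i].localRotation = boneRotations[i];
        }
    }
}
```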
Hope it helps! :)
Okay, got it. Thank you for clearing all my doubts patiently!
Hi @JLPM22,
I was planning to store the bone transforms from the `OnSkeletonTransformUpdated` method of `MotionMatchingSkinnedMeshRenderer` to some JSON file, and continuously apply them to the other third person character in the Update() method. Am I correct? Or is there a better way, or scripts/tools which you have already written for third person character motion? Thank you!
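For context, a hypothetical recorder along those lines (all names here are assumptions, and the capture call would be wired to wherever the skeleton update callback fires; MMVR's actual OnSkeletonTransformUpdated signature may differ):

```csharp
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Hypothetical sketch: capture the avatar's bone transforms each frame and
// dump the recording to JSON with Unity's JsonUtility.
[System.Serializable]
public class PoseFrame
{
    public Vector3 rootPosition;
    public Quaternion[] boneRotations;
}

[System.Serializable]
public class PoseRecording
{
    public List<PoseFrame> frames = new List<PoseFrame>();
}

public class BonePoseRecorder : MonoBehaviour
{
    public Transform root;
    public Transform[] bones;

    private readonly PoseRecording recording = new PoseRecording();

    // Call once per frame after the skeleton has been updated.
    public void CaptureFrame()
    {
        var frame = new PoseFrame
        {
            rootPosition = root.position,
            boneRotations = new Quaternion[bones.Length]
        };
        for (int i = 0; i < bones.Length; i++)
            frame.boneRotations[i] = bones[i].localRotation;
        recording.frames.Add(frame);
    }

    // Write the recorded frames to a JSON file.
    public void SaveToJson(string path)
    {
        File.WriteAllText(path, JsonUtility.ToJson(recording));
    }
}
```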