microsoft / MixedRealityToolkit-Unity

This repository is for the legacy Mixed Reality Toolkit (MRTK) v2. For the latest version of the MRTK, please visit https://github.com/MixedRealityToolkit/MixedRealityToolkit-Unity

OpenXR/Vive: different hand models in play and build mode #10297

Closed: vogr closed this issue 9 months ago

vogr commented 2 years ago

Describe the bug

When using MRTK with an HTC Vive (SteamVR) through the OpenXR plugin, the hand visualizer used in play mode is the one specified in the MRTK profile, but this setting is not respected in the resulting build. Instead, the hands are shown using the gizmo models.

To reproduce

Steps to reproduce the behavior:

  1. Install MRTK with the OpenXR plugin and run SteamVR.
  2. Connect an HTC Vive (with Vive controllers).
  3. Select a global hand visualizer for MRTK in Input > Controllers > Global Left/Right Visualizer (or leave the default RiggedHandLeft/Right). In Articulated Hand Tracking, keep Hand Mesh Visualization Modes set to "Everything".
  4. Click Play: in play mode, the hands shown are the ones selected in the MRTK options (see picture 1) and correctly track the Vive controllers.
  5. Click Build And Run, building for SteamVR (PC Standalone, Windows, x86_64).
  6. Run the build: the hand models do not use the RiggedHand model but the gizmo models (see picture 2).

Changing "Use Platform Models"

Expected behavior

The hand visualizer option should be respected in the build: in the previous example the hands should be shown using the RiggedHandVisualizer.

Screenshots

  1. In play mode, the hand visualizer is used correctly:

     (screenshot: PlayMode)

  2. When running a build, the hand visualizer specified in the settings is not used:

     (screenshot: BuildMode)

Additional context

Additionally, I think the callback IMixedRealityHandJointHandler.OnHandJointsUpdated is never called in the build. I am using this callback to transmit the hand joint data to a remote player: when the Vive is in play mode, the joint data is updated and sent to the remote player; when the Vive is running a build, the hand joint data is never sent (which makes me suspect that the callback is never called). I don't have a Vive at hand currently, but I can test this behavior more thoroughly next week if necessary.
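
For reference, a minimal handler along these lines can confirm from a build's player log whether the callback ever fires. This is only a sketch: the HandJointLogger class name is hypothetical, though the interface and registration calls are standard MRTK2 API.

```csharp
using System.Collections.Generic;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Hypothetical diagnostic component (not part of MRTK): logs every
// OnHandJointsUpdated call so the player log shows whether the event fires.
public class HandJointLogger : MonoBehaviour, IMixedRealityHandJointHandler
{
    private void OnEnable()
    {
        // Register globally for hand joint events.
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityHandJointHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityHandJointHandler>(this);
    }

    public void OnHandJointsUpdated(InputEventData<IDictionary<TrackedHandJoint, MixedRealityPose>> eventData)
    {
        // Log one representative joint per update.
        if (eventData.InputData.TryGetValue(TrackedHandJoint.Palm, out MixedRealityPose palm))
        {
            Debug.Log($"{eventData.Handedness} palm at {palm.Position}");
        }
    }
}
```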

polar-kev commented 2 years ago

Hey @vogr, when you're in play mode, are you using your motion controllers to control the hands, or are you using the mouse and keyboard for input simulation?

Does your scenario require hand detection? If so, have you considered using a Leap Motion controller with your Vive for articulated hand support?

vogr commented 2 years ago

Hi @polar-kev, in that case I was using the motion controllers in play mode (the simulated input also works).

We don't need hand detection for our scenario, or at least not when using Vive+Controllers; but we are trying to use the same OpenXR project for both Vive+Controllers and HoloLens 2 with hand detection.

We are using the joint data from a local user to display the hands (tracked or from the controllers depending on the setup) to a remote user. This works when the Vive+Controllers is running in play mode but not in the build.

Maybe you could clarify one point that I'm not sure about: is it reasonable to expect "pretty" articulated hand models (locally) and hand joint data (to send to the remote) on the Vive when using controllers? This is what we get in play mode: the joint data helps us display the position of the hands and the gripping animation.

keveleigh commented 2 years ago

This is likely caused by the bug tracked by https://github.com/microsoft/MixedRealityToolkit-Unity/issues/9849. Basically, the OpenXR integration into Unity works by spinning up two InputDevices per hand/controller: one for providing the button/thumbstick/airtap/etc interactions and one for the hand joints. MRTK currently makes an incorrect assumption that only a hand can provide hand joints. Therefore, when we receive the hand joint InputDevice first, we spin up the hand joint visualization. When we receive the interactions InputDevice first, we spin up the controller visualization. I'm not specifically sure about the play mode vs build behavior you're seeing, but it could be that each environment tends towards providing the two InputDevices in a specific order for some reason... I've been making steps towards fixing this in MRTK 2.8, but it's not completed yet.
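
One way to observe this from the app side is to dump each XR InputDevice and its characteristics as it connects, in both play mode and a build, and compare the order in which the hand-joint and controller devices appear. A sketch using Unity's XR InputDevices API (the XRDeviceDump name is hypothetical):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Hypothetical diagnostic component: logs every XR InputDevice and its
// characteristics flags (Controller, HandTracking, Left, Right, ...) so the
// per-hand device pair and its arrival order can be compared across runs.
public class XRDeviceDump : MonoBehaviour
{
    private void OnEnable()
    {
        InputDevices.deviceConnected += OnDeviceConnected;

        // Also dump devices that connected before this component was enabled.
        var devices = new List<InputDevice>();
        InputDevices.GetDevices(devices);
        foreach (var device in devices)
        {
            OnDeviceConnected(device);
        }
    }

    private void OnDisable()
    {
        InputDevices.deviceConnected -= OnDeviceConnected;
    }

    private void OnDeviceConnected(InputDevice device)
    {
        Debug.Log($"XR device '{device.name}' characteristics: {device.characteristics}");
    }
}
```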

zxzkf1992 commented 2 years ago

Hi @vogr, I have a problem with the HTC Vive controller in MRTK. I can see the hand model in play mode, and it tracks the controller correctly, but the button mapping of the controller is not correct. I want to press the corresponding button on the controller and have the hand model perform the matching default gesture in MRTK, but no matter which button I use, the hand model never shows any of the default gestures, so I suspect the Vive controller buttons are not mapped to MRTK. Have you encountered this situation, or do you know how to map the Vive controller buttons to MRTK (like trigger -> Select, grip -> Grab, and so on)?
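
As a debugging aid, a handler along these lines logs which MRTK input action (Select, Grip Press, etc.) each button press raises, which shows whether the active controller mapping profile routes the Vive buttons at all. This is a sketch; the InputActionLogger name is hypothetical, but the handler interface is standard MRTK2 API.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical diagnostic component: logs the MRTK input action raised by
// each digital input, to verify the controller mapping profile is applied.
public class InputActionLogger : MonoBehaviour, IMixedRealityInputHandler
{
    private void OnEnable()
    {
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityInputHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityInputHandler>(this);
    }

    public void OnInputDown(InputEventData eventData)
    {
        Debug.Log($"Down: '{eventData.MixedRealityInputAction.Description}' from source {eventData.SourceId}");
    }

    public void OnInputUp(InputEventData eventData)
    {
        Debug.Log($"Up: '{eventData.MixedRealityInputAction.Description}'");
    }
}
```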

IssueSyncBot commented 9 months ago

We appreciate your feedback and thank you for reporting this issue.

Microsoft Mixed Reality Toolkit version 2 (MRTK2) is currently in limited support. This means that Microsoft is only fixing high priority security issues. Unfortunately, this issue does not meet the necessary priority and will be closed. If you strongly feel that this issue deserves more attention, please open a new issue and explain why it is important.

Microsoft recommends that all new HoloLens 2 Unity applications use MRTK3 instead of MRTK2.

Please note that MRTK3 was released in August 2023. It features an all-new architecture for developing rich mixed reality experiences and has a minimum requirement of Unity 2021.3 LTS. For more information about MRTK3, please visit https://www.mixedrealitytoolkit.org.

Thank you for your continued support of the Mixed Reality Toolkit!