Closed: vogr closed this issue 9 months ago
Hey @vogr, when you're in playmode, are you using your motion controllers to control the hands or are you using the mouse and keyboard for input simulation?
Does your scenario require hand detection? If so, have you considered using a leap motion controller with your vive for articulated hand support?
Hi @polar-kev, in play mode in that case I was using the motion controllers (the simulated input also works).
We don't need hand detection for our scenario, or at least not when using Vive+Controllers; but we are trying to use the same OpenXR project for both Vive+Controllers and Hololens 2 with hand detection.
We are using the joint data from a local user to display the hands (tracked or from the controllers depending on the setup) to a remote user. This works when the Vive+Controllers is running in play mode but not in the build.
Maybe you could clarify one point I'm not sure about: is it reasonable to expect "pretty" articulated hand models (locally) and hand joint data (to send to the remote player) on the Vive when using controllers? This is what we get in play mode: the joint data lets us display the position of the hands and the gripping animation.
This is likely caused by the bug tracked by https://github.com/microsoft/MixedRealityToolkit-Unity/issues/9849.
Basically, the OpenXR integration into Unity works by spinning up two InputDevices per hand/controller: one providing the button/thumbstick/airtap/etc. interactions and one providing the hand joints.
MRTK currently makes the incorrect assumption that only a hand can provide hand joints. Therefore, when we receive the hand joint InputDevice first, we spin up the hand joint visualization. When we receive the interactions InputDevice first, we spin up the controller visualization.
I'm not specifically sure about the play mode vs. build behavior you're seeing, but it could be that each environment tends to provide the two InputDevices in a specific order for some reason...
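If it helps to confirm which InputDevice arrives first on each platform, here is a minimal diagnostic sketch (not MRTK source code, just an assumption of how one might observe the ordering) using Unity's `InputDevices.deviceConnected` event and `InputDeviceCharacteristics` flags:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical diagnostic component: logs the order in which Unity's XR
// subsystem surfaces InputDevices, so you can see whether the hand-joint
// device or the interaction (controller) device is reported first.
public class InputDeviceOrderLogger : MonoBehaviour
{
    private void OnEnable() => InputDevices.deviceConnected += OnDeviceConnected;
    private void OnDisable() => InputDevices.deviceConnected -= OnDeviceConnected;

    private void OnDeviceConnected(InputDevice device)
    {
        bool providesJoints =
            (device.characteristics & InputDeviceCharacteristics.HandTracking) != 0;
        bool isController =
            (device.characteristics & InputDeviceCharacteristics.Controller) != 0;
        Debug.Log($"frame {Time.frameCount}: '{device.name}' " +
                  $"handTracking={providesJoints} controller={isController}");
    }
}
```

Running this in both play mode and a build (with a log viewer or file logging) should show whether the two environments really enumerate the devices in different orders.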
I've been making steps towards fixing this in MRTK 2.8, but it's not completed yet.
Hi @vogr, I have a problem with the HTC Vive controller in MRTK. In play mode I can see the hand model and the controller is tracked correctly, but the button mapping of the controller is not correct. When I press a button on the controller, the hand model never performs any of the default gestures in MRTK, no matter which button I use, so I suspect the Vive controller's buttons are not mapped to MRTK. Have you encountered this situation, or do you know how to map the Vive controller buttons to MRTK (e.g. trigger -> select, grip -> grab, and so on)?
We appreciate your feedback and thank you for reporting this issue.
Microsoft Mixed Reality Toolkit version 2 (MRTK2) is currently in limited support. This means that Microsoft is only fixing high priority security issues. Unfortunately, this issue does not meet the necessary priority and will be closed. If you strongly feel that this issue deserves more attention, please open a new issue and explain why it is important.
Microsoft recommends that all new HoloLens 2 Unity applications use MRTK3 instead of MRTK2.
Please note that MRTK3 was released in August 2023. It features an all-new architecture for developing rich mixed reality experiences and has a minimum requirement of Unity 2021.3 LTS. For more information about MRTK3, please visit https://www.mixedrealitytoolkit.org.
Thank you for your continued support of the Mixed Reality Toolkit!
Describe the bug
When using MRTK with an HTC Vive (SteamVR) via the OpenXR plugin, the hand visualizer used in play mode is the one specified in the MRTK options, but this option is not respected in the resulting build. Instead, the hands are shown using the gizmo models.
To reproduce
Steps to reproduce the behavior:
Changing "Use Platform Models"
Expected behavior
The hand visualizer option should be respected in the build: in the previous example the hands should be shown using the RiggedHandVisualizer.
Screenshots
Your setup (please complete the following information)
Target platform (please complete the following information)
Additional context
Additionally, I think the callback IMixedRealityHandJointHandler.OnHandJointsUpdated is never called in the build. I am using this callback to transmit the hand joint data to a remote player: when the Vive is in play mode, the joint data is updated and sent to the remote player, but when the Vive is running a build, the hand joint data is never sent (which makes me suspect the callback is never called). I don't have a Vive at hand currently, but I can test this behavior more thoroughly next week if necessary.
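To check whether the callback fires at all in a build, a minimal global handler can be registered that just logs each joint update. This is a sketch based on MRTK2's IMixedRealityHandJointHandler interface; the component name and logging are illustrative only:

```csharp
using System.Collections.Generic;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Hypothetical debug component: registers as a global MRTK input handler and
// logs every hand joint update, so a build's log shows whether
// OnHandJointsUpdated is ever invoked.
public class HandJointDebugLogger : MonoBehaviour, IMixedRealityHandJointHandler
{
    private void OnEnable() =>
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityHandJointHandler>(this);

    private void OnDisable() =>
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityHandJointHandler>(this);

    public void OnHandJointsUpdated(
        InputEventData<IDictionary<TrackedHandJoint, MixedRealityPose>> eventData)
    {
        if (eventData.InputData.TryGetValue(TrackedHandJoint.Palm, out var palm))
        {
            Debug.Log($"{eventData.Handedness} palm at {palm.Position}");
        }
    }
}
```

If the log line appears in play mode but never in the build, that would confirm the callback itself is not being raised there, rather than a problem in the networking code.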