Closed machenmusik closed 9 months ago
Note it seems that there is this very feature being introduced to XRTK https://github.com/XRTK/XRTK-Core/pull/552
I'm working on making custom interaction mappings and custom controller / hand definitions more possible. I'll work on getting my proposal fully written up and proposed here.
@provencher I'm not sure I see what in that PR would enable this feature request. It looks like they're adding the same (well, a similar, with one additional) set of explicitly defined interactions in the MixedRealityHandController class as we defined a while back in MRTK.
@keveleigh yes, for the interaction parts you're correct, but there are a few useful elements in that PR that relate to this:
HandDataPostProcessor, which runs a platform-independent post-processing pass on hand joints to derive values like grip and pinch strength, which can then be leveraged as interaction mappings.
HandPoseRecognizer, which allows you to map poses defined in the editor to interactions. This part is less concretely utilized, and it remains to be seen how well the concept works in and of itself.
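For illustration, the grip/pinch-strength idea can be sketched platform-independently: remap the thumb-to-index fingertip distance into a normalized 0..1 strength value. This is a minimal sketch, not the actual HandDataPostProcessor logic; the joint inputs and calibration distances are made up.

```python
import math

def pinch_strength(thumb_tip, index_tip, open_dist=0.08, closed_dist=0.02):
    """Map thumb-to-index fingertip distance (meters) to a 0..1 pinch strength.

    open_dist / closed_dist are illustrative calibration values only.
    """
    d = math.dist(thumb_tip, index_tip)
    # Linearly remap: at open_dist or beyond -> 0, at closed_dist or less -> 1.
    t = (open_dist - d) / (open_dist - closed_dist)
    return max(0.0, min(1.0, t))
```

A value like this could then back a single-axis interaction mapping, the same way trigger pressure does on a motion controller.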
> HandDataPostProcessor does a post process on hand joints, independent of the platform, to provide things like grip and pinch strength, which can be leveraged as interaction mappings.
While we don't currently support a grip / pinch strength interaction, we do have something similar which is currently utilized in our Leap Motion support to determine grab / pinch (but is platform-independent):
WMR articulated hands use this class as a base as well. The intent behind this definition is definitely for it to grow as more features are needed. It was the first step in the controller mapping rework I've been referring to: refactoring the interaction definition out into a single class, so all classes that choose to have "articulated hand interactions" behave as similarly as possible and don't need to re-define identical interactions in each class. It will eventually also allow you to mark controller classes as the same physical controller via a definition class instead of via the SupportedControllerTypes enum, which is how it's handled today and isn't extendable.
> HandPoseRecognizer allows you to leverage defined poses in the editor, to map to interactions. This part is less concretely utilized, and it remains to be seen how well this concept works in and of itself.
This one is interesting! It sounds like it uses some version of our ArticulatedHandPoses to match runtime poses against.
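That kind of recognizer can be sketched as a nearest-template match: compare wrist-relative joint positions against each recorded pose and accept the closest one under an error threshold. Everything below (names, data shapes, the threshold) is illustrative, not XRTK's or MRTK's actual API.

```python
import math

def match_pose(joints, templates, threshold=0.01):
    """Return the name of the closest recorded pose, or None if nothing is close.

    joints: dict of joint name -> (x, y, z), already wrist-relative.
    templates: dict of pose name -> same-shaped joint dict.
    threshold: maximum mean squared joint distance to accept a match.
    """
    best_name, best_err = None, float("inf")
    for name, template in templates.items():
        # Compare only the joints both poses define.
        keys = joints.keys() & template.keys()
        if not keys:
            continue
        err = sum(math.dist(joints[k], template[k]) ** 2 for k in keys) / len(keys)
        if err < best_err:
            best_name, best_err = name, err
    return best_name if best_err <= threshold else None
```

In practice the templates would come from recorded ArticulatedHandPoses-style data rather than being hand-authored dictionaries.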
Super happy we're having these conversations, as there's plenty of room to grow our hand support!
On the note of the IsPinching code, I think there is value in generalizing the logic in that getter to do a few things:
You could leverage the same logic with say the palm joint, to determine if any given finger is closed, and from there, get an IsGripping pose. I've gotten feedback on MRTK Quest that folks want to easily grab objects by closing their hands, like they can on HL2, which isn't supported out of the box on Quest.
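As a rough sketch of that generalization, the same distance test can be run from each non-thumb fingertip to the palm joint to derive an IsGripping signal. The joint names and curl threshold below are hypothetical, chosen only to illustrate the idea.

```python
import math

def is_gripping(palm, fingertips, curl_dist=0.06):
    """True when all non-thumb fingertips are within curl_dist (meters) of the palm.

    fingertips: dict of finger name -> (x, y, z).
    curl_dist is an illustrative threshold, reusing the IsPinching-style
    distance check against the palm joint instead of the thumb tip.
    """
    fingers = ("index", "middle", "ring", "pinky")
    return all(math.dist(palm, fingertips[f]) < curl_dist for f in fingers)
```

That would let a closed fist drive the same grab interaction on Quest that HL2 users already get out of the box.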
Glad these conversations are allowing the capabilities of Articulated hands to grow!
hi folks, take a look at https://github.com/provencher/MRTK-Quest/pull/52 and feel free to share your thoughts
The question of what to do with incomplete information has come up there, and @keveleigh, while you are considering refactoring, I thought it may also be valuable to consider detecting, and behaving differently in response to, temporary tracking interruptions (flickers/blips) vs. longer-duration outages.
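One simple way to tell the two apart is a grace-period filter: keep reporting the hand as tracked until the raw signal has been gone longer than some window, and only then treat it as a real outage. A minimal sketch, where the grace duration is an arbitrary illustrative value:

```python
class TrackingFilter:
    """Distinguish brief tracking blips from real loss of tracking.

    grace is an illustrative window (seconds): loss shorter than it is
    reported as still tracked; anything longer counts as a real outage.
    """

    def __init__(self, grace=0.25):
        self.grace = grace
        self._lost_since = None

    def update(self, raw_tracked, now):
        """Feed the raw tracked flag each frame; returns the filtered state."""
        if raw_tracked:
            self._lost_since = None
            return True
        if self._lost_since is None:
            self._lost_since = now
        # Still within the grace window -> treat as a flicker, not an outage.
        return (now - self._lost_since) < self.grace
```

A consumer could hold the last known hand pose during the flicker phase and only tear down interaction state once the filter reports a genuine outage.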
@provencher has merged https://github.com/provencher/MRTK-Quest/pull/52; however, the change is not Quest-specific. @keveleigh, it might be good to provide it for all ArticulatedHand implementations.
This issue has been marked as stale by an automated process because it has not had any recent activity. It will be automatically closed in 30 days if no further activity occurs. If this is still an issue please add a new comment with more recent details and repro steps.
I don't believe this issue is resolved; I'd suggest not closing it yet.
We appreciate your feedback and thank you for reporting this issue.
Microsoft Mixed Reality Toolkit version 2 (MRTK2) is currently in limited support. This means that Microsoft is only fixing high priority security issues. Unfortunately, this issue does not meet the necessary priority and will be closed. If you strongly feel that this issue deserves more attention, please open a new issue and explain why it is important.
Microsoft recommends that all new HoloLens 2 Unity applications use MRTK3 instead of MRTK2.
Please note that MRTK3 was released in August 2023. It features an all-new architecture for developing rich mixed reality experiences and has a minimum requirement of Unity 2021.3 LTS. For more information about MRTK3, please visit https://www.mixedrealitytoolkit.org.
Thank you for your continued support of the Mixed Reality Toolkit!
Describe the bug
Articulated hands do not allow custom interaction mappings
To reproduce
To reproduce the behavior, notice that even if you define your own input system and controller mapping, you can't add new entries to articulated hands.
Expected behavior
If you go to the trouble to define custom mappings, that ability should be supported.
Note that this is increasingly painful if one wants to build upon articulated hands: the lack of custom interaction mappings on the base class, and the enforced merging of behaviors that results, forces the creation of a new input source type in MRTK.
Screenshots
Your setup (please complete the following information)
Target platform (please complete the following information)
Additional context
Again, thanks for the excellent discussion in HoloDevelopers Slack https://holodevelopers.slack.com/archives/CTW7K59U4/p1590468076355100?thread_ts=1590361318.325600&cid=CTW7K59U4