Is your feature request related to a problem? Please describe.
While Hubs provides a number of interaction paradigms depending on device type, we should have a mode that distills core interactions and navigation down to a binary input control with varying input frequencies (short/long/continuous).
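As a rough illustration of what "binary input with varying frequencies" could mean, here is a minimal sketch that classifies a single-switch press by how long it is held. The function name and threshold values are hypothetical, not part of Hubs:

```typescript
// Hypothetical sketch: map a single binary control to distinct actions
// by press duration. Thresholds below are illustrative assumptions.
type PressKind = "short" | "long" | "continuous";

const LONG_PRESS_MS = 400; // assumed boundary: shorter presses count as "short"
const CONTINUOUS_PRESS_MS = 1200; // assumed boundary: longer holds count as "continuous"

function classifyPress(durationMs: number): PressKind {
  if (durationMs < LONG_PRESS_MS) return "short";
  if (durationMs < CONTINUOUS_PRESS_MS) return "long";
  return "continuous";
}
```

Each press kind could then be bound to a core action (e.g. select, open menu, move forward), so the full interaction set remains reachable from one switch.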
Consider this as part of a longer-term sprint on accessibility. We need to fully understand how we surface customization in general, as well as the accessibility of the system.
I started thinking about this as a result of this work: https://blog.prototypr.io/accessible-locomotion-and-interaction-in-webxr-e4d87c512e51
More detail to come.