immersive-web / webxr-gamepads-module

Repository for the WebXR Gamepads Module
https://immersive-web.github.io/webxr-gamepads-module

Should hand-based input sources have a Gamepad? #23

Closed Manishearth closed 2 years ago

Manishearth commented 4 years ago

See also: https://github.com/immersive-web/webxr-input-profiles/issues/105

AR devices may decide to expose hands as input sources. These can fire some kind of pinch/tap select event, and potentially a squeeze event.

These aren't technically gamepads, and it feels a bit weird to expose a Gamepad object for them. The information such an object would expose can already be calculated via event listeners. On the other hand, apps written to consume Gamepad objects instead of the events would not work on AR devices with hand inputs.

Should we be requiring that hand inputs have a Gamepad? Should this be a matter of UA choice? Or should we say that only actual gamepads (with buttons and stuff) should use Gamepad?
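For concreteness, the two consumption patterns being contrasted can be sketched roughly like this (the helper names are illustrative, not from any spec; only `inputSource.gamepad`, `handedness`, and the `select` event come from WebXR):

```javascript
// Pattern 1: event-driven. Works whether or not a Gamepad is exposed,
// since select/squeeze events fire for hand inputs too.
function registerSelectHandler(session, onAction) {
  session.addEventListener('select', (event) => {
    onAction(event.inputSource.handedness);
  });
}

// Pattern 2: per-frame Gamepad polling. Silently reports "not pressed"
// when inputSource.gamepad is null -- the breakage this issue worries about.
function pollSelect(inputSource) {
  const gp = inputSource.gamepad;
  return !!(gp && gp.buttons.length > 0 && gp.buttons[0].pressed);
}
```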

NellWaliczek commented 4 years ago

To add a bit more context... it's not just AR that will do this. Oculus has already stated they will expose the ability to track hands. And devices like the Valve Index have the ability to report gamepad data and hand skeletons.

Manishearth commented 4 years ago

The hope for these is that in the future we'll have an inputSources.hand API that provides articulated hand input, and apps written for hand-based devices would be encouraged to use that if they want a view of the hands other than the two main events.

As Nell said, this would be useful for non-AR as well, including non-AR gamepad-like inputs that can infer hand position.

When discussing this with @thetuvix he pointed out a symmetry here: apps consuming non-hand gamepad inputs may want to view the input as a hand in certain scenarios (e.g. rendering a hand model), and it may be worth exposing the reverse option so that non-gamepad hand inputs can be viewed as a gamepad.

Manishearth commented 4 years ago

/agenda to be discussed alongside https://github.com/immersive-web/webxr-input-profiles/issues/105

Artyom17 commented 4 years ago

Oculus Quest hand tracking will initially have pretty much one gesture that can be used as gamepad input: "pinch", i.e. "select". It can be mapped to a button with 0 or 1 states. The second gesture will be used as the "exit VR" button and won't be available for gamepad button mapping.
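A hypothetical sketch of how a UA might surface that single binary pinch gesture as a GamepadButton-shaped entry (the function name and threshold are assumptions for illustration, not Oculus's actual implementation):

```javascript
// Map a pinch strength in [0, 1] to a GamepadButton-like object that
// only reports 0 or 1 states, as described above.
function pinchToGamepadButton(pinchStrength) {
  const pressed = pinchStrength >= 1.0; // full pinch counts as "select"
  return {
    pressed,
    touched: pinchStrength > 0,
    value: pressed ? 1 : 0, // binary: no analog trigger range for a pinch
  };
}
```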

Manishearth commented 4 years ago

Does anyone remember the resolution here?

Manishearth commented 4 years ago

/agenda to rediscuss if necessary

Manishearth commented 4 years ago

related: https://github.com/immersive-web/webxr/issues/997

Manishearth commented 4 years ago

https://www.w3.org/2019/12/03-immersive-web-minutes.html

Seemed like we wanted to add gamepads, but we need to resolve the profiles issue first.

Artyom17 commented 4 years ago

I don't have objections to exposing a gamepad object for hands.

Manishearth commented 4 years ago

No clear conclusion on this today. There are a couple of paths forward:

If we add generic-hand input profiles we might be able to tie a lot of these things to those input profiles.

It would be somewhat nice to avoid gamepads and instead deal with hands holistically, with authors writing their code accordingly. However, this depends on how much application code in the wild has started depending on the gamepad object for select/squeeze. Babylon already had this issue, though they fixed it.

Manishearth commented 4 years ago

My understanding is that Oculus exposes a gamepad for hands now

toji commented 2 years ago

Updating this with the latest information that I have:

It's worth noting that one thing that has changed since this conversation started is that hand input now requires a feature descriptor to be given at session creation time. This means that most developers receiving hand inputs will have explicitly requested them, and shouldn't be surprised to see inputs with no explicit gamepad.

However, exposing a gamepad alongside hand input is already the behavior of the most widespread platform with hand input, and it naturally extends to the use case Nell mentioned of devices like the Index controllers generating both (emulated) hand poses and gamepad inputs. So I think we definitely shouldn't disallow gamepads with hand inputs. That leaves either mandating gamepad support or leaving it as a UA choice.

In this case I'd advocate for leaving it as a UA choice. We could mandate it, but we wouldn't want to specify the exact gamepad layout, as that could restrict interesting future use cases. And if we're not specifying a layout, then implementors who feel that gamepad input isn't necessary may simply end up adding a device with no buttons and no axes to satisfy an arbitrary requirement, which puts developers in a worse situation than before.

Again, the Index controller comes to mind as a potential use case, but my understanding is also that the upcoming Lynx-R1 relies on its hand tracking capabilities to drive its controllers. It's not clear to me what the most natural way for that platform to expose its inputs will be, but they've said publicly that they intend to have a WebXR-capable browser for the device, so I'd like to allow them the flexibility to experiment here.

So my recommendation would be that we leave the exact behavior of gamepads for hand inputs up to the UA. Functionally, that will likely mean almost all of them expose one anyway for compatibility's sake, but it offers flexibility as input methods evolve.
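Under a UA-choice policy like the one recommended above, several input shapes can legitimately appear in a session's input sources, so a robust app would branch on what's actually present instead of assuming a gamepad. A minimal sketch (the classification labels are mine; `hand` is the articulated-hand attribute from the WebXR Hand Input module, and `gamepad` may be null at the UA's discretion):

```javascript
// Classify an XRInputSource by which optional attributes the UA chose
// to populate. Works on anything with .hand / .gamepad fields.
function describeInput(inputSource) {
  if (inputSource.hand && inputSource.gamepad) return 'hand-with-gamepad';
  if (inputSource.hand) return 'hand-only';
  if (inputSource.gamepad) return 'controller';
  return 'unknown';
}
```

An app would then poll gamepad buttons only for the branches that have one, and fall back to the select/squeeze events otherwise.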