immersive-web / webxr

Repository for the WebXR Device API Specification.
https://immersive-web.github.io/webxr/

Explainer needed for: Input Devices attached to VRDisplays #176

Closed bfgeek closed 6 years ago

bfgeek commented 7 years ago

We briefly chatted about this at the Seattle F2F. I thought I'd capture the discussion here (from what I remember) so we can build it into an explainer.

Ideally you want to have input devices attached to the vrDisplay, e.g.

const vrDisplay = ...;

vrDisplay.inputDevices.length; // 2 perhaps if there are two hand controllers.

It'd be nice to have a proper class hierarchy for the different types of input devices (I'll just use IDL quickly to sketch this):

interface InputDevice { };

interface VRHandInputDevice : InputDevice { };
VRHandInputDevice implements VRThingThatHasPosition;

interface VoiceInputDevice : InputDevice { };

Each of these would have an API appropriate for that device, e.g. the voice device wouldn't have "buttons".

This could work with the gamepad API, e.g.

const gamepad = ...;
const device = new GamepadInputDevice(gamepad);

vrDisplay.inputDevices.append(device);

Depending on what the input device supports, this would let you hook up an "action" event which the vrDisplay could use internally. E.g.

// inside vrDisplay.native.cpp
inputDevice.addEventListener('action', (e) => {
  // do something native to the UAs experience.
});

Developers could also provide their own input devices based off the Bluetooth or WebUSB spec, e.g.

class MyArduinoInputDevice extends InputDevice {
  onSomeBluetoothEvent() {
    this.dispatchEvent(new Event('action'));
  }
}
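A slightly fuller, runnable sketch of that developer-defined-device idea, assuming InputDevice is an EventTarget subclass as above. The Bluetooth wiring here is stand-in code (in a real page `onSomeBluetoothEvent` would be driven by a Web Bluetooth `characteristicvaluechanged` listener; here it's called directly):

```javascript
// Sketch: a developer-provided input device that raises the same
// 'action' event a built-in device would.
class InputDevice extends EventTarget {}

class MyArduinoInputDevice extends InputDevice {
  // Stand-in for a Web Bluetooth notification handler.
  onSomeBluetoothEvent() {
    this.dispatchEvent(new Event('action'));
  }
}

const device = new MyArduinoInputDevice();
device.addEventListener('action', () => {
  console.log('custom device fired an action');
});
device.onSomeBluetoothEvent();  // simulate a Bluetooth notification
```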

cc/ @esprehn

The proposed API should be written up in an explainer first. E.g.

https://github.com/w3c/ServiceWorker/blob/master/explainer.md
https://github.com/WICG/IntersectionObserver/blob/gh-pages/explainer.md
https://github.com/w3c/css-houdini-drafts/blob/master/css-layout-api/EXPLAINER.md

daoshengmu commented 7 years ago

Adding an example showing how this API could be used to update the pose of VR input devices in JS.

var vrControllerModel;  // VR controller model for displaying in the 3d space with WebGL.

for (var i = 0; i < vrDisplay.inputDevices.length; ++i) {
  var inputDevice = vrDisplay.inputDevices[i];
  inputDevice.addEventListener('posemove', (e) => {
    var vrController = e.target;
    var controllerMat = new Float32Array(16);

    if (vrController.pose) {
      getPoseMatrix(controllerMat, vrController.pose);
      vrControllerModel.drawWithMatrix(controllerMat);
    }
  });
}
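For completeness, here is one way the `getPoseMatrix` helper above could be implemented: composing the pose's orientation quaternion and position into a column-major 4x4 matrix (the layout WebGL's `uniformMatrix4fv` expects). The helper itself is not part of any spec; the pose field names follow the WebVR `VRPose` shape (`orientation` as `[x, y, z, w]`, `position` as `[x, y, z]`):

```javascript
// Hypothetical implementation of the getPoseMatrix() helper used above.
// Writes a column-major 4x4 model matrix into `out` from a WebVR-style pose.
function getPoseMatrix(out, pose) {
  const [x, y, z, w] = pose.orientation || [0, 0, 0, 1];
  const [px, py, pz] = pose.position || [0, 0, 0];

  const x2 = x + x, y2 = y + y, z2 = z + z;
  const xx = x * x2, xy = x * y2, xz = x * z2;
  const yy = y * y2, yz = y * z2, zz = z * z2;
  const wx = w * x2, wy = w * y2, wz = w * z2;

  // Rotation part (quaternion to matrix, column-major).
  out[0] = 1 - (yy + zz); out[1] = xy + wz;       out[2]  = xz - wy;       out[3]  = 0;
  out[4] = xy - wz;       out[5] = 1 - (xx + zz); out[6]  = yz + wx;       out[7]  = 0;
  out[8] = xz + wy;       out[9] = yz - wx;       out[10] = 1 - (xx + yy); out[11] = 0;
  // Translation part.
  out[12] = px; out[13] = py; out[14] = pz; out[15] = 1;
  return out;
}

const mat = new Float32Array(16);
getPoseMatrix(mat, { orientation: [0, 0, 0, 1], position: [1, 2, 3] });
// Identity rotation, so this is just a translation by (1, 2, 3).
```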

huningxin commented 7 years ago

It sounds like a good idea to expose various VR input devices, including the emerging hand tracking and gesture recognition on some VR/MR HMDs.

Adding an example of using gaze and a click gesture to select an object in WebVR.

var raycaster; // Gaze raycaster by head position and orientation.

for (var i = 0; i < vrDisplay.inputDevices.length; ++i) {
  var inputDevice = vrDisplay.inputDevices[i];
  if (inputDevice instanceof VRHandTrackingDevice) {
    inputDevice.addEventListener('click', (e) => {
      var element = raycaster.intersectedEls[0];
      element.emit('select');
    });
  }
}
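A sketch of what the gaze raycaster itself might do. The ray/sphere intersection below is hand-rolled and purely illustrative (a real app would use its engine's raycaster, e.g. three.js's), and objects are modeled as spheres just for this example:

```javascript
// Illustrative gaze raycaster: casts a ray from the head pose and
// returns the objects it hits, nearest first. Objects are modeled as
// spheres ({center: [x, y, z], radius}) for this sketch only.
function intersectGaze(origin, direction, objects) {
  const hits = [];
  for (const obj of objects) {
    // Ray/sphere test: solve |origin + t*direction - center|^2 = r^2 for t >= 0.
    const oc = origin.map((v, i) => v - obj.center[i]);
    const b = oc[0] * direction[0] + oc[1] * direction[1] + oc[2] * direction[2];
    const c = oc[0] ** 2 + oc[1] ** 2 + oc[2] ** 2 - obj.radius ** 2;
    const disc = b * b - c;  // assumes `direction` is normalized
    if (disc < 0) continue;  // ray misses this sphere
    const t = -b - Math.sqrt(disc);
    if (t >= 0) hits.push({ object: obj, distance: t });
  }
  return hits.sort((h1, h2) => h1.distance - h2.distance);
}

// Gaze straight down -Z from a head at the origin.
const intersected = intersectGaze(
  [0, 0, 0], [0, 0, -1],
  [{ center: [0, 0, -5], radius: 1 }, { center: [3, 0, -5], radius: 1 }]
);
console.log(intersected.length);       // 1: only the sphere on the gaze line
console.log(intersected[0].distance);  // 4: distance to its near surface
```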

huningxin commented 7 years ago

According to the WebVR Origin Trial results so far (Chrome 57):

> A fair amount of developer feedback centered around difficulty of using the gamepad API for input. A dedicated VR input API is being considered as a result.

Should we scope VRInputDevice for 2.0? @toji @cvan @NellWaliczek , thoughts?

toji commented 6 years ago

This is resolved by the new input system for WebXR. Closing.