mrdoob / three.js

JavaScript 3D Library.
https://threejs.org/
MIT License

ImmersiveControls #21039

Open · felixmariotto opened this issue 3 years ago

felixmariotto commented 3 years ago

Making a good immersive VR controls module is long and complex, because of the many basic cases to cover:

This is an issue because:

I found myself creating a "VRControls" module on my first VR project, and I have re-used it ever since, improving it now and then... and I'm sure everyone using three.js for VR does the same thing (not suggesting to use mine, it's unsuitable). I think it would be beneficial for everybody if we had a community tool for this, a new controls module in the examples.

Example API :

import { ImmersiveControls } from 'three/examples/jsm/controls/ImmersiveControls.js';

const controls = new ImmersiveControls( camera, domElement );

controls.addEventListener( 'click', (e) => {

    e.origin; // 'mouse', 'right-controller', 'left-hand'...

    // cast a ray either from the camera or the controller(s)
    const intersects = controls.intersectObjects( scene.children );

});

controls.addEventListener( 'move', (e) => {

    // intersect bounding sphere(s) at controller(s) or hand joint(s) position
    if ( controls.intersectsPlane( plane ) ) {

        console.log( `${ e.origin } is intersecting the plane` );

    }

});

See how life would be easier: the three.js user would not have to care about the current state of immersion. They would just listen to standard events and do intersection tests through a higher-level API. It would make for more standard and healthy code.

Of course this would not fit every case of every VR project, but most of the time it would be enough, and it would definitely help people hit the ground running.

If there is interest in this, I'm willing to work on a PR ( or to let somebody more capable do it 😉 ) after discussing the scope and API.

gkjohnson commented 3 years ago

I very much support adding something like this to the examples. I just started doing some WebXR dev and it would be great if this could cut down on the amount of boilerplate users have to write in the way that classes like OrbitControls and FlyControls have for mouse and touch apps.

Making it simpler to create and instantiate controllers like you've suggested would be great. I'm not sure if these are within the scope of what you're imagining but here are a few common VR paradigms I was thinking about for some of the demos I put together that would be nice to be able to easily integrate:

felixmariotto commented 3 years ago

I'm glad you like the idea !

All the points you mentioned are relevant, and there are plenty of native VR games and experiments from which we can pick more examples of good ergonomics.

Teleport arc and / or ray movement (see performant oculus implementation description here).

This is a very interesting read, I wish we could integrate such a feature into (the hypothetical) ImmersiveControls in a way that makes sense.

The most important thing, I think, is this idea of a standard API, whatever the state of immersion. So it could look like this:

// to set with a value between 0 and 1, 1 giving a straight ray.
ImmersiveControls.pointerForce = 0.5;

// ImmersiveControls.update in the loop would draw the arc or the ray in the passed scene against the targets.
ImmersiveControls.castPointer( scene, scene.children );

controls.addEventListener( 'click', () => {

    // tells ImmersiveControls.update to stop rendering the pointer
    ImmersiveControls.removePointer();

});

Now there is a question about the origin of the pointer (ray or arc): what if there is no controller nor hand? Then it should originate from the camera, but would that make sense? Maybe in this case the pointer should be faded out near the camera, and/or originate from a point slightly to the side of the camera? If so, on which side?
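To make the question concrete, the fallback policy could be written out. This is a hypothetical sketch with plain objects: the property names mirror XRInputSource ( targetRaySpace, hand, handedness ), but choosePointerOrigin and its priority order are just an illustration, not any existing API:

```javascript
// Hypothetical sketch: pick a pointer origin from the available input
// sources, falling back to a head/gaze pointer when nothing else exists.
// The priority order (controller > hand > head) is an assumption.
function choosePointerOrigin( inputSources ) {

    const controller = inputSources.find( ( s ) => s.targetRaySpace && ! s.hand );
    if ( controller ) return { origin: 'controller', handedness: controller.handedness };

    const hand = inputSources.find( ( s ) => s.hand );
    if ( hand ) return { origin: 'hand', handedness: hand.handedness };

    // No controller or hand: originate the pointer at (or near) the camera.
    return { origin: 'head', handedness: null };

}
```

The 'head' branch is exactly where the fading and side-offset questions above would kick in.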

felixmariotto commented 3 years ago

I've made a basic live demo here based on the new octree example, which is a great playground for testing immersive locomotion.

I'm working on it on this branch of my fork.

At the moment it's pretty rough and it only supports setting a direction with the joystick(s) and the WASD/arrow keys. In the case of the example it's used for locomotion. At the moment the module usage looks like this:

// will remove this part to only pass the camera
playerSpace = new THREE.Group();
scene.add( playerSpace );
playerSpace.add( camera );

controls = new ImmersiveControls( playerSpace, renderer );

controls.addEventListener( 'keydown', (e) => {

    /*
    e: {
        type: 'keydown', // event name
        handedness: 'right', // from XRInputSource.handedness. null if keyboard
        inputProfile: 'controller' // can be 'keyboard' or 'hand'
    }
    */

});

controls.addEventListener( 'keyup', (e) => {
    // e: same as above
});

controls.addEventListener( 'directionchange', (e) => {
    // e: same as above
});

controls.direction // normalized Vector2 updated by WASD/arrows keys and controller(s) joystick(s)
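The keys-to-direction mapping behind controls.direction could be sketched like this. keysToDirection is a hypothetical helper, not the branch's actual code; forward is mapped to -y on the assumption that it mirrors three.js's -z forward convention:

```javascript
// Sketch: derive a normalized 2D direction from the currently held keys.
// Forward (KeyW) maps to -y, mirroring the camera looking down -z.
// Hypothetical helper for illustration only.
function keysToDirection( keys ) {

    let x = ( keys.has( 'KeyD' ) ? 1 : 0 ) - ( keys.has( 'KeyA' ) ? 1 : 0 );
    let y = ( keys.has( 'KeyS' ) ? 1 : 0 ) - ( keys.has( 'KeyW' ) ? 1 : 0 );

    // Normalize so diagonal movement isn't faster than straight movement.
    const length = Math.hypot( x, y );
    if ( length > 0 ) { x /= length; y /= length; }

    return { x, y };

}
```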

Locomotion for hand-tracking is actually a tough question... From what I can see, the consensus solution is to only offer "teleport" locomotion with hand tracking, or there is this "telepath" idea, which is interesting as well.

@mrdoob are you interested in this module for the lib, or is it out of scope and should I work on it in a separate repo?

gkjohnson commented 3 years ago

I've made a basic live demo here based on the new octree example, which is a great playground for testing immersive locomotion.

Looks great! But joystick movement still makes me queasy... I've got to improve my tolerance!

Locomotion for hand-tracking is actually a tough question...

Creating a general class for gaze, hands, and controller movement and interaction seems like a difficult task. How common are gaze-only VR headsets right now, anyway? Where's WebXRStats.com when you need it? 😁 I suppose AR phone use cases might behave as a gaze cursor but I haven't tried it.

Regarding an API I was thinking something that was more scoped and composable like the OrbitControls, etc classes to start. Here's what I was imagining when I was considering writing a teleport example:

function init() {

    // ...

    const playspace = new Group();
    playspace.add( camera );

    const controller = renderer.xr.getController( 0 );

    // constructor takes controller, playspace to teleport, and the
    // scene to raycast against
    controls = new XRTeleportControls( controller, playspace, castScene );
    scene.add( controls.arc );

}

function render() {

    // ...

    controls.update();

    renderer.render( scene, camera );

}

Then down the line some controller instance that represents the last used controller or a mock controller driven by mouse and keyboard could be passed in. It might also be worth noting that until #21002 is dealt with the controller position used to raycast will be one frame delayed. And a bit related but there's PR #20790 that aimed to add move events to the XR controllers, as well.
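The mock-controller idea could be as small as an object with the same event surface as an XR controller, driven by mouse input. MockController is a made-up name for illustration; only the selectstart / selectend event types come from WebXR:

```javascript
// Sketch of a mock controller: it dispatches the same 'selectstart' /
// 'selectend' events an XR controller would, but driven by mouse input,
// so controls classes can be developed and tested without a headset.
// MockController is hypothetical, not an existing three.js class.
class MockController extends EventTarget {

    handleMouseDown() { this.dispatchEvent( new Event( 'selectstart' ) ); }

    handleMouseUp() { this.dispatchEvent( new Event( 'selectend' ) ); }

}
```

In a page you would wire handleMouseDown / handleMouseUp to 'mousedown' / 'mouseup' on the renderer's canvas, and pass the instance wherever a controller is expected.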

From what I can see, the consensus solution is to only offer "teleport" locomotion with hand tracking, or there is this "telepath" idea, which is interesting as well.

Do you mean teleport locomotion is only offered with controller tracking? I actually haven't used hand tracking with VR myself, yet, but locomotion does seem like a difficult problem. I haven't seen anyone tackle that yet outside of gazing and tapping a menu. Telepath looks interesting, though!

felixmariotto commented 3 years ago

Creating a general class for gaze, hands, and controller movement and interaction seems like a difficult task. How common are gaze-only VR headsets right now, anyway? Where's WebXRStats.com when you need it? 😁 I suppose AR phone use cases might behave as a gaze cursor but I haven't tried it.

Yeah, to be honest I never use gaze controls; I have to figure out how to do this with my Quest. I have a friend who has a Samsung 3DoF HMD; maybe I will borrow it from him for real-life use case tests.

Regarding an API I was thinking something that was more scoped and composable like the OrbitControls, etc classes to start. Here's what I was imagining when I was considering writing a teleport example:

Then down the line some controller instance that represents the last used controller or a mock controller driven by mouse and keyboard could be passed in.

I'm all for it, but the THREE.Group returned by renderer.xr.getController is very incomplete: to my knowledge it doesn't hold any reference to the controller's handedness, its inputSource, or most importantly the associated Gamepad.

At the moment ImmersiveControls is doing this:

let input = {
    inputGroup: renderer.xr.getController( 0 )
};

xrInputs.push( input );

input.inputGroup.addEventListener( 'connected', (e) => {

    // https://www.w3.org/TR/webxr/#xrinputsource-interface
    input.inputSource = e.data;

    input.gamepad = e.data.gamepad;

    input.hand = e.data.hand;

    input.handedness = e.data.handedness;

} );

It grabs the XRInputSource and the main information it needs for rich support, including gamepad button and joystick usage, transformation of the controller in hand control, handedness...

So users would have to provide their own inputSource(s) when instantiating ImmersiveControls, and it would look like this:

const controller = renderer.xr.getController(0);

let controls;

controller.addEventListener( 'connected', (e) => {

    controls = new ImmersiveControls( e.data, camera, renderer... );

} );

It's a bit complex, especially if you want to pass an array of inputSources instead of just one, and using a custom inputSource is a corner case in my opinion... So I'd rather offer it as an option like this:

const controls = new ImmersiveControls( camera, renderer... );

// optional:
controls.setInputSources( [ inputSource1, inputSource2 ] ); // the array can contain a single element

To control the preferred controller, or if in your game the right and the left controllers don't serve the same purpose, you can easily filter with the handedness event property ( which is derived from XRInputSource.handedness ):

let preferred;

controls.addEventListener( 'click', (e) => {

    if ( !e.handedness || e.handedness === preferred ) {

        // do the preferred controller thing

    }

    preferred = e.handedness;

});

gkjohnson commented 3 years ago

to my knowledge it doesn't hold any reference to the controller's handedness, inputSource and most importantly the associated Gamepad.

Oh I see, you're right. In that case, there was a question in issue #19418 about providing a general Gamepad API wrapper, which would be useful for this case as well, but that didn't get a lot of follow-up either.

I suppose in the case of something like XRTeleportControls the Gamepad API wouldn't necessarily be needed in the simple case, because you could just use the squeeze event.
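That simple case could look like the following. bindTeleport and its two callbacks are hypothetical names, but squeezestart / squeezeend are the events three.js XR controllers actually dispatch:

```javascript
// Sketch: drive teleport from the standard XR squeeze events alone, with
// no gamepad polling. `controller` is any EventTarget dispatching
// 'squeezestart' / 'squeezeend' (as three.js XR controllers do).
function bindTeleport( controller, onAimStart, onCommit ) {

    controller.addEventListener( 'squeezestart', onAimStart ); // start drawing the arc
    controller.addEventListener( 'squeezeend', onCommit );     // teleport to the arc's end

}
```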

gkjohnson commented 3 years ago

Here's my take on a utility for this: I dusted off some code from another project that wrapped the Gamepad API to dispatch button presses as events, and updated the API a bit so it's easier to use standalone. I also built a couple of XR gamepad objects that can be individually instantiated; button and axis events are bubbled up through the XR controller surrogate, which also automatically loads the controller model:

let controller;

function init() {

    // ...

    controller = new XRGamepad( renderer.xr, 0 );
    controller.addEventListener( 'pressed', () => { ... } );
    controller.addEventListener( 'axis-pressed', () => { ... } );

}

function render() {

    controller.update();
    renderer.render( scene, camera );

}

With this type of setup you can use the gamepad wrapper for both XR controllers and traditional game controllers.
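The core of such a wrapper is just edge detection on polled button state. Here is a minimal sketch; GamepadWrapper and its event shape are hypothetical simplifications of the linked code:

```javascript
// Minimal sketch of a polling gamepad wrapper: compare button state on
// each update() and fire 'pressed' / 'released' listeners on changes.
// Works with any object shaped like the Gamepad API's `buttons` array.
class GamepadWrapper {

    constructor( gamepad ) {

        this.gamepad = gamepad;
        this.pressed = gamepad.buttons.map( ( b ) => b.pressed );
        this.listeners = { pressed: [], released: [] };

    }

    addEventListener( type, callback ) { this.listeners[ type ].push( callback ); }

    update() {

        this.gamepad.buttons.forEach( ( button, index ) => {

            if ( button.pressed !== this.pressed[ index ] ) {

                this.pressed[ index ] = button.pressed;
                const type = button.pressed ? 'pressed' : 'released';
                this.listeners[ type ].forEach( ( callback ) => callback( { type, index } ) );

            }

        } );

    }

}
```

Calling update() once per frame from the render loop is what turns the polled Gamepad API into an event-driven one.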

The code for the gamepad wrapper is here. I also built a teleport controls class based on the Oculus "Tall Cast" approach here, which relies on the original XR select event but can also use the joystick to rotate the scene in 45-degree increments. Not sure if it matters, but I'm using a Quest to test all this!
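The 45-degree snap turn boils down to quantizing a joystick flick. A rough sketch, where snapTurnIncrement, the threshold, and the sign convention are all assumptions rather than the linked implementation:

```javascript
// Sketch: map a joystick x-axis value to a count of 45-degree turn
// increments. A real implementation would also latch until the stick
// returns to the dead zone, so one flick yields exactly one turn.
function snapTurnIncrement( axisX, threshold = 0.8 ) {

    if ( axisX > threshold ) return 1;    // flick right: one 45-degree turn right
    if ( axisX < - threshold ) return -1; // flick left: one 45-degree turn left
    return 0;                             // inside the dead zone: no turn

}
```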

felixmariotto commented 3 years ago

I think you put your finger right on the problem: we need an XRGamepad class to expose the XRInputSource object and offer a consistent "gamepad + XRInputSource" API over events and state.

The idea of a ready-made ImmersiveControls module on the model of OrbitControls is actually above this level of abstraction; it would be best if it used a hypothetical XRGamepad class internally. I realized this recently while making my experiments, because I had to write kind of an internal sub-class exactly for this purpose. It makes the controls module very fat and a bit out of scope.

This XRGamepad would be helpful for those who want to code their own controls. And if at some point three.js introduces several distinct XR controls modules, they would reuse the same "gamepad + XRInputSource" code, which is better. Don't you think we could just as well flesh out WebXRController and expose it in WebXRManager? Instead of returning a Group, renderer.xr.getController would return this class, which would of course make the ray and grip spaces accessible.

I tried your demo, that's awesome! I think the arc looks a bit flat though, although that's a matter of taste. I also implemented an arc in my demo on Wednesday, on the model of Shadow Point, did you have a chance to try it? Your method is better for targeting platforms above head level though: mine makes a bezier curve with a middle handle at a relative fixed point in front of the controller and an end point always below, so anything above controller height + mid-handle distance is unreachable.
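For reference, the arc described above is plain quadratic bezier math. A sketch with bare vector objects (three.js's QuadraticBezierCurve3 wraps the same formula):

```javascript
// Quadratic bezier point: B(t) = (1-t)^2 * p0 + 2(1-t)t * p1 + t^2 * p2.
// p0 = controller position, p1 = fixed handle in front of the controller,
// p2 = end point below, as described above. Plain objects for clarity.
function quadraticBezier( p0, p1, p2, t ) {

    const u = 1 - t;
    return {
        x: u * u * p0.x + 2 * u * t * p1.x + t * t * p2.x,
        y: u * u * p0.y + 2 * u * t * p1.y + t * t * p2.y,
        z: u * u * p0.z + 2 * u * t * p1.z + t * t * p2.z,
    };

}
```

Sampling t from 0 to 1 gives the polyline to render as the arc; the "unreachable height" limit follows directly from p1's fixed height.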

I hope our efforts will lead to something that can be merged and used to simplify XR development with three.js.

gkjohnson commented 3 years ago

Don't you think we could just as well flesh out WebXRController and expose it in WebXRManager?

Definitely, but I'm not sure what @mrdoob considers the proper scope for gamepad button press events going in core. If there's interest, I'm happy to contribute the gamepad event wrapper code as an example or into core for XR controllers and game development (which would address #19418 as well).

I tried your demo, that's awesome! I think the arc looks a bit flat though, although that's a matter of taste. I also implemented an arc in my demo on Wednesday, on the model of Shadow Point, did you have a chance to try it?

Thanks! I noticed the arcs feel a bit flat too, but that seems to be the Oculus model. I do like the feel of the arcs in your demo, though. I think there may be a mix of the two approaches that could yield more satisfying arcs as well as the ability to jump to tall platforms.

SrinivasPrabhu794 commented 3 years ago

Hey guys,

Is this official npm package for motion controllers of any use? It was written by the same team that maintains the WebXR standards.

Please take a look: https://www.npmjs.com/package/@webxr-input-profiles/motion-controllers

gkjohnson commented 3 years ago

Is this official npm package for motion controllers of any use? It was written by the same team that maintains the WebXR standards.

This package is already directly included in the repository and used here for instantiating controller models and visualizing button state:

https://github.com/mrdoob/three.js/blob/dev/examples/jsm/webxr/XRControllerModelFactory.js

SrinivasPrabhu794 commented 3 years ago

How do we get to know the state of each button ( say, for each controller on an Oculus Rift ) in three.js? Thumbstick movement, for example?

felixmariotto commented 3 years ago

At the moment you can get the Gamepad API object like this:

const controller = renderer.xr.getController( 0 );

controller.addEventListener( 'connected', (e) => {

    controller.gamepad = e.data.gamepad;

} );
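Once that reference is stored, button and axis state can be polled each frame. With the WebXR 'xr-standard' gamepad mapping the thumbstick axes sit at indices 2 and 3 (indices 0 and 1 are reserved for a touchpad); the helper below is illustrative, not part of three.js:

```javascript
// Read the thumbstick from an xr-standard mapped gamepad: axes[2] is x,
// axes[3] is y (axes[0]/axes[1] belong to the touchpad, if any).
// Illustrative helper; works on any object with an `axes` array.
function readThumbstick( gamepad ) {

    const [ , , x = 0, y = 0 ] = gamepad.axes;
    return { x, y };

}
```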

It's not ideal though, we really need this ImmersiveControls...

mrdoob commented 3 years ago

@felixmariotto Sorry for the delay.

It's hard for me to know what the design for this should be without doing experiments...

So that's what I'm planning to do for the next few weeks 🤓

DePasqualeOrg commented 2 years ago

I created these basic immersive controls for my VR projects with Three.js. You can try some examples here.

paulmasson commented 2 years ago

@mrdoob any recent thoughts on this issue?

DevPika commented 1 year ago

I have attempted to create a flexible XRInputEventsDispatcher class that uses the Motion Controllers library, similar to XRControllerModelFactory. The discussion here was really helpful; more feedback is welcome! Some considerations on the approach I have taken: