UCL-VR / ubiq


Keyboard and Mouse Controller #54

Closed sebjf closed 3 weeks ago

sebjf commented 2 months ago

Some thoughts about reintroducing the Desktop (keyboard & mouse) controller, which has been on the list since the XRIT upgrade...

Goals

Scope

Comparison with Device Simulator

The Component is not a replacement for the XR Device Simulator. Whereas the Device Simulator can emulate all typical XR Controller Inputs, the Desktop interface will be more limited. It will only support the minimum input required to engage the Interactables mentioned above (that is, trigger the Hover, Select and Activate events).

It is expected that applications intending to support the Desktop interface are implicitly developed with this in mind, by sticking to Interactable setups that are amenable to being driven with a keyboard and mouse, even though the Desktop controller itself should not require any active development effort or additional control flow to be configured.

Other applications will still need the XR Device Simulator to emulate more complex interactions, such as multi-interactor behaviour.
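As a rough illustration of the event surface the Desktop interface needs to drive, the sketch below (the component name is made up for illustration) simply subscribes to the three XRIT events mentioned above:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative only: logs the three interaction events the Desktop
// interface is expected to be able to trigger on an Interactable.
public class DesktopEventProbe : MonoBehaviour
{
    void Awake()
    {
        var interactable = GetComponent<XRBaseInteractable>();
        interactable.hoverEntered.AddListener(args => Debug.Log("Hover"));
        interactable.selectEntered.AddListener(args => Debug.Log("Select"));
        interactable.activated.AddListener(args => Debug.Log("Activate"));
    }
}
```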

Proposed User Story

If an XR device is active, it overrules the Desktop interface. However, if an XR Device is connected but inactive, the Desktop interface should function automatically without any configuration change (1).
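One way the "connected but inactive" case could be detected is by polling the headset's user-presence feature; the sketch below is an assumption about how this might look, not the shipped implementation:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: enable the desktop controls only while no head-mounted device
// reports that it is actually being worn. 'desktopControls' stands in for
// whatever component ends up implementing the Desktop interface.
public class DesktopFallbackSketch : MonoBehaviour
{
    public Behaviour desktopControls;

    void Update()
    {
        var headsets = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.HeadMounted, headsets);

        var hmdActive = false;
        foreach (var headset in headsets)
        {
            if (headset.TryGetFeatureValue(CommonUsages.userPresence, out var worn) && worn)
            {
                hmdActive = true;
            }
        }

        desktopControls.enabled = !hmdActive;
    }
}
```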

Users interact from a first-person perspective. Using WASD translates the viewpoint. Holding the Right Mouse Button while not over an Interactable allows rotating the Camera; otherwise the cursor behaves normally (2).
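A minimal sketch of that control scheme (using the legacy Input calls for brevity; the real controller would presumably go through the Input System, and the field names here are illustrative):

```csharp
using UnityEngine;

// Sketch: WASD translates the viewpoint; holding the right mouse button
// rotates the camera (the real controller would also check the cursor is
// not over an Interactable before clutching).
public class DesktopLocomotionSketch : MonoBehaviour
{
    public Transform rig;   // e.g. the XR Origin
    public Transform head;  // e.g. the main camera
    public float speed = 2f;
    public float sensitivity = 2f;

    void Update()
    {
        // WASD / arrow keys map to the default Horizontal/Vertical axes.
        var input = new Vector2(Input.GetAxisRaw("Horizontal"), Input.GetAxisRaw("Vertical"));
        var forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        var right = Vector3.ProjectOnPlane(head.right, Vector3.up).normalized;
        rig.position += (right * input.x + forward * input.y) * speed * Time.deltaTime;

        if (Input.GetMouseButton(1)) // clutch: only look around while RMB is held
        {
            head.Rotate(Vector3.up, Input.GetAxis("Mouse X") * sensitivity, Space.World);
            head.Rotate(Vector3.right, -Input.GetAxis("Mouse Y") * sensitivity, Space.Self);
        }
    }
}
```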

Users use the mouse both to look around and to Select or Grab items. Items are activated with the Left Mouse Button, and selected with the Right Mouse Button (CTRL + Mouse Button on Mac). Moving the cursor over an item should activate Hover. Ideally, items should respond to Hover themselves (e.g. with an XRInteractableAffordanceStateProvider), but we will also colour the reticle so that it responds to Interactable objects, in the way the current Ray Interactor does.
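A sketch of how the reticle colouring could be driven, assuming a simple raycast from the cursor (the reticle Image and component name are illustrative):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: tint a screen-space reticle whenever the cursor is over an
// Interactable, in the same spirit as the Ray Interactor's reticle.
public class CursorReticleSketch : MonoBehaviour
{
    public Camera viewCamera;
    public Image reticle;
    public Color hoverColour = Color.cyan;
    public float maxDistance = 10f;

    void Update()
    {
        var ray = viewCamera.ScreenPointToRay(Input.mousePosition);
        var overInteractable = Physics.Raycast(ray, out var hit, maxDistance)
            && hit.collider.GetComponentInParent<XRBaseInteractable>() != null;
        reticle.color = overInteractable ? hoverColour : Color.white;
    }
}
```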

When the Right Mouse Button is pressed over an Interactable, it is locked to that interaction and will not change the camera orientation until released.

If an Interactable is selected, as in the image below, we will show a spline between the cursor and the focal point, to emulate static or heavy items.

(Image: a spline drawn between the cursor and a selected Interactable.)

Users can interact with world-space UI Canvas elements by clicking normally, and text boxes by typing.

(1) This is so users can use the Desktop interface to prototype their scenes with headsets connected, instead of having to keep putting on and taking off a headset.

(2) The reason for masking the orientation change is so users can use the cursor to more easily interact with on-screen keyboards and other UI elements.

An alternative to clutching is for the cursor to rotate the camera by default, but stop moving when over a UX element. This might be quite nice in XR-focused apps, but could also become odd if the cursor ends up far away from the centre of the screen.

Implementation Options

Almost every interaction can be achieved with the XR Device Simulator's Left Controller and Grab or Trigger button emulation.

One possibility then is to create our own InputDevice subclasses and hook into the InputSystem in the same way the device simulator does. One problem with this is that the behaviour when multiple XR devices are added appears to be undefined. For example, we can easily take control of the hand from the device simulator with an XR controller, but not vice versa.
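For illustration, a device can be added and driven through the Input System in the same way the simulator does, e.g. reusing the simulator's own XRSimulatedController; the state field names below are assumptions and may differ between XRIT versions:

```csharp
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;
using UnityEngine.XR.Interaction.Toolkit.Inputs.Simulation;

// Sketch: add a simulated controller device and push state into it, which is
// the same mechanism the XR Device Simulator uses.
public static class SimulatedDeviceSketch
{
    public static XRSimulatedController AddLeftHand()
    {
        var device = InputSystem.AddDevice<XRSimulatedController>("Desktop Left Hand");
        InputSystem.SetDeviceUsage(device, CommonUsages.LeftHand);
        return device;
    }

    public static void SetTrigger(XRSimulatedController device, float value)
    {
        var state = new XRSimulatedControllerState { trigger = value, isTracked = true };
        InputState.Change(device, state);
    }
}
```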

Another possibility is to manually trigger the Input Action References on the XR Controller scripts. This will provide more control and also mean we can repurpose the existing hands.
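For example, a sketch of this approach could simply add a desktop binding to an existing hand's select action (assuming the action-based controller; the binding path is illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: reuse an existing hand by adding a desktop binding to its select
// action, so the right mouse button drives Select on that controller.
public class DesktopSelectBinding : MonoBehaviour
{
    public ActionBasedController handController;

    void Start()
    {
        var select = handController.selectAction.action;
        select.Disable();                          // bindings cannot change while enabled
        select.AddBinding("<Mouse>/rightButton");
        select.Enable();
    }
}
```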

Another possibility is to introduce a new GameObject in parallel to the hands, which maintains its own Interactor components, set up specifically for the desktop case. The advantage of this is that it would probably be easier to set up dedicated visuals and configurations more suitable for the Desktop interface. We would not have to worry about fighting over control of the Hand Controllers, because the XR Input Device can keep tracking the hands while the Desktop component has its own GameObject to deal with. For locomotion, the Desktop interface would translate the XR rig and orient the head by driving the Transforms directly. A sketch of the parallel interactor setup follows below.
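A rough sketch of that layout (component choices and hierarchy here are assumptions, not a decided design): a dedicated interactor lives on its own GameObject under the camera, so it never competes with the tracked hands.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: a dedicated interactor for the desktop case, parented under the
// camera so the cursor ray never competes with the tracked hand controllers.
public class DesktopInteractorSetup : MonoBehaviour
{
    public Camera viewCamera;

    void Start()
    {
        var go = new GameObject("Desktop Interactor");
        go.transform.SetParent(viewCamera.transform, false);

        // The controller's actions would be driven by the desktop input;
        // left unassigned here for brevity.
        go.AddComponent<ActionBasedController>();

        var interactor = go.AddComponent<XRRayInteractor>();
        interactor.rayOriginTransform = viewCamera.transform;
    }
}
```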

sebjf commented 3 weeks ago

In latest release: https://ubiq.online/blog/new-desktop-control-scheme/. Will integrate any feedback under another ticket.