Yellow-Dog-Man / Resonite-Issues

Issue repository for Resonite.
https://resonite.com

New Component: CanvasCursor #1497

Open 5H4D0W-X opened 6 months ago

5H4D0W-X commented 6 months ago

Is your feature request related to a problem? Please describe.

UIX can only be interacted with using touch sources or lasers. This makes it unsuitable for other interaction methods, such as Eye tracking or touchpad control (index knuckles) without heavy avatar modification or custom interaction setups.

Describe the solution you'd like

A new component, CanvasCursor, that takes several inputs, including a Position field, a PositioningMode enum, and a ClickHold boolean.

And a method, DoClick, that can be called using ProtoFlux.

The PositioningMode enum sets how the contents of the Position field are interpreted.

The ClickHold boolean input acts as a trigger that can be active for any duration of time, similar to the interaction laser. This can be used to operate any UIX element.

The DoClick method performs a full press-and-release in a single call. It serves as an alternative for single clicks that may be more reliable than toggling the ClickHold boolean on for just one frame.

While the component is disabled, the cursor does not influence the UI at all. While it is enabled, the cursor acts like a laser, triggering hover events on elements it's overlapping with.
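The behavior described above can be sketched roughly as follows. This is an illustrative pseudocode model only: the field names, the PositioningMode options, and the `ui` event interface are assumptions based on this feature request, not an actual FrooxEngine API.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class PositioningMode(Enum):
    # Hypothetical options for how the Position field is interpreted
    NORMALIZED = auto()  # 0..1 across the canvas
    PIXELS = auto()      # raw canvas pixel coordinates

@dataclass
class CanvasCursor:
    enabled: bool = True
    position: tuple = (0.0, 0.0)
    positioning_mode: PositioningMode = PositioningMode.NORMALIZED
    click_hold: bool = False
    _was_holding: bool = field(default=False, repr=False)

    def canvas_point(self, canvas_size):
        """Resolve the Position field into canvas pixel coordinates."""
        x, y = self.position
        if self.positioning_mode is PositioningMode.NORMALIZED:
            return (x * canvas_size[0], y * canvas_size[1])
        return (x, y)

    def update(self, ui, canvas_size):
        """Per-frame update: hover while enabled, press/release from ClickHold edges."""
        if not self.enabled:
            return  # a disabled cursor does not influence the UI at all
        point = self.canvas_point(canvas_size)
        ui.hover(point)  # acts like a laser: hover events on overlapped elements
        if self.click_hold and not self._was_holding:
            ui.press(point)
        elif not self.click_hold and self._was_holding:
            ui.release(point)
        self._was_holding = self.click_hold

    def do_click(self, ui, canvas_size):
        """Single-call click: press and release together, no frame-timing needed."""
        if not self.enabled:
            return
        point = self.canvas_point(canvas_size)
        ui.press(point)
        ui.release(point)
```

In this model, any input source (gaze, touchpad, foot pedal, virtual joystick) only needs to write the Position field and drive ClickHold or DoClick; no raycast is involved, which avoids the reliability issues at extreme scales mentioned below.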

Describe alternatives you've considered

As mentioned previously, interacting with UIX in novel ways would require heavy avatar modifications, such as the ability to reparent the interaction laser. Most of these methods would also rely on raycasts, which can be unreliable at extreme scales or at large distances from the world root.

Additional Context

No response

Requesters

ShadowX

shiftyscales commented 6 months ago

This would probably be something better implemented globally across the entire engine, using eye tracking/gaze as a cursor source for use with hand-tracking.

5H4D0W-X commented 6 months ago

The idea here is not specifically to support eye tracking; it's far more general than that. I would like to be able to make UI that can be interacted with by any input method, such as gaze tracking, touchpad input on controllers, custom devices, foot pedals, or virtual controls such as a joystick or mouse. I would love to see native eye tracking navigation too, though.