Neos-Metaverse / NeosPublic

A public issue/wiki only repository for the NeosVR project

Allow eye tracking to be used for lasers #1942

Open · InsaneGrox opened this issue 3 years ago

InsaneGrox commented 3 years ago

Is your feature request related to a problem? Please describe.

Not really... it's more something I thought would be neat...

Relevant issues

I didn't find any... in fact, I think I may be the first to think of this idea...

Describe the solution you'd like

Add a setting for the Vive Pro Eye's eye tracking to be used as a cursor, similar to the desktop mode mouse cursor... to interact with menus and stuff... and possibly even be used for tooltips.

Describe alternatives you've considered

Again... not really?

Additional context

I just got a Vive Pro Eye and re-parented my tooltip anchor from my right hand to my right eye, and had a lot of fun with it... the eye tracking is definitely accurate enough that this would be usable... and could be useful if I forget my controllers or don't want to move my hands to do things...

shiftyscales commented 3 years ago

I don't think this will be particularly practical or useful until Neos becomes more friendly for controller-free use, e.g. once issues you've filed such as #1456 / #1457 are added.

This doesn't really make much sense to add in isolation, particularly because, as you've already noted, it can already be done for just goofing around.

Doing this properly would require developing an input scheme to support it and providing access to the actions required to make it usable (e.g. primary, secondary, grab, etc.).

This could also potentially be developed by alternative means.

As an example, it could be done in a more generic way by using arbitrary input data via LogiX to control the input system and activate primary, secondary, and other actions.

This would allow users to use any arbitrary data as an input source for basic input binding actions, e.g. if a user had a facial tracker in combination with the eye tracker.
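
To make that idea concrete, here's a rough sketch in Python (not LogiX, which is a visual language, and using entirely made-up names rather than any real Neos API) of what mapping arbitrary data streams onto basic input binding actions might look like:

```python
# Hypothetical sketch only: none of these names correspond to actual
# Neos/LogiX APIs. It just illustrates the data flow: arbitrary named
# data streams are mapped to input actions by simple conditions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Binding:
    source: str                         # name of the arbitrary data stream
    condition: Callable[[float], bool]  # when does this stream "fire"?
    action: str                         # input action to activate

bindings = [
    # Gaze dwell as "primary": fire once dwell time exceeds half a second.
    Binding("eye.dwell_seconds", lambda v: v > 0.5, "primary"),
    # A facial tracker value (e.g. jaw open) as "secondary".
    Binding("face.jaw_open", lambda v: v > 0.8, "secondary"),
]

def process(streams: dict[str, float], activate: Callable[[str], None]) -> None:
    """Check each binding against the latest stream values and
    activate the mapped input action when its condition holds."""
    for b in bindings:
        value = streams.get(b.source)
        if value is not None and b.condition(value):
            activate(b.action)

# Example: one frame's worth of stream values triggers "primary".
process({"eye.dwell_seconds": 0.7}, activate=lambda a: print("activate", a))
```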

This could enable a user to develop a control scheme using arbitrary data generated in-app (and by extension, potentially via websockets) instead of relying solely on supported hardware/devices/input schemes, e.g. keyboard/mouse/gamepad/VR controllers. (This in itself would require some security considerations when deciding how to designate which arbitrary data streams can be used for inputs, and how to prevent them from being tampered with or modified.)
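
Purely as a hypothetical illustration of that last point, here is a minimal Python sketch of receiving arbitrary input data over a websocket, with a naive shared-token check standing in for the security considerations mentioned. It assumes a recent version of the third-party `websockets` package; the token handshake and JSON shape are illustrative assumptions, not anything Neos specifies:

```python
# Hypothetical sketch: a websocket server that accepts arbitrary input
# samples from an authorized client and would feed them into binding
# logic like the one sketched above.
import asyncio
import json

import websockets

EXPECTED_TOKEN = "shared-secret"  # placeholder for whatever auth scheme is chosen

async def handler(ws):
    # Reject clients that can't present the shared token, so arbitrary
    # peers can't inject or tamper with input data streams.
    if await ws.recv() != EXPECTED_TOKEN:
        await ws.close()
        return
    async for message in ws:
        sample = json.loads(message)  # e.g. {"eye.dwell_seconds": 0.6}
        print("received input sample:", sample)  # feed into binding logic here

async def main():
    async with websockets.serve(handler, "localhost", 8765):
        await asyncio.Future()  # serve until cancelled

asyncio.run(main())
```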