Odie opened this issue 6 years ago
https://github.com/lfrazer/SKSE-VRInputPlugin
This makes use of PapyrusVR and introduces shim classes for the OpenVR API classes. This may open the possibility of completely stopping the game from receiving controller events.
Progress is slow but it's being made.
As of https://github.com/Odie/SKSE-VRInputPlugin/commit/73bf1b1d7d1dcabacf1ed28394075ddd5bf835d0, we can intercept button events and funnel them directly into scaleform/actionscript.
The next steps:
- [x] VR input abstraction layer: turns the raw data into something slightly higher level so it becomes easier to reason about.
- [ ] VR input mapping layer: provides the mapping from abstracted button signals => requested actions.
- [ ] Input command dispatch: lets the UI actually dispatch the requested command to the game for processing.
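The three layers above could be sketched roughly like this. All names here are hypothetical; none of these types exist in the plugin yet, they only illustrate the intended raw-signal => action => dispatch flow:

```cpp
#include <functional>
#include <vector>

// 1) Abstraction layer: raw OpenVR state distilled into higher-level signals.
enum class Signal { TriggerPulled, GripPressed, TouchpadClicked };

// 2) Mapping layer: abstracted button signal => requested UI action.
enum class Action { Select, Back, ToggleSort };

struct InputMapper {
    Action map(Signal s) const {
        switch (s) {
            case Signal::TriggerPulled:   return Action::Select;
            case Signal::GripPressed:     return Action::Back;
            case Signal::TouchpadClicked: return Action::ToggleSort;
        }
        return Action::Select;  // unreachable; keeps the compiler happy
    }
};

// 3) Dispatch layer: the UI decides when to forward an action to the game.
struct Dispatcher {
    std::vector<Action> issued;  // stand-in for "send command to the game"
    void dispatch(Action a) { issued.push_back(a); }
};
```

The point of the split is that only layer 1 touches OpenVR structs; layers 2 and 3 can be unit-tested and remapped without any VR hardware present.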
Notes after experimenting with the Vive controller...

Format: <flag-name>.<bit-num> => meaning

- pressed.1 => menu button pressed
- pressed.2 => grip button pressed
- pressed.32 => touchpad clicked
- touched.32 => touchpad clicked. Touches alone do not actually flip this bit. =(
- pressed.33 => trigger pulled beyond 0.25
- touched.33 => trigger pulled beyond 0.25
- axis.0 => touchpad, a vec2 with magnitude in [0, 1.0]
  - center (0, 0), right edge (1.0, 0), left edge (-1.0, 0), top edge (0, 1.0), bottom edge (0, -1.0)
- axis.1 => trigger, in [0, 1.0]
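These bit numbers line up with OpenVR's `EVRButtonId` values (ApplicationMenu = 1, Grip = 2, Axis0/touchpad = 32, Axis1/trigger = 33), so decoding a controller state is just bit masking. A minimal self-contained sketch; the constants mirror `openvr.h`, and `ControllerState` is a local stand-in for `vr::VRControllerState_t`:

```cpp
#include <cstdint>

// Values mirror vr::EVRButtonId in openvr.h; they match the bit numbers
// observed in the notes above.
constexpr uint64_t kButtonMenu     = 1;   // k_EButton_ApplicationMenu
constexpr uint64_t kButtonGrip     = 2;   // k_EButton_Grip
constexpr uint64_t kButtonTouchpad = 32;  // k_EButton_SteamVR_Touchpad (Axis0)
constexpr uint64_t kButtonTrigger  = 33;  // k_EButton_SteamVR_Trigger (Axis1)

// Same computation as vr::ButtonMaskFromId().
constexpr uint64_t ButtonMask(uint64_t id) { return 1ull << id; }

// Local stand-in for vr::VRControllerState_t.
struct ControllerState {
    uint64_t ulButtonPressed = 0;
    uint64_t ulButtonTouched = 0;
    struct { float x = 0, y = 0; } rAxis[5];  // rAxis[0] = touchpad, rAxis[1] = trigger
};

bool IsPressed(const ControllerState& s, uint64_t id) {
    return (s.ulButtonPressed & ButtonMask(id)) != 0;
}

bool IsTouched(const ControllerState& s, uint64_t id) {
    return (s.ulButtonTouched & ButtonMask(id)) != 0;
}
```

With the real API, the state would come from `IVRSystem::GetControllerState()` each frame instead of being constructed by hand.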
It would be great to be able to get better, more comprehensive controller event information into SkyUI. At the moment, the UI mostly gets only up/down/right/left events, which are mapped to either thumbstick input or trackpad swipes. It feels like this can be greatly improved by either:
Letting the UI have access to raw controller input
There are many buttons (and trackpads) on the VR controllers. If the UI can differentiate between each, at the very least, more distinct actions will be available to the user. Currently, we make do with a somewhat unconventional control scheme to deal with switching tabs and toggling column sort states.
It appears PapyrusVR is already quite able to read input events directly from OpenVR.
Pushing that data to Scaleform may be fairly straightforward. The Scaleform API appears to be exposed via SKSE; in particular, ScaleformMovie::Invoke() can be used to call ActionScript functions. Relevant Scaleform documentation can be found here.
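A hedged sketch of what funneling a button event into ActionScript via Invoke() might look like. The `GFxValue`/`GFxMovieView` structs below are minimal local stand-ins that only mirror the call shape of the SKSE types of the same name, and `onControllerButton` is a hypothetical handler that would need to exist in the SWF:

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Local stand-in for SKSE's GFxValue (argument container for Invoke).
struct GFxValue {
    double number  = 0;
    bool   boolean = false;
    void SetNumber(double v) { number = v; }
    void SetBool(bool v)     { boolean = v; }
};

// Local stand-in mirroring the shape of SKSE's GFxMovieView::Invoke.
struct GFxMovieView {
    std::vector<std::string> calls;  // recorded here purely for illustration
    bool Invoke(const char* method, GFxValue* /*result*/,
                GFxValue* /*args*/, uint32_t /*numArgs*/) {
        calls.push_back(method);
        return true;
    }
};

// Push one intercepted button event into the UI.
// "_root.onControllerButton" is a hypothetical ActionScript function name.
void PushButtonEvent(GFxMovieView* movie, int buttonId, bool pressed) {
    GFxValue args[2];
    args[0].SetNumber(buttonId);
    args[1].SetBool(pressed);
    movie->Invoke("_root.onControllerButton", nullptr, args, 2);
}
```

In the real plugin, the movie pointer would come from the game's menu system rather than being constructed locally.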
Allowing direct manipulation of the inventory list
If the UI had access to the controller coordinates, it might be possible to use the controller to directly manipulate inventory lists. This would likely make the UI much easier to use in VR than just having access to more buttons. It's not clear how to get hold of the rectangle of the UI in 3D space at the moment, so this is not yet feasible.
Lastly, both of these improvements depend on actually being able to stop the game from directly handling certain button events. For example, the "favorite" action appears to be hard-coded to respond to clicks on the right side of the trackpad on either Vive controller. We need a way to stop the game from handling this directly and let the UI decide when to issue such a command.