GodotVR / godot-xr-tools

Support scenes for AR and VR in Godot
MIT License

Feature Request - Hand-tracking and hand-pointer interface as functional as the original Quest interface #124

Open goatchurchprime opened 2 years ago

goatchurchprime commented 2 years ago

Hand-tracking is currently very tightly bound to a particular Valve hand model in the OpenXR library, which maps across to a skeleton shape without any easy access to the transforms. Once we can intercept these transforms we can do gesture detection, networking and hand fading.
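For illustration only, a gesture test could then be something as simple as measuring fingertip distances in skeleton space; the bone indices, node path and threshold below are assumptions, not anything provided by the plugin:

const THUMB_TIP = 5   # assumed bone index of the thumb tip
const INDEX_TIP = 9   # assumed bone index of the index fingertip
const PINCH_DISTANCE = 0.02  # metres

onready var handskeleton = $HandModel/Armature/Skeleton

func is_index_pinching():
    # compare fingertip positions in skeleton space
    var thumb = handskeleton.get_bone_global_pose(THUMB_TIP).origin
    var index = handskeleton.get_bone_global_pose(INDEX_TIP).origin
    return thumb.distance_to(index) < PINCH_DISTANCE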

Previously on the Oculus Quest the interface could be run with the following code in the _process() function:

# Map from the OVR hand tracking bone order to the hand model's skeleton bones
const hand_bone_mappings = [0, 23,  1, 2, 3, 4,  6, 7, 8,  10, 11, 12,  14, 15, 16, 18, 19, 20, 21]
var bone_orientations = []  # Array of Quats, filled in by get_hand_pose()
bone_orientations.resize(24)

var tracking_confidence = ovr_hand_tracking.get_hand_pose(controller_id, bone_orientations)
for i in range(hand_bone_mappings.size()):
    handskeleton.set_bone_pose(hand_bone_mappings[i], Transform(bone_orientations[i]))

There was also a hand-tracking pointer that operated from within the ARVR origin space, because it was based on some combination of the direction of your hand and your arm to keep it stable. (The direction of a single finger is too noisy to use.)

var pointervalid = (handstate == HS_HAND) and ovr_hand_tracking.is_pointer_pose_valid(controller_id)
var pointerposearvrorigin = ovr_hand_tracking.get_pointer_pose(controller_id)
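If those calls are unavailable, a rough stand-in (purely a sketch; the node references and the 0.7 blend weight are assumptions) could blend the wrist-to-palm direction with the palm's forward axis inside the ARVR origin space:

func get_pointer_transform(wrist, palm):
    # blend the arm direction with the palm's forward axis for stability
    var arm_dir = (palm.translation - wrist.translation).normalized()
    var palm_dir = -palm.transform.basis.z
    var pointer_dir = arm_dir.linear_interpolate(palm_dir, 0.7).normalized()
    # build a transform at the palm looking along the blended direction
    return Transform(Basis(), palm.translation).looking_at(palm.translation + pointer_dir, Vector3.UP)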

Oh, and also the Quest operating system had a couple other functions, like

handcontroller.is_button_pressed(HT_PINCH_MIDDLE_FINGER)
handcontroller.is_button_pressed(HT_PINCH_INDEX_FINGER)

I think these also appeared as get_joystick_axis(finger_number) values, so you could tell how close the pinch was, which made that pleasing eye-dropper effect in the home area possible.
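A minimal sketch of recreating that behaviour, assuming the pinch strength ends up on a controller axis (the axis index and thresholds below are made up), would be a two-threshold hysteresis around the raw value:

signal pinch_started
signal pinch_ended

const PINCH_AXIS = 2          # assumed axis carrying the index pinch strength
const PRESS_THRESHOLD = 0.8
const RELEASE_THRESHOLD = 0.5

var pinch_pressed = false

func update_pinch(handcontroller):
    var strength = handcontroller.get_joystick_axis(PINCH_AXIS)
    if not pinch_pressed and strength > PRESS_THRESHOLD:
        pinch_pressed = true
        emit_signal("pinch_started")
    elif pinch_pressed and strength < RELEASE_THRESHOLD:
        pinch_pressed = false
        emit_signal("pinch_ended")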

It looks as though get_pointer_pose() and get_joystick_axis(finger_number) are missing from the OpenXR standard, in which case we should work to recreate them in this plugin, because there will be a lot of delicate fine tuning, debouncing and filtering needed to get back to a good UI-controlling experience that we could all use.
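As an example of the kind of filtering involved (a sketch only; the smoothing rate is an arbitrary tuning value, not anything defined by OpenXR or this plugin), an exponential low-pass on the pointer transform would look something like:

var filtered_pointer = Transform()

func filter_pointer(raw, delta):
    # exponential low-pass on both position and orientation
    var t = clamp(delta * 10.0, 0.0, 1.0)
    filtered_pointer.origin = filtered_pointer.origin.linear_interpolate(raw.origin, t)
    var q = Quat(filtered_pointer.basis).slerp(Quat(raw.basis), t)
    filtered_pointer.basis = Basis(q)
    return filtered_pointer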

BastiaanOlij commented 2 years ago

Indeed, OpenXR does not support any of these; the idea being that gestures are recognized by the XR runtime and exposed as different inputs that can be mapped to actions.

Meta went down a different route and introduced their own API to reintroduce the logic from VrAPI; m4gr3d implemented that a little while ago. When hand tracking is enabled you can add two new ARVRController nodes and set their controller ids to 3 and 4. These now expose the pinch values and other data through various axes. Have a look at his PRs for details, I think he also added some logic to the demo.
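Something along these lines, as a sketch only (which id is left and which is right, and the axis index used for the pinch, are placeholders that depend on m4gr3d's implementation); the script is assumed to sit on the ARVROrigin node:

var left_hand
var right_hand

func _ready():
    # hand tracking controllers appear as two extra ARVRController nodes
    left_hand = ARVRController.new()
    left_hand.controller_id = 3
    add_child(left_hand)
    right_hand = ARVRController.new()
    right_hand.controller_id = 4
    add_child(right_hand)

func _process(_delta):
    if left_hand.get_is_active():
        var index_pinch = left_hand.get_joystick_axis(0)  # placeholder axis index
        print("left index pinch strength: ", index_pinch)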

Finally, the skeleton works fine on Quest and any other platform; the problem is that the OpenXR spec isn't strict enough, so the skeletons of the different platforms do not match. A mesh designed for SteamVR won't work on Oculus and vice versa, but the skeleton is there, and you can just read out the transforms from the bone poses, or use a BoneAttachment node.
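For example (the node path and bone name here are assumptions and will differ between runtimes, which is exactly the mismatch described above):

onready var handskeleton = $RightHand/HandModel/Armature/Skeleton

func get_index_tip_transform():
    var bone_idx = handskeleton.find_bone("Index_Tip_R")  # name varies per runtime
    if bone_idx == -1:
        return Transform()
    # bone poses are in skeleton space; convert to world space
    return handskeleton.global_transform * handskeleton.get_bone_global_pose(bone_idx)

The BoneAttachment route is the same idea without the script: add a BoneAttachment child to the Skeleton, set its bone_name, and parent whatever should follow the fingertip under it.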

BastiaanOlij commented 10 months ago

We're still waiting on OpenXR to finalise things, though a number of extensions have been added that improve matters, and raw access to the information is coming soon.

The problem we're still having is the difference in information between platforms (Valve, Meta, Pico, etc.), the need for some sort of mapping system, and the fact that WebXR implements this differently as well.
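A mapping system could start out as something as crude as a per-runtime bone-name table; all names below are invented for illustration and are not part of the plugin:

const BONE_MAP = {
    "meta": {"index_tip": "Index_Tip_R", "thumb_tip": "Thumb_Tip_R"},
    "valve": {"index_tip": "finger_index_r_end", "thumb_tip": "finger_thumb_r_end"},
}

func find_mapped_bone(skeleton, runtime, neutral_name):
    # translate a neutral bone name into whatever this runtime's skeleton uses
    if not BONE_MAP.has(runtime):
        return -1
    return skeleton.find_bone(BONE_MAP[runtime].get(neutral_name, ""))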

Since this was raised we did add our own hand solution, which is a good compromise that ensures platform compatibility.

To be continued...