GodotVR / godot_oculus_mobile

Godot Oculus mobile drivers (Oculus Go / Oculus Quest)
MIT License

Add use of pointer_pose into the demo #119

Closed goatchurchprime closed 3 years ago

goatchurchprime commented 3 years ago

I was almost at the point of hacking in my own pointer based on hand tracking, with filtering to account for the noise in the hand position.

But then it seems there is already this function in the API: https://developer.oculus.com/documentation/native/android/mobile-hand-tracking/

PointerPose

Deriving a stable pointing direction from a tracked hand is a non-trivial task involving filtering, gesture detection, and other factors. The hand tracking API provides the PointerPose field on the ovrInputStateHand structure so that pointing interactions can be consistent across apps.

The PointerPose field is an ovrPosef that indicates the starting point and position of the pointing ray in world space. It is recommended that developers use this field to determine the direction the user is pointing for the case of UI interactions.

And then it seems that it's already been pulled into the oculus mobile here: https://github.com/GodotVR/godot_oculus_mobile/pull/83

Use of this function should be included in the demo examples, because the demos are effectively the documentation for everything here. Otherwise people won't know it's there and will waste time implementing their own version, as I almost did.
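To make the suggestion concrete, here is roughly what a demo snippet could look like. This is only a sketch: the `OvrHandTracking.gdns` path, the `get_pointer_pose()` method name, and the controller ids are my assumptions from reading PR #83, and should be checked against the actual bindings in `addons/godot_ovrmobile`.

```gdscript
# Hypothetical sketch -- binding names assumed, verify against the plugin.
extends Spatial

var ovr_hand_tracking = null

func _ready():
	# load the hand tracking wrapper shipped with the plugin (path assumed)
	var OvrHandTracking = load("res://addons/godot_ovrmobile/OvrHandTracking.gdns")
	if OvrHandTracking:
		ovr_hand_tracking = OvrHandTracking.new()

func _process(_delta):
	if ovr_hand_tracking:
		# controller id assumed: 1 = left hand, 2 = right hand
		var pointer_transform = ovr_hand_tracking.get_pointer_pose(1)
		# note: the pose is relative to the ARVROrigin, not the controller,
		# so apply it to a node that is a child of ARVROrigin
		$LaserPointer.transform = pointer_transform
```

Even if the method name differs, a snippet of this shape in the demo scene would be enough to point people at the feature.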

goatchurchprime commented 3 years ago

I know this is not as useful as a finished pull-request to the library, but I am not confident enough to make one.

Here is my work on using the pointer_pose function, as well as the fade-out and fade-in behaviour you get from hand tracking when the confidence level drops, as seen in the default Oculus system scene.

I found that it is best to put the hand model and pointer-pose laser into a Spatial node that is a sibling of the ARVRController, rather than a child of the controller node. This is for two reasons:

First, the laser pointer transform is given relative to the ARVROrigin, not the ARVRController. It appears to behave effectively as 3DOF: the ray stays close to the vector from the ARVRCamera to the centre of gravity of the ARVRController/hand, so it does not deflect when you make, for example, a pinch gesture.

Second, during the fade-out when the confidence level drops you don't want the hand to continue moving, and it is much simpler to stop copying the transform from the ARVRController node to its sibling than to try to suppress the parent transform.
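The sibling arrangement above can be sketched like this (node names are mine; the script sits on the sibling Spatial, with both it and the ARVRController parented to the ARVROrigin):

```gdscript
# Sketch of the sibling-node arrangement -- node names are hypothetical.
extends Spatial  # attached to the hand's sibling Spatial under ARVROrigin

onready var controller = get_node("../LeftHandController")  # the ARVRController sibling
var tracking_good = true  # set false when hand confidence drops

func _process(_delta):
	if tracking_good:
		# copy the tracked transform across while confidence is good
		transform = controller.transform
	# else: simply stop copying, leaving the hand frozen where the
	# fade-out started, instead of fighting a parent transform
```

Because the pointer pose is already relative to the ARVROrigin, it can be applied to children of this node directly, with no re-parenting maths.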

It turns out you have to require a high confidence level if you are using the hands to control a UI, rather than playing a rubber-hand game, because otherwise you get a lot of mis-clicks and actions the player did not intend. It is much worse to click a button by mistake than to fail to click a button you intended to. By hiding the hands whenever the confidence drops (most commonly when they occlude one another), the player is quickly trained to keep their hands apart while working, which really improves the experience.
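A minimal sketch of that confidence gating, assuming a per-frame boolean derived from the plugin's hand-confidence value (how you obtain it depends on your binding version):

```gdscript
# Sketch only: `high_confidence` is assumed to be updated each frame from
# the plugin's hand-tracking confidence; the fade is a simple visibility ramp.
extends Spatial  # attached to the hand's sibling Spatial

const FADE_SPEED = 4.0  # full fade in/out in a quarter of a second

var opacity = 1.0
var high_confidence = true

func _process(delta):
	var target = 1.0 if high_confidence else 0.0
	opacity = move_toward(opacity, target, FADE_SPEED * delta)
	visible = opacity > 0.0
	# gate UI interactions on full confidence, so a fading hand
	# can never generate a mis-click the player did not intend
	var clicks_allowed = high_confidence and opacity >= 1.0
```

The important design point is that clicks are disabled the moment confidence drops, before the hand has visually finished fading.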

HandScript.zip