1j01 / tracky-mouse

Mouse control via head tracking, as a cross platform desktop app and JS library. eViacam alternative.
https://trackymouse.js.org/
MIT License

Smiling shouldn't move the cursor #34

Open 1j01 opened 2 years ago

1j01 commented 2 years ago

Currently, facial expressions affect the cursor position. This will be a problem for facial gestures (#25) if the cursor moves every time you try to click.

First of all, this could be solved by using the head pose directly, if it were fast and accurate enough. This would also allow rotation to move the cursor more naturally. Right now, using points tracked on the surface of your face as projected in the 2D camera image, movement slows and stops as a point reaches the edge of your projected face, or in other words as the surface of your face at the tracked point becomes tangent to the camera's view. Put another way, as you rotate your head toward 90 degrees, the movement of the theoretical 3D point being tracked becomes parallel to the camera's viewing axis, and that movement (depth) is not tracked.
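As a rough illustration of this first option (not the actual Tracky Mouse code; the pose estimator callback, `sensitivity` value, and `moveCursorBy` helper are all made up for the sketch), driving the cursor from yaw/pitch angles directly would move it at a constant rate regardless of how far the head is already turned:

```js
// Hypothetical sketch: drive the cursor from head rotation directly,
// assuming some pose estimator reports yaw/pitch in radians.
const sensitivity = 2000; // pixels per radian, tune to taste

let lastYaw = null;
let lastPitch = null;

function onPoseEstimate({ yaw, pitch }) {
	if (lastYaw !== null) {
		// Rotation maps to cursor movement at a constant rate, so it doesn't
		// slow down as tracked points approach the silhouette of the face.
		const dx = (yaw - lastYaw) * sensitivity;
		const dy = (pitch - lastPitch) * sensitivity;
		moveCursorBy(dx, dy);
	}
	lastYaw = yaw;
	lastPitch = pitch;
}

// Placeholder for however the app actually moves the pointer.
function moveCursorBy(dx, dy) {
	console.log(`move cursor by ${dx.toFixed(1)}, ${dy.toFixed(1)}`);
}
```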

Secondly, this could be solved by tracking a point on the head unaffected by facial gestures. Faces are, however, incredibly expressive! I see only a few candidates:

Oh, I just thought of a third option (sketched below). Like the first, use the head rotation as tracked by facemesh, but since it's slow, use it only for movement not otherwise handled by the tracking points. Calculate the movement that rotation should produce and subtract the movement theoretically already done by translation, by looking at either the tracking points' movement since the last facemesh update, or some projected points (based on the last facemesh result and the new one); not sure which. This is probably a terrible idea though. It technically provides the best of both worlds, but in a likely confusing, unpleasant mixture.
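Here is a rough sketch of that hybrid idea, using the tracking points' movement since the last facemesh update (again, all names here are hypothetical, not the existing API): whenever a new, slow facemesh pose arrives, compute how far rotation says the cursor should have moved, subtract how far the fast tracking points already moved it in the meantime, and apply only the difference.

```js
// Hypothetical hybrid: fast tracking points drive the cursor every frame,
// and slow facemesh pose updates correct for any rotation they missed.
const sensitivity = 2000; // pixels per radian

let lastPose = null;                        // last facemesh pose { yaw, pitch }
let movementSinceLastPose = { x: 0, y: 0 }; // accumulated from tracking points

// Called every camera frame with the cursor delta from the optical-flow points.
function onTrackingPointMovement(dx, dy) {
	movementSinceLastPose.x += dx;
	movementSinceLastPose.y += dy;
	moveCursorBy(dx, dy);
}

// Called whenever a new facemesh pose estimate is available (less often).
function onPoseEstimate(pose) {
	if (lastPose) {
		const wantedX = (pose.yaw - lastPose.yaw) * sensitivity;
		const wantedY = (pose.pitch - lastPose.pitch) * sensitivity;
		// Apply only the movement that rotation implies but translation didn't cover.
		moveCursorBy(wantedX - movementSinceLastPose.x, wantedY - movementSinceLastPose.y);
	}
	lastPose = pose;
	movementSinceLastPose = { x: 0, y: 0 };
}

// Placeholder for the app's actual pointer-moving code.
function moveCursorBy(dx, dy) {
	console.log(`move cursor by ${dx.toFixed(1)}, ${dy.toFixed(1)}`);
}
```

The likely downside, as noted above, is the feel: the cursor would get periodic corrections at the facemesh rate on top of the smooth per-frame movement, which could read as jitter or drift to the user.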