bulletmark / libinput-gestures

Actions gestures on your touchpad using libinput

backend for libinput-gestures #241

Closed natask closed 4 years ago

natask commented 4 years ago

I used to use libinput-gestures, but `libinput debug-events` was too limited for me, particularly for tap gestures and 5-finger gestures, so I wrote a driver in Python, primarily for gestures, built on evtest (evemu is also possible). I also made some modifications so that the gestures aren't one-shot (much like fusuma).

It has worked well for me for the last few months.

I felt like sharing the script with others, so I was looking to make it more customizable by supporting the config file formats of libinput-gestures and fusuma.
I then realized it would be much better to package the core of [gestures](https://github.com/natask/gestures) as a backend, to avoid re-inventing the wheel and fracturing this key feature.

Is this something that people are interested in? Everything after this line is the main meat of the backend. In the meantime, I will attempt to get 5-finger gestures and tap gestures into libinput as PRs.

bulletmark commented 4 years ago

I may try that out if you improve the README to say exactly what it does and how it differs from libinput-gestures. Sorry, but I don't see the point of raising an issue here, so I'm closing.

natask commented 4 years ago

I raised an issue here to see if there was interest.

Key distinguishing features

5-finger gestures

This means that you can place 5 fingers on the touchpad and that is recognized as another class of gestures. This feature must be supported by your driver and touchpad. Check by running `evtest /dev/input/$(grep -iA 5 'touchpad' /proc/bus/input/devices | grep -oP 'event[0-9]+') | grep BTN_TOOL_QUINTTAP`. If it prints a line containing "BTN_TOOL_QUINTTAP", your touchpad and driver support it.
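The device-lookup step of that pipeline (finding the touchpad's `eventN` node in `/proc/bus/input/devices`) can also be sketched in Python. This is a minimal illustration, not the project's actual code; the `SAMPLE` data and the `touchpad_event_node` helper are made up for the example:

```python
import re

# SAMPLE stands in for the contents of /proc/bus/input/devices,
# which lists input devices in blank-line-separated blocks.
SAMPLE = """\
I: Bus=0018 Vendor=04f3 Product=3083 Version=0100
N: Name="Elan Touchpad"
H: Handlers=mouse0 event7

I: Bus=0011 Vendor=0001 Product=0001 Version=ab41
N: Name="AT Translated Set 2 keyboard"
H: Handlers=sysrq kbd event2
"""

def touchpad_event_node(devices_text):
    """Return the /dev/input/eventN path of the first device whose
    block mentions 'touchpad' (case-insensitive), or None."""
    for block in devices_text.split("\n\n"):
        if re.search(r"touchpad", block, re.IGNORECASE):
            match = re.search(r"\bevent\d+\b", block)
            if match:
                return "/dev/input/" + match.group(0)
    return None

print(touchpad_event_node(SAMPLE))  # → /dev/input/event7
```

On a real system you would pass the contents of `/proc/bus/input/devices` and then open the returned node with evtest or evdev.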

Tap gestures

This means that tapping the touchpad with 4 or 5 fingers is recognized as a gesture.

Touchscreen gestures

Extend gestures to touchscreen.

Fluid gestures

Allow the user to do different things without lifting their fingers from the touchpad. This takes advantage of the fact that gesture execution is separated into three phases (start, update, and end), performing complementary actions in each.

For example, assume the shortcut to switch windows is "CTRL + ALT" plus a direction, where the direction is "LEFT", "RIGHT", "UP", or "DOWN".

To switch windows fluidly, "CTRL + ALT" are held down programmatically on the start gesture event; then, depending on which direction the user swipes while their fingers are still on the touchpad, direction key presses are generated dynamically. When the user lifts their fingers, which is the end gesture event, "CTRL + ALT" are released programmatically.
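The start/update/end lifecycle above can be sketched as a small state machine. This is only an illustration of the structure, not code from the project: `press`/`release` are hypothetical stand-ins for whatever key-injection backend is used (e.g. evdev uinput or ydotool), and here they just record events:

```python
# Recorded (action, key) pairs; a real backend would inject key events.
events = []

def press(*keys):
    events.extend(("press", k) for k in keys)

def release(*keys):
    events.extend(("release", k) for k in keys)

class FluidWindowSwitcher:
    """Hold CTRL+ALT for the whole gesture; tap a direction per update."""
    MODIFIERS = ("CTRL", "ALT")

    def on_start(self):
        # Fingers touch down: hold the modifiers.
        press(*self.MODIFIERS)

    def on_update(self, direction):
        # Fingers still down, swiping: tap the direction key.
        press(direction)
        release(direction)

    def on_end(self):
        # Fingers lift: release the modifiers.
        release(*self.MODIFIERS)

gesture = FluidWindowSwitcher()
gesture.on_start()
gesture.on_update("RIGHT")
gesture.on_update("RIGHT")
gesture.on_update("LEFT")
gesture.on_end()
```

Because the modifiers stay held across updates, the user can step through several windows in one continuous gesture instead of repeating a one-shot swipe.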