Open mauriciabad opened 1 year ago
Thank you for suggesting this interesting feature. I am just not sure if libinput-debug-events can track the position of each tap with multiple fingers. If it is possible, we can implement tip-tap as a fusuma-plugin.
I am not entirely sure I follow.
The main request here was to implement a way to have the three-finger text highlight feature available in some way. This is said to be possible now.
However:
So I am a bit confused. It still does not seem possible to implement a three-finger gesture to highlight text. Implementing a drag (move window) or a swipe (like showing "expose", etc.) does not require any scripting: it can be done natively in GNOME for the most part.
It is really the gestures like:
I can achieve the backwards navigation with a three-finger gesture, not a big deal, using either ydotool, native features, or fusuma-plugin-sendkey. The same goes for the drag-to-move-window, using either native features or ydotool.
But the three-fingers-to-highlight still seems absolutely unachievable to me.
What am I missing? Does ydotool have some hidden feature to grab mouse positions dynamically? Or does fusuma have a hidden tool to send mouse commands?
Cheers!
@smileBeda It is not a 3 finger gesture, it is a 2 finger one.
The difference from a "normal" 2-finger tap is that both fingers don't touch the trackpad at the same time; instead, one goes first and some time after (for example 400 ms), the other one comes in. Then, based on the tap order of the fingers and their x-coordinates, the software triggers tip-tap left (if the right finger is 1st and the left finger 2nd) or tip-tap right (if the left is 1st and the right 2nd).
That is the gesture part. And then there's the action part. Here I'm just requesting the gesture, but obviously, if it comes with some actions, even better. In the GIF it rearranges windows, but that's a random GIF I found on the internet to show an example. What I would configure it to do is trigger Ctrl + C and Ctrl + V, which seems trivial to me and I assumed is already supported.
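The ordering logic described above is simple enough to sketch. The snippet below is a hypothetical classifier (not fusuma's actual code; the threshold names and values are made up for illustration) that decides, from the timestamps and x-coordinates of two taps, whether they form a tip-tap and in which direction:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical thresholds -- not from any real fusuma config.
MIN_STAGGER_S = 0.08   # second finger must land at least this much later
MAX_STAGGER_S = 0.40   # ...but no later than this (e.g. the 400 ms above)

@dataclass
class Tap:
    time: float  # seconds
    x: float     # touch x-coordinate reported by the device

def classify_tip_tap(first: Tap, second: Tap) -> Optional[str]:
    """Classify a staggered two-finger tap.

    Returns "tip-tap-left" when the right finger lands first and the left
    finger second, "tip-tap-right" for the opposite order, and None when
    the timing looks like an ordinary simultaneous two-finger tap.
    """
    stagger = second.time - first.time
    if not (MIN_STAGGER_S <= stagger <= MAX_STAGGER_S):
        return None
    if first.x > second.x:    # right finger first, left finger second
        return "tip-tap-left"
    if first.x < second.x:    # left finger first, right finger second
        return "tip-tap-right"
    return None
```

A real implementation would feed this from per-finger touch events; as discussed later in the thread, that positional data is exactly what libinput does not expose.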
Now, I switched to macOS, and I use an app called Better Touch Tool, which is the best gestures app by far (that I found for any OS).
Take a look at the options it has; I recorded a video for you. It has almost all possible (and usable) gestures doable on a trackpad with up to 5 fingers. I also show some of the actions it supports.
https://github.com/iberianpig/fusuma/assets/12821361/1d5e7260-bc64-430f-b05b-380164690bb1
Sadly Linux has HORRIBLE trackpad usability, and that's my main way of interacting with the laptop, so I expect it to be perfect. I can't believe that the basic scrolling gesture is broken: it scrolls extremely fast and there's no option to adjust the sensitivity. The zoom gesture is also broken, and many more things... That's one of the reasons why I chose macOS. And now I won't be in constant pain.
I say this because I won't use this feature if you implement it, because I don't use Linux anymore. But anyway, it would be a great addition, and the start of fixing the Linux trackpad problem. (No library can do this kind of gesture on Linux.)
... which is why I am so desperate for it on linux lol
Anyway, I got working ... default out of the box on X server. Forget wayland. Also got most of the Mac gestures into it using this tool, and another (kinto)
But... probably moving back to macOS. Battery life is a catastrophe on Linux. So are half of the available apps. I guess it is the drawback of FOSS. Either pay, or pain.
Haha...
I believe we need an event recognition and translation layer that does not rely on libinput, but rather uses evdev. This is necessary because libinput itself does not report touch positions.
Currently, Fusuma operates based on libinput, but we could create plugins that utilize evdev events as needed.
For example, some functionalities of fusuma-plugin-thumbsense and fusuma-plugin-remap are implemented using evdev. This idea involves creating pairs of physical and virtual keyboards, and either rewriting the communication between them or using it as input for Fusuma.
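The "pair of physical and virtual devices with rewritten communication" idea can be sketched as a pure rewrite step sitting between the two devices. This is a hypothetical illustration in Python rather than the plugins' actual Ruby: in a real pipeline the tuples would come from `evdev.InputDevice(...).read_loop()` after `grab()` and be re-emitted on an `evdev.UInput` device; here plain tuples are used so the rewriting step itself is testable. The keycodes are the real values from `linux/input-event-codes.h`; the remap table is made up:

```python
EV_KEY = 1          # event type for key presses/releases
KEY_LEFTCTRL = 29   # real keycodes from linux/input-event-codes.h
KEY_CAPSLOCK = 58

# Hypothetical remap table: make CapsLock behave as Left Ctrl.
REMAP = {KEY_CAPSLOCK: KEY_LEFTCTRL}

def rewrite(event):
    """Rewrite one (type, code, value) event before forwarding it."""
    ev_type, code, value = event
    if ev_type == EV_KEY:
        code = REMAP.get(code, code)
    return (ev_type, code, value)

def forward(events):
    """Stand-in for the grab/forward loop between the physical device
    (grabbed, so the compositor never sees it) and the virtual device."""
    return [rewrite(ev) for ev in events]
```

Because the physical device is grabbed, only the rewritten stream reaches the compositor, which is what makes filtering or transforming events transparent to the rest of the desktop.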
Similarly, we can create pairs of physical and virtual touchpads, rewriting the communication between them. This approach would enable us to accurately detect touch positions and convert them into arbitrary events.
This means, by reporting touch positions while filtering the touches passed to the Wayland compositor's libinput, Fusuma could support more flexible gestures dependent on touch positions.
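To show what such an evdev-based layer would actually decode, here is a minimal sketch of the Linux multitouch "type B" (slot) protocol, which is how an evdev touchpad reports per-finger positions. With python-evdev the `(type, code, value)` tuples would come from `InputDevice.read_loop()`; the helper below works on plain tuples for illustration. The `EV_ABS`/`ABS_MT_*` codes are the real values from `linux/input-event-codes.h`:

```python
EV_ABS = 3
ABS_MT_SLOT = 47         # selects which finger the following codes refer to
ABS_MT_POSITION_X = 53
ABS_MT_POSITION_Y = 54
ABS_MT_TRACKING_ID = 57  # -1 means the finger in the current slot lifted

def decode_slots(events):
    """Return {slot: {"x": ..., "y": ...}} for the fingers currently down."""
    slots = {}
    current = 0  # the MT protocol is stateful: codes apply to the last slot
    for ev_type, code, value in events:
        if ev_type != EV_ABS:
            continue
        if code == ABS_MT_SLOT:
            current = value
        elif code == ABS_MT_TRACKING_ID:
            if value == -1:
                slots.pop(current, None)   # finger lifted
            else:
                slots.setdefault(current, {})
        elif code == ABS_MT_POSITION_X:
            slots.setdefault(current, {})["x"] = value
        elif code == ABS_MT_POSITION_Y:
            slots.setdefault(current, {})["y"] = value
    return slots
```

A tip-tap plugin could run a decoder like this on the grabbed physical touchpad, use the per-slot positions to recognize the gesture, and forward (or suppress) the raw events to the virtual touchpad that the compositor's libinput actually sees.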
Feature explanation
I'd like to use the Tip-Tap gestures (like the one in Better Touch Tool).
It's a very simple gesture and very versatile. Let me highlight some of its benefits:
Example useful actions that can be assigned:
What is Tip-Tap gesture?
Video: https://www.youtube.com/watch?v=MRhBa3-O3fM