Closed canadaduane closed 2 years ago
I'm curious if you've considered using libinput rather than interfacing directly with evdev?
I rejected this at the outset for various reasons. keyd is intended to be a low-level remapping tool that is invisible to things higher up on the input stack. Furthermore, libinput is just a library used by numerous compositors (and an X driver) and isn't exposed outside of them (Wayland regards this as a security risk). Even if keyd could be implemented on top of libinput, it would cease to work on anything which doesn't use it (e.g. VTs).
I learned recently about how gesture remappers like gebaar and Touchegg are similar to key remappers. I've diagrammed the "overview" from the perspective of a touchpad here (the idea is exactly the same for keyboards). In essence, instead of opening evdev file descriptors directly, gesture remappers have to rely on libinput to detect the gestures before they can be remapped. Then, the next layer in the stack opens their "virtual device" and uses that input instead of the "real" device.
This may make more sense for touchpads which have more complicated semantics and are used exclusively in the graphical environments, but I'm not sure it makes sense in this case.
And yet, it seems like input devices generally are often related--sometimes multiple kernel devices are the same physical device, and sometimes two or more devices interact. In fact, #42 and #66 are sort of proof of how they are related in sometimes surprising ways.
Aside from palm rejection, there aren't too many cases in which the pointer should care about the keyboard. To the extent that it does, I agree that this should be handled higher up the input stack. The solution to this is to ensure libinput associates keyd's virtual device with the appropriate touchpad, not to rewrite the entire project on top of a higher layer of the input stack (which is less portable).
libinput is now used almost universally in X.Org (Xserver) and Wayland compositors (Weston, mutter, KWin). The one place it may not be universal is in text terminals.
Yes, this matters. There may also be instances of people using old X drivers.
So there are clearly some benefits to using libinput as a source of events, at least at the boundary around something like keyd:
I don't think this has been clearly demonstrated.
You get some normalization for free (for example if combining a keyboard with a mouse--libinput has a hardware database that provides accel curves and other benefits to make mice behave)
This is why keyd tries to avoid managing your mouse.
You get beneficial device interactions for free (e.g. touchpad "disable while typing")
Can you name another one? This can and should be fixed by treating the keyd virtual keyboard as just another input device and configuring libinput accordingly.
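One concrete way to configure this is through libinput's quirks mechanism, e.g. a local override in `/etc/libinput/local-overrides.quirks`. The snippet below is a sketch; the `MatchName` value must match whatever name keyd's virtual keyboard actually reports:

```
[keyd virtual keyboard]
MatchUdevType=keyboard
MatchName=keyd virtual keyboard
AttrKeyboardIntegration=internal
```

Marking the virtual keyboard as an internal keyboard lets libinput associate it with the built-in touchpad, so disable-while-typing keeps working even though keystrokes now arrive via the virtual device.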
You get support for both major graphical desktop systems (granted, this is true of evdev too)
Yes. evdev was chosen by design. keyd was originally intended to emulate open-source keyboard firmware (which is desirable because it works everywhere).
You can start to piece together advanced remapping scenarios where gestures and keys combine to make other possibilities (imagine "Ctrl + 3-finger-swipe right" for instance).
This should be (and has already been?) done higher up the input stack. It is orthogonal to keyd's goals.
Edit:
Apologies, I should have scrutinized the references you listed more carefully before responding. It looks like such libinput remappers are indeed possible, though the rest of my points remain.
Edit 2:
It looks like said 'remappers' are just passively monitoring events and then effecting changes higher up the input stack.
Continued offline.
Hot off the heels of #66... :)

I'm curious if you've considered using libinput rather than interfacing directly with evdev? I know this would imply a large rewrite of some key parts of the code, but for the sake of imagining possibilities, perhaps we could consider it just for a moment? (This is also related to the linuxtouchpad discussion -- the last question in particular, "should key remappers be implemented at the same level as touchpad remappers (like Touchégg and Gebaar)?")

I learned recently about how gesture remappers like gebaar and Touchegg are similar to key remappers. I've diagrammed the "overview" from the perspective of a touchpad here (the idea is exactly the same for keyboards). In essence, instead of opening evdev file descriptors directly, gesture remappers have to rely on libinput to detect the gestures before they can be remapped. Then, the next layer in the stack opens their "virtual device" and uses that input instead of the "real" device.

libinput handles keyboard events, mouse events, touchpad events, touch screen events, and tablet events. These events are all at a fairly "low level" in the sense that there is some normalization going on, but not a lot of interpretation. And yet, it seems like input devices generally are often related--sometimes multiple kernel devices are the same physical device, and sometimes two or more devices interact. In fact, #42 and #66 are sort of proof of how they are related in sometimes surprising ways.

libinput is now used almost universally in X.Org (Xserver) and Wayland compositors (Weston, mutter, KWin). The one place it may not be universal is in text terminals.

So there are clearly some benefits to using libinput as a source of events, at least at the boundary around something like keyd:

- You get some normalization for free (for example, if combining a keyboard with a mouse--libinput has a hardware database that provides accel curves and other benefits to make mice behave)
- You get beneficial device interactions for free (e.g. touchpad "disable while typing")
- You get support for both major graphical desktop systems (granted, this is true of evdev too)
- You can start to piece together advanced remapping scenarios where gestures and keys combine to make other possibilities (imagine "Ctrl + 3-finger-swipe right", for instance)

So I guess I'm just spitballing a bit here, but am I missing anything important? Let's say keyd 3.0 uses libinput and we're all done. Is it a net improvement?