Open spl237 opened 2 months ago
Just for more info, here's the libinput description of the touchscreen we are using:
Device:           4-005d Goodix Capacitive TouchScreen
Kernel:           /dev/input/event8
Group:            1
Seat:             seat0, default
Capabilities:     keyboard touch
Tap-to-click:     n/a
Tap-and-drag:     n/a
Tap drag lock:    n/a
Left-handed:      n/a
Nat.scrolling:    n/a
Middle emulation: n/a
Calibration:      identity matrix
Scroll methods:   none
Click methods:    none
Disable-w-typing: n/a
Disable-w-trackpointing: n/a
Accel profiles:   n/a
Rotation:         n/a
libinput seems to have two different classifications for touch devices, "touch" (which is a touchscreen) and "touchpanel" (which is a laptop touch panel separate from a screen), and the issue linked above seems to only talk about a touchpanel - is that significant?
You might have already read this because you talk about the different classifications libinput has, but according to this document libinput doesn't have the context needed for gesture support on touchscreens.
I'm not an expert - I learnt this a few weeks ago when I was wondering the same thing as you.
@spl237 I don't know much about libinput, but offer the following:
I have the following in my config to enable tap-and-drag when using my trackpad. I find it a nice feature.
<libinput>
  <device>
    <tapAndDrag>yes</tapAndDrag>
    <dragLock>yes</dragLock>
    <naturalScroll>no</naturalScroll>
  </device>
</libinput>
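If it helps, I believe labwc's rc.xml also lets you scope these settings by device category, so touchscreen settings can differ from touchpad ones. Something like this (syntax from memory - check the labwc docs for the exact category names):

<libinput>
  <!-- settings here should apply only to devices in the "touch"
       category, which covers touchscreens -->
  <device category="touch">
    <naturalScroll>no</naturalScroll>
  </device>
</libinput>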
@jp7677 - any advice on this (in your capacity as our in-house touch expert :smile:)
A touchscreen is different from a touchpad/trackpad. It's similar to a tablet, but physically tied to one screen; it works OK in single-monitor use, not so well in multi-monitor use.
I understand that both GNOME and KDE handle touchscreens, but neither of them uses wlroots, and I don't know how much work they had to do to get it working.
@jp7677 - any advice on this (in your capacity as our in-house touch expert 😄)
I know something about tablets, but not so much about touch :)
Anyway, from a quick scan of the sway code here https://github.com/swaywm/sway/blob/master/sway/input/seatop_default.c#L806 I guess gesture support has to be implemented in the compositor, which hasn't (yet) happened in Labwc.
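To make that concrete, here is roughly where such handling would hook in for a wlroots-based compositor. This is only a sketch: the struct and field names follow the wlroots ~0.16 API (they get renamed between releases), and gesture_binding_matches() is a hypothetical placeholder for whatever recognition logic the compositor would implement.

#include <stdbool.h>
#include <wayland-server-core.h>
#include <wlr/types/wlr_compositor.h>
#include <wlr/types/wlr_cursor.h>
#include <wlr/types/wlr_seat.h>
#include <wlr/types/wlr_touch.h>

struct server {
    struct wlr_cursor *cursor;
    struct wlr_seat *seat;
    struct wl_listener touch_down;
};

/* Hypothetical placeholder: a real compositor would match the touch
 * point against its configured gesture bindings here. */
static bool gesture_binding_matches(struct server *server,
        struct wlr_touch_down_event *event) {
    (void)server; (void)event;
    return false;
}

static void handle_touch_down(struct wl_listener *listener, void *data) {
    struct server *server = wl_container_of(listener, server, touch_down);
    struct wlr_touch_down_event *event = data;

    if (gesture_binding_matches(server, event)) {
        return; /* consumed by the compositor, never reaches a client */
    }

    /* Otherwise forward the touch point to the client. A real
     * compositor first maps the device coordinates into layout space,
     * finds the surface under the point, and converts to surface-local
     * coordinates - all omitted here. */
    struct wlr_surface *surface = NULL; /* result of the surface lookup */
    double sx = 0, sy = 0;              /* surface-local coordinates */
    if (surface != NULL) {
        wlr_seat_touch_notify_down(server->seat, surface,
            event->time_msec, event->touch_id, sx, sy);
    }
}

static void attach_touch_listener(struct server *server) {
    server->touch_down.notify = handle_touch_down;
    wl_signal_add(&server->cursor->events.touch_down, &server->touch_down);
}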
That's interesting. Logically it makes sense to implement it in the compositor; the only other option I can see is something like GtkGesture handlers in individual apps, which is presumably what Firefox does, but having it in the compositor beats having to modify every app you want to use on a touchscreen.
I think it is both: handle a gesture in the compositor if there is a gesture binding, and if not, forward it to the client. At least that is what I understand from briefly looking at the sway code. I'm not sure if that covers your use case, though - I'm no gestures expert. Maybe you could try to set up your preferred gestures in Sway first.
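For what it's worth, sway (1.8 and later) exposes gesture bindings in its config like this, though as far as I know bindgesture only covers libinput's touchpad gestures, so it may not fire from a touchscreen at all:

# in ~/.config/sway/config
bindgesture swipe:3:right workspace prev
bindgesture swipe:3:left workspace next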
You might get some basic support for touchscreens, i.e. click etc., but more than that is doubtful.
From the libinput docs:
Note that libinput does not support gestures on touchscreens
Touchscreen gestures are not interpreted by libinput. Rather, any touch point is passed to the caller and any interpretation of gestures is up to the caller or, eventually, the X or Wayland client.
Interpreting gestures on a touchscreen requires context that libinput does not have, such as the location of windows and other virtual objects on the screen as well as the context of those virtual objects:
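If you want to see exactly what that raw stream looks like, something like this (plain libinput C API, error handling trimmed, needs access to /dev/input) prints the touch points - and that's all you ever get, no gesture events:

#include <fcntl.h>
#include <libinput.h>
#include <libudev.h>
#include <stdio.h>
#include <unistd.h>

static int open_restricted(const char *path, int flags, void *user_data) {
    (void)user_data;
    return open(path, flags);
}

static void close_restricted(int fd, void *user_data) {
    (void)user_data;
    close(fd);
}

static const struct libinput_interface iface = {
    .open_restricted = open_restricted,
    .close_restricted = close_restricted,
};

int main(void) {
    struct udev *udev = udev_new();
    struct libinput *li = libinput_udev_create_context(&iface, NULL, udev);
    libinput_udev_assign_seat(li, "seat0");

    for (;;) {
        libinput_dispatch(li);
        struct libinput_event *ev;
        while ((ev = libinput_get_event(li)) != NULL) {
            if (libinput_event_get_type(ev) == LIBINPUT_EVENT_TOUCH_DOWN) {
                struct libinput_event_touch *t =
                    libinput_event_get_touch_event(ev);
                /* coordinates scaled to an assumed 1920x1080 screen */
                printf("touch down at %.1f,%.1f\n",
                    libinput_event_touch_get_x_transformed(t, 1920),
                    libinput_event_touch_get_y_transformed(t, 1080));
            }
            libinput_event_destroy(ev);
        }
        /* a real consumer would poll() libinput_get_fd(li) here
         * instead of spinning */
    }
}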
That says that "...interpreting gestures requires context that libinput does not have, such as the location of windows..." - but the compositor does have that context, so ought to be able to do more than libinput alone can.
I'll have a play with Sway and see what it supports - that should at least give an idea of what is possible in the compositor.
Yes, the compositor or the app itself would have to do most of the heavy lifting. libinput knows where the touch points are and passes them on; the compositor then has to use that data to decide what to do.
Note: I haven't looked at sway to see how much of this their touchscreen support implements.
Edit to add: I've seen references to this lately but have no idea how well it works: https://git.sr.ht/~mil/lisgd
I'm trying to work out how to get some additional touchscreen support working - things like right-click emulation. Normally this is handled by something like tap-and-hold, but that doesn't seem to work by default except in certain applications (such as Firefox), which makes me assume said applications are supporting it themselves.
Is there some way to get something like tap-and-hold to work as right-click system-wide rather than having to implement it in every application? I did find https://github.com/labwc/labwc/issues/957, but that seems to apply to touchpads rather than touchscreens, and I can't get it to work with a touchscreen.
As always, any advice gratefully received!
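In case it clarifies what I'm after: I imagine the compositor-side logic would be something like the sketch below - arm a timer on touch-down, cancel it if the finger moves too far or lifts, and emit a right-click if it expires. This is only an illustration (wlroots ~0.16 names, made-up thresholds and helpers; a real implementation would also need to give the surface pointer focus first):

#include <linux/input-event-codes.h> /* BTN_RIGHT */
#include <math.h>
#include <wayland-server-core.h>
#include <wlr/types/wlr_seat.h>

#define LONG_PRESS_MS 600  /* made-up threshold */
#define SLOP_PX 15.0       /* movement tolerance before cancelling */

/* Hypothetical per-touch-point state */
struct long_press {
    struct wlr_seat *seat;
    struct wl_event_source *timer;
    double down_x, down_y; /* layout coords at touch-down */
};

static int handle_long_press_timeout(void *data) {
    struct long_press *lp = data;
    uint32_t t = 0; /* a real compositor would carry the event timestamp */
    /* emulate a right-click at the touch location */
    wlr_seat_pointer_notify_button(lp->seat, t, BTN_RIGHT, WLR_BUTTON_PRESSED);
    wlr_seat_pointer_notify_frame(lp->seat);
    wlr_seat_pointer_notify_button(lp->seat, t, BTN_RIGHT, WLR_BUTTON_RELEASED);
    wlr_seat_pointer_notify_frame(lp->seat);
    return 0;
}

/* call from the compositor's touch_down handler */
static void arm_long_press(struct long_press *lp,
        struct wl_event_loop *loop, double x, double y) {
    lp->down_x = x;
    lp->down_y = y;
    lp->timer = wl_event_loop_add_timer(loop, handle_long_press_timeout, lp);
    wl_event_source_timer_update(lp->timer, LONG_PRESS_MS);
}

/* call from touch_motion; also disarm (update to 0) on touch_up */
static void maybe_cancel_long_press(struct long_press *lp, double x, double y) {
    if (fabs(x - lp->down_x) > SLOP_PX || fabs(y - lp->down_y) > SLOP_PX) {
        wl_event_source_timer_update(lp->timer, 0); /* disarm */
    }
}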