dburnsii opened this issue 4 years ago
CC @aperezdc @zdobersek
I’m having the same issue with Wayland / Weston / WPE WebKit / Cog on Arch Linux ARM on a Raspberry Pi 4.
I tried adjusting the libinput settings in weston.ini, but I believe those apply to laptop touchpads rather than touchscreens, so I'm not sure they're relevant here.
A good test I found for this issue is here: https://patrickhlauke.github.io/touch/tracker/multi-touch-tracker-pointer-hud-toucheventsonly.html
In Firefox and in Chrome under Wayland, touch events are recognized and tracked normally. In Cog, only the first touch is recognized, and it seems to fail after the first move event on that touch point.
Has anyone had any luck fixing this issue?
The described problem is due to the scrolling gesture implementation in WebKit. In short, any unhandled touch event is relayed to this gesture generator so that it can be turned into a wheel (scroll) event. The problem right now is that it doesn't translate the whole touch interaction (touch-down -> touch-motion -> touch-up); instead, it's possible for the initial touch-down event to be handled by the Web content while a subsequent touch-motion event triggers the scrolling gesture.
On the linked demo, this happens because the page "debounces" the touch-motion events: only some of them are handled, for performance reasons, and the ones that are not handled cause this faulty interaction.
The mentioned WebGL application probably did something similar, and the resulting spurious scrolling gesture is what produced the pinch-to-zoom effect.
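To make that failure mode concrete, here is a minimal, self-contained C sketch. It is not WebKit's actual code: the TouchEvent/ScrollGesture types and the gesture_feed() function are made up for illustration. It only models the behaviour described above, where events consumed by the web content never reach the scroll-gesture detector, so a debounced (unhandled) touch-motion arrives at the detector without its touch-down:

```c
/* Illustrative only -- not WebKit's actual implementation. */
#include <stdbool.h>
#include <stdio.h>

typedef enum { TOUCH_DOWN, TOUCH_MOTION, TOUCH_UP } TouchType;

typedef struct {
    TouchType type;
    int id;        /* touchpoint id */
    double x, y;
    bool handled;  /* consumed by the web content (e.g. preventDefault)? */
} TouchEvent;

/* Toy scroll-gesture detector: it only behaves sensibly if it sees the
 * whole touch-down -> touch-motion -> touch-up sequence. */
typedef struct {
    bool in_sequence;
    double last_y;
} ScrollGesture;

static void gesture_feed(ScrollGesture *g, const TouchEvent *ev)
{
    switch (ev->type) {
    case TOUCH_DOWN:
        g->in_sequence = true;
        g->last_y = ev->y;
        break;
    case TOUCH_MOTION:
        if (!g->in_sequence) {
            /* Faulty case: the touch-down was handled by the content,
             * so the detector starts a gesture mid-sequence. */
            printf("spurious scroll gesture started mid-sequence (id=%d)\n", ev->id);
            g->in_sequence = true;
        } else {
            printf("scroll by %.1f\n", ev->y - g->last_y);
        }
        g->last_y = ev->y;
        break;
    case TOUCH_UP:
        g->in_sequence = false;
        break;
    }
}

int main(void)
{
    /* Sequence mimicking the debounced demo page: the touch-down and most
     * touch-motions are handled, but one motion slips through unhandled. */
    const TouchEvent seq[] = {
        { TOUCH_DOWN,   0, 100, 100, true  },
        { TOUCH_MOTION, 0, 105, 120, true  },
        { TOUCH_MOTION, 0, 110, 140, false }, /* debounced -> unhandled */
        { TOUCH_MOTION, 0, 115, 160, true  },
        { TOUCH_UP,     0, 115, 160, true  },
    };

    ScrollGesture gesture = { false, 0.0 };
    for (size_t i = 0; i < sizeof seq / sizeof seq[0]; i++) {
        /* Only events the web content did not handle reach the detector. */
        if (!seq[i].handled)
            gesture_feed(&gesture, &seq[i]);
    }
    return 0;
}
```

Running it prints the "spurious scroll gesture" line, which corresponds to the unwanted pan/pinch behaviour reported above.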
The initial fix is posted in this WebKit bug: https://bugs.webkit.org/show_bug.cgi?id=218329
Beyond how the WebKit internals handle these events, there's general room for improvement in Cog. For the DRM backend we are able to use libinput and dispatch the touch input data upon the TOUCH_FRAME event, which conveniently lets us incorporate all the relevant touchpoint data into a single event. The same should be done for the FDO backend; the problem is that some compositors (like GNOME Shell, at least version 3.30) don't dispatch the wl_touch@frame() event that would allow doing the same, so some workaround mechanism would be required for those cases.
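For the FDO backend side, here is a minimal sketch of such frame-based aggregation, assuming a plain libwayland-client setup. The touch_state struct, MAX_TOUCHPOINTS and dispatch_touch_frame() are made-up names for illustration, not existing Cog/WPE APIs:

```c
/* Sketch only: accumulate wl_touch events per touchpoint and hand them
 * over as a group when wl_touch@frame arrives, similar to what the DRM
 * backend does on LIBINPUT_EVENT_TOUCH_FRAME. */
#include <stdbool.h>
#include <stdio.h>
#include <string.h>
#include <wayland-client.h>

#define MAX_TOUCHPOINTS 10

struct touchpoint {
    bool active;
    int32_t id;
    wl_fixed_t x, y;
};

struct touch_state {
    struct touchpoint points[MAX_TOUCHPOINTS];
};

/* Hypothetical sink: a real backend would forward the whole set of
 * touchpoints to the view as a single event; here it just prints them. */
static void dispatch_touch_frame(struct touch_state *s)
{
    for (int i = 0; i < MAX_TOUCHPOINTS; i++)
        if (s->points[i].active)
            printf("touchpoint %d at (%.1f, %.1f)\n", s->points[i].id,
                   wl_fixed_to_double(s->points[i].x),
                   wl_fixed_to_double(s->points[i].y));
}

static struct touchpoint *find_slot(struct touch_state *s, int32_t id, bool allocate)
{
    for (int i = 0; i < MAX_TOUCHPOINTS; i++)
        if (s->points[i].active && s->points[i].id == id)
            return &s->points[i];
    if (allocate)
        for (int i = 0; i < MAX_TOUCHPOINTS; i++)
            if (!s->points[i].active)
                return &s->points[i];
    return NULL;
}

static void touch_down(void *data, struct wl_touch *touch, uint32_t serial,
                       uint32_t time, struct wl_surface *surface,
                       int32_t id, wl_fixed_t x, wl_fixed_t y)
{
    struct touchpoint *p = find_slot(data, id, true);
    if (p) {
        p->active = true;
        p->id = id;
        p->x = x;
        p->y = y;
    }
}

static void touch_motion(void *data, struct wl_touch *touch, uint32_t time,
                         int32_t id, wl_fixed_t x, wl_fixed_t y)
{
    struct touchpoint *p = find_slot(data, id, false);
    if (p) {
        p->x = x;
        p->y = y;
    }
}

static void touch_up(void *data, struct wl_touch *touch, uint32_t serial,
                     uint32_t time, int32_t id)
{
    struct touchpoint *p = find_slot(data, id, false);
    if (p)
        p->active = false;  /* a real backend would keep a "released" state until the frame */
}

/* The compositor marks the end of one logical group of touch events:
 * dispatch everything accumulated so far in one go. */
static void touch_frame(void *data, struct wl_touch *touch)
{
    dispatch_touch_frame(data);
}

static void touch_cancel(void *data, struct wl_touch *touch)
{
    struct touch_state *s = data;
    memset(s->points, 0, sizeof s->points);
}

static const struct wl_touch_listener touch_listener = {
    .down = touch_down,
    .up = touch_up,
    .motion = touch_motion,
    .frame = touch_frame,
    .cancel = touch_cancel,
    /* .shape / .orientation (wl_seat v6+) omitted for brevity */
};
```

For compositors that never send wl_touch@frame (the GNOME Shell 3.30 case mentioned above), one possible fallback could be to dispatch after each individual event, or after a short timeout, but that is exactly the workaround mechanism that still needs to be designed.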
Currently, touchscreens don't seem to function as they should. It appears that on touch-and-drag, the initial touch point is stored as an "active" point, and as the finger moves across the screen the current location acts as a second "active" point. This makes a drag behave more like a pinch-to-zoom, with the initial touch point as the pivot. I tested this on multiple machines, both in Weston and in Sway, and the issue seems to be the same in both. It also prevents real pinch-to-zoom from working, since there are then in theory four points at play instead of the expected two.
This issue was noticed in a WebGL application that has drag-to-pan and pinch-to-zoom features, both of which work in Firefox under Wayland.