huntekye opened 1 year ago
This sounds very much like the app has both mouse and touch events active, and a single input gesture is triggering actions through both at the same time.
Generally speaking, the mobile iOS and Android builds use only touch events, and no mouse events occur at all. Likewise, on desktop platforms, usually only mouse events are sent when interacting with a pointer device. I'm guessing the situation is a bit confused on the PinePhone; perhaps it's sending both mouse and touch events at the same time.
IIRC, the iOS and Android builds are hardcoded to use touch events. To solve this problem on a mobile Linux OS, there would need to be a way to (perhaps dynamically) switch between mouse and touch input modes, where the "wrong" kind of events would be ignored.
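For illustration, here's a minimal sketch of what that dynamic switching could look like on top of SDL; all the names here are invented for the example, this isn't Lagrange's actual code:

```c
#include <SDL.h>

/* Invented names, not Lagrange code: whichever device produced the most
   recent "down" event owns the input, and events from the other device type
   are dropped. This assumes SDL's touch<->mouse emulation is disabled first
   (a sketch of that follows below), since synthetic mouse events would
   otherwise flip the mode right back. */
typedef enum { mouseMode, touchMode } InputMode;
static InputMode inputMode = mouseMode;

static int shouldHandleEvent(const SDL_Event *ev) {
    switch (ev->type) {
        case SDL_MOUSEBUTTONDOWN:
            inputMode = mouseMode; /* a real click switches to mouse mode */
            return 1;
        case SDL_FINGERDOWN:
            inputMode = touchMode; /* a real finger switches to touch mode */
            return 1;
        case SDL_MOUSEMOTION:
        case SDL_MOUSEBUTTONUP:
        case SDL_MOUSEWHEEL:
            return inputMode == mouseMode;
        case SDL_FINGERMOTION:
        case SDL_FINGERUP:
            return inputMode == touchMode;
        default:
            return 1; /* keyboard, window, etc. events pass through */
    }
}
```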
Yeah, that is what seems to be happening. In sway one can check the currently configured input devices with `swaymsg -t get_inputs`; for me, the only device with a type other than keyboard is the touchscreen, though I'm not sure how one would check this in other Wayland-based WMs. Running `wev` shows that each finger gets exactly one event when it touches the screen and one when it leaves (and a whole bunch while moving on the screen), but this is a fairly high-level look at the events; I could see them getting doubled somewhere later (maybe by SDL).
SDL has a feature where touch events can be used to emulate mouse events, and vice versa. This exists because many SDL apps are written with only a mouse in mind, so mouse input has to be emulated based on the native touch events.
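For reference, SDL exposes hints to turn the emulation off; a minimal sketch:

```c
#include <SDL.h>

/* A minimal sketch: turn off SDL's input emulation in both directions
   during startup. SDL_HINT_MOUSE_TOUCH_EVENTS needs SDL 2.0.10 or newer. */
static void disableInputEmulation(void) {
    SDL_SetHint(SDL_HINT_TOUCH_MOUSE_EVENTS, "0"); /* touch won't fake mouse events */
    SDL_SetHint(SDL_HINT_MOUSE_TOUCH_EVENTS, "0"); /* mouse won't fake touch events */
}
```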
Lagrange has full support for both mouse and touch SDL events, but the code does not account for this particular mobile Linux environment.
What I could try is adding a build setting that applies some of the same behaviors that are used in iOS/Android, so you can at least manually compile the app for a touch-only Linux "desktop".
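As a rough sketch of what that could look like (the define and helper names are hypothetical, not existing options):

```c
/* Hypothetical sketch: LAGRANGE_TOUCH_ONLY is an assumed compile-time define
   (e.g., passed as -DLAGRANGE_TOUCH_ONLY=1 by the build system), not an
   existing Lagrange option. It would route event handling the same way the
   iOS/Android builds do. */
#if defined (LAGRANGE_TOUCH_ONLY)
#   define onlyTouchEvents_App 1 /* invented name: drop all mouse events */
#else
#   define onlyTouchEvents_App 0
#endif
```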
Ah okay, that makes a lot of sense! If Lagrange natively supports touch events, is there no way to disable SDL's mouse emulation? As far as I can tell that feature is not actually needed for anything in Lagrange, though I imagine you know more about that than I do.
A build setting would be great! IMO a command-line flag would be even more user friendly, since it can be misleading to think of these as touch-only devices: it is not that uncommon for people to hook up a big monitor, keyboard, and mouse and basically use the phone as a tiny, portable computer tower (though I don't do this much myself), in which case having to compile two versions would be a little inconvenient. That said, I don't know whether a build setting or a command-line flag is significantly more difficult to implement, and either would be a workable solution!
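For example, something along these lines at startup; the flag name and variable are just made up for illustration:

```c
#include <string.h>

/* Made-up flag name and variable, purely for illustration: pick the input
   mode at startup instead of at compile time. */
static int touchOnlyUI = 0;

static void parseInputModeArgs(int argc, char **argv) {
    for (int i = 1; i < argc; ++i) {
        if (strcmp(argv[i], "--touch-ui") == 0) {
            touchOnlyUI = 1; /* behave like the mobile builds */
        }
    }
}
```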
All those issues are present in v1.16.7 on a Windows 10 touchscreen device.
For the record, I've never used a touchscreen Windows device so I haven't had the opportunity to address that configuration.
As with the Linux case, it should be fairly straightforward to address this by processing only mouse or only touch events at a given time, but the app's event handling code will need some light massaging to make that work correctly. I'll have to think about how to implement this with the devices I have at my disposal...
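One building block SDL already provides: synthetic events are tagged, so duplicates can be identified and skipped. A sketch, not Lagrange's actual event code:

```c
#include <SDL.h>

/* SDL tags mouse events that were synthesized from touch with
   SDL_TOUCH_MOUSEID, and touch events that were synthesized from a mouse
   with SDL_MOUSE_TOUCHID (the latter since SDL 2.0.10), so emulated
   duplicates can be detected and ignored. */
static SDL_bool isSyntheticEvent(const SDL_Event *ev) {
    switch (ev->type) {
        case SDL_MOUSEMOTION:
            return ev->motion.which == SDL_TOUCH_MOUSEID;
        case SDL_MOUSEBUTTONDOWN:
        case SDL_MOUSEBUTTONUP:
            return ev->button.which == SDL_TOUCH_MOUSEID;
        case SDL_MOUSEWHEEL:
            return ev->wheel.which == SDL_TOUCH_MOUSEID;
        case SDL_FINGERDOWN:
        case SDL_FINGERUP:
        case SDL_FINGERMOTION:
            return ev->tfinger.touchId == SDL_MOUSE_TOUCHID;
        default:
            return SDL_FALSE;
    }
}
```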
Hi!
Thanks (again) for making this great client for Gemini (and friends)! I am experiencing some issues using Lagrange on a device with a touch screen: a PinePhone Pro running Linux 6.4.1-1-danctnix (a derivative of Arch) with the sway window manager (Wayland-based).
This isn't every possible combination, but probably most of the important ones.
Somewhat unrelatedly, pinching does work to adjust the UI scaling, though the response is a little laggy.
I don't really know how much you have planned for what these actions should do on a touchscreen, or even how much control you have over how they work, but something like the following is what I would intuitively expect:
If there's any other information that would be useful, just let me know!