Open xorgy opened 5 months ago
There are a few layers to this. The simplest layer is just to enable mouse-like behavior from touch, so that touch can operate widgets like buttons and sliders, and also selection (and focus) in text input.
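That simplest layer can be sketched as a small translation step. This is a minimal sketch with hypothetical types (not winit's or Masonry's actual API): collapse each touch phase into a mouse-like pointer event so widgets written for the mouse path work unchanged.

```rust
// Sketch only: PointerKind, PointerEvent, and TouchPhase here are
// hypothetical stand-ins, not the real winit/Masonry types.
#[derive(Debug, Clone, Copy, PartialEq)]
enum PointerKind { Mouse, Touch }

#[derive(Debug, Clone, Copy, PartialEq)]
enum PointerEvent {
    Down { x: f64, y: f64, kind: PointerKind },
    Move { x: f64, y: f64, kind: PointerKind },
    Up { x: f64, y: f64, kind: PointerKind },
}

// Stand-in for a platform's touch phases.
#[derive(Debug, Clone, Copy)]
enum TouchPhase { Started, Moved, Ended, Cancelled }

/// Collapse a touch event into a mouse-like pointer event, so buttons,
/// sliders, and text selection written for the mouse path just work.
/// Cancelled is folded into Up here for simplicity; a real design
/// would likely want a distinct cancel event.
fn touch_to_pointer(phase: TouchPhase, x: f64, y: f64) -> PointerEvent {
    let kind = PointerKind::Touch;
    match phase {
        TouchPhase::Started => PointerEvent::Down { x, y, kind },
        TouchPhase::Moved => PointerEvent::Move { x, y, kind },
        TouchPhase::Ended | TouchPhase::Cancelled => PointerEvent::Up { x, y, kind },
    }
}
```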
Another more sophisticated layer is to support multi-touch well. That has potentially far-reaching changes, as multiple widgets can be active at the same time. A careful design is needed.
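One way to think about "multiple widgets active at the same time" is per-touch pointer capture. This is a sketch under the assumption that each touch is captured by the widget it went down on; `WidgetId` and the hit-test input are hypothetical stand-ins.

```rust
use std::collections::HashMap;

// Hypothetical widget identifier, standing in for a real widget tree id.
type WidgetId = u32;

/// Each active touch is "captured" by the widget it went down on, so
/// several widgets (e.g. two sliders under two fingers) can be active
/// simultaneously.
#[derive(Default)]
struct CaptureMap {
    captures: HashMap<u64, WidgetId>, // touch id -> capturing widget
}

impl CaptureMap {
    fn on_down(&mut self, touch_id: u64, hit_widget: WidgetId) {
        self.captures.insert(touch_id, hit_widget);
    }

    /// Moves are routed to the capturing widget rather than re-hit-tested,
    /// so a drag keeps driving the same slider even if the finger strays
    /// outside its bounds.
    fn route_move(&self, touch_id: u64) -> Option<WidgetId> {
        self.captures.get(&touch_id).copied()
    }

    fn on_up(&mut self, touch_id: u64) -> Option<WidgetId> {
        self.captures.remove(&touch_id)
    }
}
```

The careful-design questions start here: what happens when two touches land on the same widget, and how capture interacts with disabled or removed widgets.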
A third, still more sophisticated layer is gesture disambiguation and recognition. A touch-drag gesture on a slider widget should move the slider, but on mobile, the same gesture on anything that is not a suitably sensitive widget should be interpreted as a scroll. There may be platform-specific gesture recognition code we want to call into, or we may want a pure-Rust cross-platform implementation of all this, tunable to feel native on each platform.
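The drag-vs-scroll decision could be sketched as a tiny per-touch recognizer. This is an illustrative sketch, not any platform's actual algorithm: until the touch moves past a "slop" threshold the gesture stays ambiguous; past it, it resolves to a drag if the widget under the touch handles drags (e.g. a slider), otherwise to a scroll. The threshold value and the `widget_handles_drag` flag are assumptions standing in for platform tuning and a real hit-test.

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum Gesture { Ambiguous, Drag, Scroll }

/// Per-touch gesture recognizer sketch with a distance "slop" threshold.
struct Recognizer {
    start: (f64, f64),
    slop: f64, // platform-tunable threshold in logical pixels
    state: Gesture,
}

impl Recognizer {
    fn new(start: (f64, f64), slop: f64) -> Self {
        Recognizer { start, slop, state: Gesture::Ambiguous }
    }

    /// Feed a move sample; `widget_handles_drag` stands in for a
    /// hit-test against a drag-sensitive widget such as a slider.
    fn on_move(&mut self, pos: (f64, f64), widget_handles_drag: bool) -> Gesture {
        if self.state == Gesture::Ambiguous {
            let (dx, dy) = (pos.0 - self.start.0, pos.1 - self.start.1);
            if (dx * dx + dy * dy).sqrt() > self.slop {
                self.state = if widget_handles_drag { Gesture::Drag } else { Gesture::Scroll };
            }
        }
        self.state
    }
}
```

Making this feel native would mean tuning the slop (and likely adding time-based rules) per platform.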
There may also be improvements desired on the winit side. I see a bunch of PRs to add pen support (https://github.com/rust-windowing/winit/pull/2647 is the most recent) but they seem to be stuck in review. https://github.com/rust-windowing/winit/issues/99 seems to be the main tracking issue there.
To be clear, I think we should have the simplest layer first, so we can unblock simple UI interactions on mobile, then carefully design what we want to do for the other two.
How much progress have we made on the Android side of this?
@PoignardAzur
Winit has merged its unified PointerEvent interface upstream, which will ship in 0.31.0; at my request, click counts on winit PointerEvents are also slated for 0.31.0.
I think at this point it'd be best to just keep what we have, and when we move to winit 0.31.0 we can make things more elegant (and delete a bunch of code).
We still have to work out properly intercepting PointerEvents from touch (and maybe pen) devices for gestures. On the web, the UA handles this under the hood. Handling touch gesture recognition properly will still involve retaining the full history of a touch (timestamp, position, and force) between Start and End/Cancel.
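Retaining that history could look something like the following sketch, with hypothetical types: each active touch keeps every sample from Start until End/Cancel, at which point the full history is handed to a recognizer (to compute velocity, path, pressure curves, and so on).

```rust
use std::collections::HashMap;

// Hypothetical sample type: the fields named in the discussion above.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Sample {
    time_ms: u64,
    pos: (f64, f64),
    force: Option<f64>, // not all devices report force
}

/// Per-touch history retained between Start and End/Cancel.
#[derive(Default)]
struct TouchHistory {
    active: HashMap<u64, Vec<Sample>>, // keyed by the platform's touch id
}

impl TouchHistory {
    fn start(&mut self, id: u64, s: Sample) {
        self.active.insert(id, vec![s]);
    }

    fn moved(&mut self, id: u64, s: Sample) {
        if let Some(history) = self.active.get_mut(&id) {
            history.push(s);
        }
    }

    /// On End or Cancel, hand the full history to the recognizer and
    /// drop it from the active set.
    fn end(&mut self, id: u64) -> Option<Vec<Sample>> {
        self.active.remove(&id)
    }
}
```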
There is some consensus that a PointerEvent-style API (like the one that existed in Glazier) is probably a good way to handle touch and mouse in general. This is a tracking issue for that topic.
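For a rough idea of the shape of such an API: the web's PointerEvent model uses one event type for mouse, touch, and pen, distinguished by a pointer type and a per-contact id. The names below are hypothetical, not Glazier's or winit 0.31's actual types.

```rust
// Sketch of a web-style unified pointer API; all names are assumptions.
#[derive(Debug, Clone, Copy, PartialEq)]
enum PointerType { Mouse, Touch, Pen }

#[derive(Debug, Clone, Copy, PartialEq)]
struct PointerEvent {
    pointer_id: u64,           // stable per contact/device while active
    pointer_type: PointerType,
    pos: (f64, f64),
    click_count: u8,           // e.g. 2 for a double click
}

/// Widgets handle every input device through one code path; they can
/// still branch on pointer_type where behavior should differ.
fn describe(ev: &PointerEvent) -> String {
    let kind = match ev.pointer_type {
        PointerType::Mouse => "mouse",
        PointerType::Touch => "touch",
        PointerType::Pen => "pen",
    };
    format!("{} pointer {} at ({}, {})", kind, ev.pointer_id, ev.pos.0, ev.pos.1)
}
```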