Just looking at some existing things:
* On web platforms specifically, `click` events are tied to keyboard, touch, pen, and mouse behavior. Other mouse events (e.g. `mouseup`) can only be triggered by the mouse (not touch).
* A `click` event is fired if Enter is pressed or Space is released while a focusable element has focus, a touch release is registered within a specific time frame, or the left mouse button is released.
* To unify the different types of input and avoid adding separate listeners for each of them, there's currently a [W3C proposal](https://www.w3.org/TR/pointerevents/) for _pointer_ events that combines all pointing-device input (pen/mouse/touch) into one set of events. For example, `onpointerdown` fires whenever either `onmousedown` or `ontouchstart` would.
* React Native supports pointer events experimentally: https://reactnative.dev/blog/2022/12/13/pointer-events-in-react-native
* Since Qt 5, there has been pointer-event support alongside raw mouse events. Qt's native button component, `QPushButton`, has a `clicked` signal that acts similarly to how `onclick` works in browsers.
* egui's `clicked()` method seems to respond to touch events.
* Neither Xilem nor Druid supports touch yet, but Iced has a `pressed` event that fires from both touch and mouse.
* Slint uses pointer events under the hood for its `TouchArea` element: https://slint-ui.com/releases/1.0.2/docs/slint/src/builtins/structs.html#pointerevent
* [GTK's `clicked` signal](https://docs.gtk.org/gtk4/signal.Button.clicked.html) works similarly to the previous cases and is derived from the button's `activate` signal.
* SwiftUI's buttons take an `action` argument that fires whenever the button is clicked or pressed.
Generally, from what I've seen:
* Most toolkits tend to keep events specifically labeled as "mouse" events mouse-only. E.g. `onmousedown` can't be triggered from touch.
* Some also have more general "pointer" events that handle mouse, touch, and pen simultaneously, so people don't have to maintain separate listeners for each type of input (see the sketch after this list).
* The "click" event (sometimes called a `pressed` or `action` event) isn't specific to mouse input and can be triggered in a few ways.
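
To make the pointer-event idea concrete, here's a minimal browser-side sketch (plain DOM APIs in TypeScript; the `button` selector is just an example): a single `pointerdown` listener replaces the separate mouse and touch listeners, and `pointerType` still tells you which device produced the press.

```ts
// Before: two listeners that have to be kept in sync.
// button.addEventListener("mousedown", onPress);
// button.addEventListener("touchstart", onPress);

// After: one listener that covers mouse, touch, and pen input.
const button = document.querySelector<HTMLButtonElement>("button");

button?.addEventListener("pointerdown", (event: PointerEvent) => {
  // pointerType reports which kind of device produced the event.
  console.log(`pressed via ${event.pointerType}`); // "mouse" | "touch" | "pen"
});
```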
Nice, thanks for this explanation 😄 💯
Pointer events are the way to go IMO. I had a look at them a few months ago and I liked them
Create a new set of events that unify the mouse and touch events
More info: https://www.w3.org/TR/pointerevents/
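
For reference, this is roughly what the linked spec layers on top of the existing mouse events, written as an abridged TypeScript-style summary (the real DOM interface is the built-in `PointerEvent`; the field list is shortened here):

```ts
// Abridged sketch of the fields the Pointer Events spec adds on top of
// MouseEvent; see the spec linked above for the full interface.
interface PointerEventSketch extends MouseEvent {
  pointerId: number;                      // unique id per active pointer
  pointerType: "mouse" | "pen" | "touch"; // which device produced the event
  isPrimary: boolean;                     // primary pointer in a multi-touch interaction
  pressure: number;                       // 0..1; hardware without pressure reports 0.5 while pressed
  width: number;                          // contact geometry (CSS pixels)
  height: number;
  tiltX: number;                          // pen tilt in degrees
  tiltY: number;
}

// Event names defined by the spec: pointerdown, pointerup, pointermove,
// pointercancel, pointerover, pointerout, pointerenter, pointerleave,
// gotpointercapture, lostpointercapture.
```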