At Figma, most of our users pan and zoom the canvas inside our app using trackpad gestures. But we would like to offer a separate UI affordance for pan/zoom that only appears for users who don't have a trackpad. Or, to put that in more generic terms, it would be useful to be able to write a media query that determines whether the user's input device supports pan/zoom gestures. There doesn't appear to be an existing query that reliably covers this scenario.
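For context on why a media query is needed: the closest runtime workaround I'm aware of is that browsers report trackpad pinch as a `wheel` event with `ctrlKey` set, so you can only detect trackpad capability *after* the user has already gestured, and Ctrl+scroll on a mouse produces a false positive. A minimal sketch of that detection (hypothetical helper, not Figma's actual code):

```javascript
// Sketch of the after-the-fact workaround (assumed helper, not Figma's code).
// Browsers synthesize a 'wheel' event with ctrlKey === true for trackpad
// pinch, but a mouse user holding Ctrl while scrolling looks identical,
// and nothing fires at all until the user actually gestures.
function isPinchWheelEvent(event) {
  return event.type === 'wheel' && event.ctrlKey === true;
}

// In a real app you would attach this to the canvas, e.g.:
// canvas.addEventListener('wheel', (e) => {
//   if (isPinchWheelEvent(e)) hideFallbackPanZoomControls();
// });
```

This is exactly the kind of heuristic a declarative query would replace: it is reactive, ambiguous, and cannot style the initial page load.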
I’m assuming that these would be values of a more general gesture query. Perhaps this could be generalized to the number of fingers/points that can be tracked distinctly when recognizing gestures:
0: only taps or clicks and movements are recognized, no gestures
1: dragged mouse gestures as pioneered by Opera, also made with a stylus or a single finger
2: common finger gestures on touchscreens and trackpads, popularized by the original iPhone, including the requested pinch gestures
3+: mostly swipe gestures as variants of 2-point gestures
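If a feature along these lines existed, usage for the Figma scenario above might look like this (entirely hypothetical syntax: `gestures` is not defined in any current spec, and the class name is invented for illustration; the range form assumes it would be specified as a range-type feature per Media Queries Level 4):

```css
/* Hypothetical: hide the fallback pan/zoom buttons when 2-point
   pinch gestures are available (trackpad or touchscreen). */
@media (gestures >= 2) {
  .pan-zoom-controls { display: none; }
}

/* Hypothetical: show them for click-only input (level 0). */
@media (gestures: 0) {
  .pan-zoom-controls { display: block; }
}
```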