icculus opened 1 year ago
I don't have a plan for this yet, and don't know what the OS-level interfaces look like. But at least on Android and iOS, it makes sense to report these as they come in at the system level, maybe as events?
I don't think we should try to emulate this on other platforms, by building our own on top of multitouch or mouse events or whatever. Those platforms just won't get the events (or any API will just return SDL_Unsupported()).
Let's discuss.
In #6758, @cblc said:
Do you think there would be a chance of accepting a PR for this in SDL2, or would it have to be postponed for SDL3? I wrote my own (Mac-only) implementation of this for wxWidgets a long time ago (before they supported it), and I think I could contribute a PR if there's consensus on how the API should look. I have no idea how this works on Windows or Linux, though.
I don't know if we would backport to SDL2 (but for a very simple interface: maybe). This would generally be intended for SDL3.
We definitely will accept a pull request (as long as it doesn't use any code copied from wxWidgets, of course), but I would definitely give a quick idea of what the interface would look like, and let people discuss here, before writing any code at all.
BTW, there was something for macOS here: https://github.com/whisthq/SDL/pull/4 and https://github.com/whisthq/SDL/pull/25/ (from https://github.com/libsdl-org/SDL/issues/7265)
Ooh, I think we have an open issue to sort through this fork, and this might be a good starting point. :)
The rotate gesture would also be very welcome. (To be honest, I often find it a bit uncomfortable: it seems to have a large threshold, so the rotation isn't triggered until you've rotated your fingers further than desired. But it's a standard gesture, so it's desirable to have it as well.)
FYI, I'm the author of #6137 and the author of https://github.com/Android-for-Python/gestures4kivy
For a high-level view, I want to share that package's documentation of the common gesture types and the user interactions on the three platform types: https://github.com/Android-for-Python/gestures4kivy#api
And specifically inferring zoom/rotate:
Android and iOS propagate the component touches. In the context of interaction with other (including undefined) gestures, I think this is the expected approach. The package above infers gestures in Python from the interplay of individual touch events.
The gestures that a user cannot infer with SDL2 are the ones on the Mac; I suggest those should be a priority.
Mobile devices require decoding a larger set of potentially interacting gestures; see the doc above for the lowest-common-denominator list.
Originally posted by @icculus in https://github.com/libsdl-org/SDL/issues/6758#issuecomment-1546928499