minetest / irrlicht

Minetest's fork of Irrlicht

SDL: Implement touchscreen support #262

Closed · grorp closed this 10 months ago

grorp commented 11 months ago

This PR adds touchscreen support to the SDL device. It is a prerequisite for migrating the Android build to SDL.

Requires minetest/minetest#14117 and minetest/minetest#14118 to work correctly.
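
The core of the change is translating SDL's finger events into Irrlicht's touch input events, roughly along these lines (simplified sketch with made-up helper names, not the exact code in this diff):

```cpp
// Simplified sketch of the SDL -> Irrlicht touch event translation
// (illustrative only, not the actual code in this PR).
#include <SDL.h>
#include <irrlicht.h>

static void handleFingerEvent(const SDL_Event &sdlEvent,
		irr::IrrlichtDevice *device, irr::u32 windowW, irr::u32 windowH)
{
	irr::SEvent event;
	event.EventType = irr::EET_TOUCH_INPUT_EVENT;
	// SDL reports normalized [0, 1] coordinates, Irrlicht expects pixels.
	event.TouchInput.X = (irr::s32)(sdlEvent.tfinger.x * windowW);
	event.TouchInput.Y = (irr::s32)(sdlEvent.tfinger.y * windowH);
	event.TouchInput.ID = (size_t)sdlEvent.tfinger.fingerId;

	switch (sdlEvent.type) {
	case SDL_FINGERDOWN:
		event.TouchInput.Event = irr::ETIE_PRESSED_DOWN;
		break;
	case SDL_FINGERUP:
		event.TouchInput.Event = irr::ETIE_LEFT_UP;
		break;
	case SDL_FINGERMOTION:
		event.TouchInput.Event = irr::ETIE_MOVED;
		break;
	default:
		return;
	}

	device->postEventFromUser(event);
}
```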

To do

This PR is ready for review.

Compared to CIrrDeviceLinux on the same device, I experience two problems:

  1. SDL_FINGERDOWN events are delayed: they only arrive after a short delay, or at the latest together with the corresponding SDL_FINGERUP event

  2. There are still emulated mouse events even though I disabled them (will be fixed by minetest/minetest#14118)

Since issue no. 1 only happens on my Linux system, I have no idea how to fix it, and it is not a dealbreaker anyway, I'm declaring this PR ready regardless.

How to test

Buy a Linux/Windows/macOS device with a touchscreen.

Compile Minetest with USE_SDL2=TRUE and ENABLE_TOUCH=TRUE. Verify that everything works like on Android.

grorp commented 11 months ago

There are still emulated mouse events even though I disabled them

Hmm, this is fixed if I comment out the SDL_SetRelativeMouseMode calls (but I can't do that for obvious reasons). Looks like SDL starts emulating mouse events anyway once relative mouse mode is enabled?
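
For reference, here is roughly the standalone test I'm using to see which events still arrive once relative mouse mode is on (plain SDL2, independent of Irrlicht and not part of this PR):

```cpp
// Minimal standalone SDL2 test: disable touch<->mouse emulation via hints,
// enable relative mouse mode, and log which events still arrive.
#include <SDL.h>
#include <cstdio>

int main()
{
	SDL_SetHint(SDL_HINT_TOUCH_MOUSE_EVENTS, "0"); // no mouse events from touch
	SDL_SetHint(SDL_HINT_MOUSE_TOUCH_EVENTS, "0"); // no touch events from mouse

	SDL_Init(SDL_INIT_VIDEO);
	SDL_Window *window = SDL_CreateWindow("touch test",
			SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
			640, 480, SDL_WINDOW_SHOWN);

	// This is the call that seems to re-enable mouse event emulation.
	SDL_SetRelativeMouseMode(SDL_TRUE);

	bool running = true;
	while (running) {
		SDL_Event e;
		while (SDL_PollEvent(&e)) {
			switch (e.type) {
			case SDL_FINGERDOWN:      std::printf("SDL_FINGERDOWN\n");      break;
			case SDL_FINGERUP:        std::printf("SDL_FINGERUP\n");        break;
			case SDL_MOUSEMOTION:     std::printf("SDL_MOUSEMOTION\n");     break;
			case SDL_MOUSEBUTTONDOWN: std::printf("SDL_MOUSEBUTTONDOWN\n"); break;
			case SDL_QUIT: running = false; break;
			}
		}
		SDL_Delay(10);
	}

	SDL_DestroyWindow(window);
	SDL_Quit();
	return 0;
}
```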

grorp commented 11 months ago

I attached the same external touchscreen to a Windows device. Issue 1 wasn't present there, but issue 2 was.

sfan5 commented 11 months ago

Hmm, this is fixed if I comment out the SDL_SetRelativeMouseMode calls (but I can't do that for obvious reasons).

I'm thinking we have to transparently disable all touch support as soon as a mouse move or click is detected and re-enable if touch input is detected. (essentially what I'm expecting the engine to do too once it supports both)

rubenwardy commented 11 months ago

Let's distinguish between touch in the GUI and touch controls. It should be possible to use touch like a mouse in the GUI without enabling touch controls in-game.

But if you use touch when menus are closed, then it should automatically enable touch controls. And it should disable them if you use a mouse or a gamepad.

Sounds like disabling touch emulates mouse input, which is what you want

grorp commented 11 months ago

I'm thinking we have to transparently disable all touch support as soon as a mouse move or click is detected and re-enable if touch input is detected. (essentially what I'm expecting the engine to do too once it supports both)

I agree, but this isn't related to this PR. Concept-wise, minetest/minetest#14075 seems like a good first step in this direction.

Let's distinguish between touch in gui and touch controls. It should be possible to use the touch like a mouse in the GUI without enabling touch controls in-game.

I agree, and this would be a very simple change in Minetest. However, this is also not related to this PR. This PR just makes the SDL device support touch events the same way the Android and Linux devices already do. Whether Minetest uses these touch events or not, and how it uses them, is a different topic.

Sounds like disabling touch emulates mouse input, which is what you want

In case you're referring to my problem no. 1: No, it isn't what I want, as it results in all touch events being processed twice.

After setting SDL_HINT_TOUCH_MOUSE_EVENTS and SDL_HINT_MOUSE_TOUCH_EVENTS to 0, the only case where there are still emulated mouse events is when relative mouse mode is enabled. Since relative mouse mode isn't needed by the touchscreen controls anyway, we can probably just disable it when touchscreen controls are enabled.
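
In code, that workaround amounts to a guard like this (sketch only; `touch_controls_active` is a placeholder for however the engine knows it is in touchscreen mode, not an existing flag):

```cpp
// Sketch of the workaround: never enable relative mouse mode while
// touchscreen controls are in use. "touch_controls_active" is a
// placeholder, not an existing engine flag.
#include <SDL.h>

void setRelativeMouseModeGuarded(bool wanted, bool touch_controls_active)
{
	// Relative mouse mode is only useful for mouse-based camera control;
	// with touchscreen controls it only triggers SDL's mouse emulation.
	SDL_SetRelativeMouseMode(
			(wanted && !touch_controls_active) ? SDL_TRUE : SDL_FALSE);
}
```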

sfan5 commented 11 months ago

I agree, but this isn't related to this PR.

It was meant to be a way to fix the issue.

"touch input" mode -> ignore mouse input, don't ever call SDL_SetRelativeMouseMode "mouse input" mode -> ignore touch input, calling SDL_SetRelativeMouseMode is okay

but since Irrlicht does not have an on/off switch for which kind of input we want, we can switch between these modes depending on what input was last used.
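
Roughly like this (made-up sketch, not code from this PR; SDL tags mouse events it synthesizes from touches with SDL_TOUCH_MOUSEID, so those can be dropped without flipping the mode):

```cpp
// Sketch: remember which kind of input was last used and keep relative
// mouse mode off while touch input is active. All names are made up.
#include <SDL.h>

enum class InputMode { Mouse, Touch };
static InputMode active_mode = InputMode::Mouse;

// Returns true if the event should be dropped instead of being forwarded.
static bool filterInputEvent(const SDL_Event &e)
{
	switch (e.type) {
	case SDL_FINGERDOWN:
	case SDL_FINGERUP:
	case SDL_FINGERMOTION:
		if (active_mode != InputMode::Touch) {
			active_mode = InputMode::Touch;
			// In touch mode, relative mouse mode must stay off.
			SDL_SetRelativeMouseMode(SDL_FALSE);
		}
		return false;
	case SDL_MOUSEMOTION:
		if (e.motion.which == SDL_TOUCH_MOUSEID)
			return true; // synthesized from a touch, ignore
		active_mode = InputMode::Mouse;
		return false;
	case SDL_MOUSEBUTTONDOWN:
	case SDL_MOUSEBUTTONUP:
		if (e.button.which == SDL_TOUCH_MOUSEID)
			return true; // synthesized from a touch, ignore
		active_mode = InputMode::Mouse;
		return false;
	default:
		return false;
	}
}
```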

grorp commented 11 months ago

but since Irrlicht does not have an on-off switch on which kind of input we want, we can switch between these modes depending on what input was last used.

We already have a much easier way to know which mode to use: the ENABLE_TOUCH build flag in Minetest. Once touchscreen mode is changeable at runtime, we can of course use whatever runtime detection we have for the SDL_SetRelativeMouseMode case as well.

numberZero commented 11 months ago

the ENABLE_TOUCH build flag in Minetest

...which is of course not available in Irrlicht.

grorp commented 11 months ago

...which is of course not available in Irrlicht.

...which is why I put the fix into Minetest: minetest/minetest#14118

sfan5 commented 10 months ago
  1. SDL_FINGERDOWN events are delayed: they only arrive after a short delay, or at the latest together with the corresponding SDL_FINGERUP event

fixed yet?

grorp commented 10 months ago

fixed yet?

No, but I've done further testing:

|                 | CIrrDeviceLinux | CIrrDeviceSDL | CIrrDeviceWindows |
|-----------------|-----------------|---------------|-------------------|
| Linux (X11)     | bug present, but hidden by another bug* | bug present | n/a |
| Linux (Wayland) | bug not present | bug not present | n/a |
| Windows         | n/a | bug not present | n/a (no touch support) |

* An EMIE_MOUSE_MOVED event arrives before the delayed ETIE_PRESSED_DOWN event. Pressed buttons first enter the "hovered" state, then (after the delay) the "pressed" state. This makes the delay less noticeable.

The fact that an EMIE_MOUSE_MOVED event occurs is a bug, as no mouse movement has happened.

Since the bug is already present with CIrrDeviceLinux (on Linux/X11) and never present on Linux/Wayland, this looks like an X11 input issue. I doubt I can do anything about it, because the events already arrive delayed from SDL.

So I'd consider this PR ready and the bug a "wontfix".