adventuregamestudio / ags

AGS editor and engine source code

Enhancement: Proper touch input support #1538

Open ericoporto opened 2 years ago

ericoporto commented 2 years ago

AGS assumes that the player uses a mouse to play the game. These days there is a myriad of devices that support touch input and are used quite frequently; notably, our mobile ports should be able to run on such devices. As developers mature they will need to cater specifically to the constraints of those devices. The assumption of a mouse that can click makes some control schemes challenging to support on touch-first devices.

These are drawn from my observations of other engines, which I wrote about here: https://ericonotes.blogspot.com/2020/11/a-quick-look-at-touch-handling-apis-in.html

Such API additions would allow for multi-touch in GUIs, which can be used to better handle mobile devices and make the interface respond more quickly (today, putting two fingers on the screen results in one of them being dismissed), and would also allow support for on-screen joysticks.

Forum topic

On Screen Joystick example

Using AGS GUIs, it should be possible to construct something like the example below. Right now it isn't possible, because using two or more fingers at the same time is not supported.

(touch joystick example image)

Note

https://youtu.be/B_IqYy4T_AA?si=xOASxzLCrI0F1lV8&t=916

In this talk, the Broken Sword developer talks about their recent mobile port, and it's interesting how the interface was adapted.

ivan-mogilko commented 2 years ago

Just for a quick reference on SDL2 touch control support:

SDL2 touch-related events:
https://wiki.libsdl.org/SDL_TouchFingerEvent
https://wiki.libsdl.org/SDL_MultiGestureEvent
https://wiki.libsdl.org/SDL_DollarGestureEvent


There are a couple of hints that control SDL2's behavior when it comes to simulating touch and mouse through each other. Some hints are missing from the SDL2 wiki, but a brief description of each can be found in the source code (https://github.com/libsdl-org/SDL/blob/main/include/SDL_hints.h):

A variable controlling whether mouse events should generate synthetic touch events

https://github.com/libsdl-org/SDL/blob/1fc7f681187f80ccd6b9625214b47db665cd9aaf/include/SDL_hints.h#L1146-L1150

A variable controlling whether touch events should generate synthetic mouse events

https://github.com/libsdl-org/SDL/blob/1fc7f681187f80ccd6b9625214b47db665cd9aaf/include/SDL_hints.h#L1514-L1522

I guess SDL_HINT_TOUCH_MOUSE_EVENTS is the one that matters more for mobile devices. If AGS gets its own proper touch API in script, then this hint should likely be disabled for games that use it, and enabled for games that don't.

When receiving mouse events you can distinguish a real mouse from an emulated one using the "event.button.which" parameter, which holds the mouse ID. For an emulated mouse the ID equals SDL_TOUCH_MOUSEID.


SDL2's implementation of synthetic mouse events can be found in the following code (in the latest version): https://github.com/libsdl-org/SDL/blob/main/src/events/SDL_touch.c under the SYNTHESIZE_TOUCH_TO_MOUSE macro test, e.g. as of a recent commit: https://github.com/libsdl-org/SDL/blob/970344719a958460aadd73342a2b0524981a59f4/src/events/SDL_touch.c#L266

ericoporto commented 2 years ago

Stranga pinged me again about this today, I mentioned I believe this is something for a 3.6.1 release.

ericoporto commented 1 year ago

So in AGS, currently, a mouse click only reaches on_mouse_click when nothing is blocking; when something is blocking, the click goes through the skipping logic instead. This is because on_mouse_click (and any event in AGS!) can't run while something is blocking (similar to repeatedly_execute as opposed to repeatedly_execute_always). This means that whatever API is added, we also need to add touch as a way to skip everything that can be skipped (video, cutscene, speech, ...); similar to a mouse click, a touch anywhere on screen should cause the skip.
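For example, a cutscene that is currently made skippable with a mouse click, like in this snippet of the existing script API, would also need a touch anywhere on screen to count as that click:

StartCutscene(eSkipMouseClick);
  // ... blocking actions that play out the cutscene ...
EndCutscene();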

ericoporto commented 6 months ago

Trying to come up with a minimalistic approach

builtin struct Pointer {
  /// Number of pointers, this is a fixed amount
  readonly import static attribute int Count;      // $AUTOCOMPLETESTATICONLY$
  /// Takes pointer ID and returns where the pointer is in game screen, (-1,-1) if invalid
  readonly import static attribute Point* Position[];      // $AUTOCOMPLETESTATICONLY$
  /// Takes pointer ID and returns true if the pointer is pressed, the finger is on screen or left mouse button is down
  readonly import static attribute bool IsDown[];       // $AUTOCOMPLETESTATICONLY$
};

Here's how it works

Here is its initial version: https://github.com/ericoporto/ags/tree/experimental-pointer-api
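Reading it from script could look something like this (just a sketch of the intended usage; lblDebug is an assumed GUI label, and I'm assuming the built-in Point type exposes x/y):

function repeatedly_execute_always()
{
  String info = "";
  for (int i = 0; i < Pointer.Count; i++)
  {
    if (Pointer.IsDown[i])  // finger on screen or left mouse button down
    {
      Point* p = Pointer.Position[i];
      info = info.Append(String.Format("[%d] %d,%d ", i, p.x, p.y));
    }
  }
  lblDebug.Text = info;  // show all pressed pointers for debugging
}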

ivan-mogilko commented 6 months ago

I'm concerned about the bare "Pointer" name, this term has many uses in programming. Are there other alternatives to this? If not, perhaps adding something to it may clarify the purpose. A quick example: "PointerDevice".

ericoporto commented 6 months ago

I agree it's a terrible name; could also go with "TouchPoint", "Interaction" or "TouchInput".

I would like to somehow have mouse input be one of the entries there, just to make it easier to iterate when testing game script code in the AGS Editor.

I also thought about the API being something like System.TouchPoint[], returning an object that is just data, like Point is, so that holding a reference to it isn't a problem. I think I would need to create my own ScriptUserObject like Point - something like ScriptStructHelpers::CreateTouchPoint(int x, int y, int id, bool down) - but I need to really think it through; it looks a bit problematic if I later need to add something to it.

ericoporto commented 6 months ago

My first test of the thing

https://ericoporto.github.io/public_html/382d947/

It also looks like my screen position calculation is completely wrong.

ericoporto commented 6 months ago

I renamed it to touch points (I haven't renamed the files yet, but will eventually)

managed struct TouchPoint {
    int ID, X, Y;
    bool IsDown;
};

builtin struct Touch {
  /// Number of pointers, this is a fixed amount
  readonly import static attribute int TouchPointCount;      // $AUTOCOMPLETESTATICONLY$
  /// Takes pointer ID and returns where the pointer is in game screen, (-1,-1) if invalid
  readonly import static attribute TouchPoint* TouchPoint[];      // $AUTOCOMPLETESTATICONLY$
};
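For instance, with this shape a script could track several fingers at once (a sketch assuming the API above; lblInfo is an assumed GUI label, just for debugging):

function repeatedly_execute_always()
{
  int fingersDown = 0;
  int firstX = -1;
  int firstY = -1;
  for (int i = 0; i < Touch.TouchPointCount; i++)
  {
    TouchPoint* tp = Touch.TouchPoint[i];
    if (tp.IsDown)
    {
      if (fingersDown == 0) { firstX = tp.X; firstY = tp.Y; }
      fingersDown++;
    }
  }
  lblInfo.Text = String.Format("%d finger(s) down, first at %d,%d", fingersDown, firstX, firstY);
}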

I still can't figure out the screen position calculation. I'm trying to pick up things from mousew32, because I want the same position one gets from the mousex and mousey globals. Edit: something like https://github.com/ericoporto/ags/commit/20c6bb7159f1679700ab8e490a0e35950cbadd54

Edit: tested on a few devices and it almost works, except it's not being clamped to the game borders. Edit: https://github.com/ericoporto/ags/commit/7c1febc48e3257a26f8de5260f4b998bfd297d87 fixed it!

ericoporto commented 6 months ago

Hey, I would like to try adding multi-touch support to GUIs... but... I can't even figure out how they are clicked at all.

https://github.com/adventuregamestudio/ags-template-source/blob/c9f8c8ebde194ad026d095a7e6a30ef6fdd10fe5/Empty%20Game/GlobalScript.asc#L69-L85

There isn't anything for GUIs in the Global Script - does it happen through some internals?

I know buttons can be held down while clicking, and the event that we normally use happens on release. With multi-touch, I would like to keep that working, and also to add some event that triggers continually while the button is held.
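For reference, this is the kind of handler we get today: it only fires when the click is released over the button (the usual editor-wired signature; btnPlay is just an example name):

function btnPlay_OnClick(GUIControl *control, MouseButton button)
{
  // fires once, on release of the click over the button
}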

Ah, I think I need to implement some variant of this for multi-touch:

https://github.com/adventuregamestudio/ags/blob/466acb5ec274f4cef5f809a304c81e7922f95051/Engine/main/game_run.cpp#L351

ivan-mogilko commented 6 months ago

Right, GUIs are not handled in script at all; this is done purely by polling GUI state: see GUIMain::Poll, where it decides which controls are under the mouse and which events to send (mouse over, pushed, unpushed, etc).

EDIT: I did not think about this earlier, but I'd assume that touches may be seen as extra "mouse" devices. So instead of checking a single mouse device, the engine should have a list of devices (or rather a list of per-device state, including coordinates and "button" state), and check all of them in a loop. Similarly, any global variables that track mouse state, such as "was_mouse_on_iface", should be rewritten to support multiple devices. IDK how the game should react if there are multiple touches on buttons though. Maybe that should not be supported, and only the first or the last touch should result in a control activation?...

EDIT2: There's another thing: mouse events such as on_mouse_click currently only include the button as an argument, but that's wrong; they should also include at least the position saved at the time the event was registered. That's important to have on its own (as the mouse may be moving between updates), but even more important with multiple "devices" that may be pressed in different positions during the same update.

In other words, it should be:

on_mouse_click(MouseButton button, int mx, int my)
on_touch_down(? touch_id, int tx, int ty)

or similar.
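In game script such handlers might then look like this (just a sketch of the proposed signatures, which are not final; the type of the touch id is still undecided):

function on_mouse_click(MouseButton button, int mx, int my)
{
  if (button == eMouseLeft)
  {
    Room.ProcessClick(mx, my, eModeInteract);  // use the position captured with the event
  }
}

function on_touch_down(int touch_id, int tx, int ty)
{
  // react to a specific finger going down at (tx, ty)
}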

ericoporto commented 6 months ago

Uhm... I made my TouchPoint like this

managed struct TouchPoint {
    int ID, X, Y;
    bool IsDown;
};

Perhaps if the bool IsDown were instead an enum MouseButton ButtonDown - always either none or left click for fingers, but able to take other buttons for a real mouse - it could work in a generic way.
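For illustration, something like this (a sketch only; a "none" value, e.g. a hypothetical eMouseNone, would have to be added to the MouseButton enum):

managed struct TouchPoint {
    int ID, X, Y;
    MouseButton ButtonDown;  // hypothetical eMouseNone when not pressed, eMouseLeft for a finger
};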

But yeah, the multiple mouse devices approach, even if only internal to AGS, could work well. Game Maker works in this manner.

Because of the nature of AGS being a point-and-click engine, the mouse is integral to its behavior and affects a lot of things. We make a lot of global assumptions around having a single mouse device.

IDK how the game should react if there are multiple touches on buttons though. Maybe that should not be supported, and only the first or the last touch should result in a control activation?...

Yeah, I think so: the first touch on the GUI control marks it as pressed down, and only once the last touch point on top of it is released does the GUI control get released.
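From the script side that rule could be expressed roughly like this (a sketch using the proposed Touch API; not how the engine-side polling would actually be written):

bool IsControlHeld(GUIControl* control)
{
  // the control counts as held while at least one touch point is down over it
  for (int i = 0; i < Touch.TouchPointCount; i++)
  {
    TouchPoint* tp = Touch.TouchPoint[i];
    if (tp.IsDown && GUIControl.GetAtScreenXY(tp.X, tp.Y) == control)
    {
      return true;
    }
  }
  return false;
}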

see GUIMain::Poll, where it decides which controls are under mouse, and which events to send (mouse over, pushed, unpushed etc).

AH, that is the place. In game_run.cpp we call it passing the global mousex and mousey positions. For now I think my approach is to keep doing that, and additionally call it repeatedly in a for loop that goes through all the fingers that are down.

I think when polling it may not be necessary to pass the ID of the finger, but instead to tell it that the press is in the same frame? Meaning that presses in the same frame could only affect the control state once. I still need to think a little more about this.


Edit: it looks like the ID of the finger would only be useful to skip processing of a finger that hasn't moved.

Also, looking more at the code, it looks like highlighting (HighlightCtrl) would still be kept to one control only. There is a focus (FocusCtrl) thing, but it looks like it's not actually implemented currently.

ericoporto commented 6 months ago

Uhm, there is a behavior that makes sense for a mouse but doesn't make sense for touch, which kind of tells me we want different polling for each. It's the click-and-hold that locks the button in the down state.

(click lock demo GIF)

Now in a touch environment this doesn't happen: the button gets released as soon as the finger is no longer on top of it, but with a mouse that is not what's expected. I think this signals that we would have two polling paths, one for mouse and one for touch.

Going back to the script API: cirrus-ci.com/build/5713709612400640 | ericoporto@ags/experimental-touch-api

ivan-mogilko commented 6 months ago

Now in a touch environment this doesn't happen: the button gets released as soon as the finger is no longer on top of it, but with a mouse that is not what's expected. I think this signals that we would have two polling paths, one for mouse and one for touch.

Different behavior is better handled with either a flag that tells how the device should act, or virtual function(s) overridden in a device class. Having multiple polling loops will complicate code organization (and there may be other differences found in the future).