hajimehoshi / ebiten

Ebitengine - A dead simple 2D game engine for Go
https://ebitengine.org
Apache License 2.0

Gesture detection: high-level API for touch input #2566

Open sedyh opened 1 year ago

sedyh commented 1 year ago


What feature would you like to be added?

At the moment, touch input support is fairly low-level. Because of this, most applications shipped as wasm end up with no touch support at all.

I suggest discussing the API for gesture detection based on the example we already have: https://github.com/hajimehoshi/ebiten/blob/main/examples/touch/main.go

It would probably be great to expose this in the same style as we currently check for pressed keys. I'll try to sketch something like this later:

func IsTapping() (event Tap, ok bool) {}
func IsPanning() (event Pan, ok bool) {}
func IsPinching() (event Pinch, ok bool) {}
func IsSwiping() (event Swipe, ok bool) {}
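
For reference, something along these lines can already be prototyped on top of the existing low-level touch API. Here is a minimal, hypothetical sketch of a tap detector; the gesture package, the Detector type and the thresholds are made up for illustration, not a proposed final API:

// Hypothetical package; not part of Ebitengine.
package gesture

import (
    "github.com/hajimehoshi/ebiten/v2"
    "github.com/hajimehoshi/ebiten/v2/inpututil"
)

// Tap is a touch that was pressed and released quickly without moving far.
type Tap struct {
    X, Y int
}

type touchState struct {
    startX, startY int
    lastX, lastY   int
    frames         int
}

// Detector accumulates per-touch state; call Update once per Ebitengine tick.
type Detector struct {
    touches map[ebiten.TouchID]*touchState
    taps    []Tap
}

func NewDetector() *Detector {
    return &Detector{touches: map[ebiten.TouchID]*touchState{}}
}

const (
    maxTapFrames = 20 // ~1/3 s at 60 TPS; tunable
    maxTapMove   = 10 // allowed drift in pixels
)

func (d *Detector) Update() {
    d.taps = d.taps[:0]

    // Start tracking touches that began this tick.
    for _, id := range inpututil.AppendJustPressedTouchIDs(nil) {
        x, y := ebiten.TouchPosition(id)
        d.touches[id] = &touchState{startX: x, startY: y, lastX: x, lastY: y}
    }

    // Update live touches and resolve the ones released this tick.
    for id, t := range d.touches {
        if inpututil.IsTouchJustReleased(id) {
            dx, dy := t.lastX-t.startX, t.lastY-t.startY
            if t.frames <= maxTapFrames && dx*dx+dy*dy <= maxTapMove*maxTapMove {
                d.taps = append(d.taps, Tap{X: t.lastX, Y: t.lastY})
            }
            delete(d.touches, id)
            continue
        }
        // TouchPosition reports (0, 0) once a touch is gone, so remember
        // the last position while the touch is still active.
        t.lastX, t.lastY = ebiten.TouchPosition(id)
        t.frames++
    }
}

// IsTapping mirrors the proposed signature: it reports a tap detected
// in the current tick, if any.
func (d *Detector) IsTapping() (Tap, bool) {
    if len(d.taps) == 0 {
        return Tap{}, false
    }
    return d.taps[0], true
}

Pan, pinch and swipe detection would follow the same pattern of tracking touch state across frames, which is essentially what examples/touch/main.go already does by hand.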

Why is this needed?

This would bring touch support to far more games than have it today.

sedyh commented 1 year ago

Btw, can we upload the touch example to ebitengine.org? Or does it need to be adapted to both touch and mouse?

hajimehoshi commented 1 year ago

Btw, can we upload the touch example to ebitengine.org?

Hmm, this would only work on mobile devices, so I would not like to do that.

Or does it need to be adapted to both touch and mouse?

Yes.

sedyh commented 1 year ago

Hmm, this would only work on mobile devices, so I would not like to do that.

What about adapting it to work with any input, i.e. both mouse and touch? I'm not sure, but it can be confusing for beginners that the repository has many more examples than the site does.

hajimehoshi commented 1 year ago

IMO, having a wrapper over mouse and touch input would cause confusion of its own. I'd like to keep Ebitengine's API as primitive as possible.

tinne26 commented 1 year ago

I think a lot of work and study is required before attempting to move this forward. Here are some concerns:

Intuitively, any practical API would likely unify mouse and touch (probably even different parts of both, e.g. linking both the mouse wheel and touch pinch to a zooming action; see the sketch at the end of this comment) and take platform/user configuration (sensitivity and delays) into account. But even the simplest practical models end up at the higher abstraction of actions, which falls somewhat outside the scope of Ebitengine.

If user configuration can be obtained and exposed independently, that may be all Ebitengine needs to expose. Otherwise, it may be impossible to write a high-quality third-party gesture detection library that doesn't fundamentally duplicate most of what Ebitengine is already doing. So we should start by looking into that first, to determine what limitations a third-party implementation would have.
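
As a rough illustration of the kind of unification described above, a third-party helper could fold the mouse wheel and a two-finger pinch into a single zoom delta. The Zoomer type and the 10%-per-wheel-unit factor below are arbitrary assumptions, not anything Ebitengine provides:

// Hypothetical helper; not part of Ebitengine.
package gesture

import (
    "math"

    "github.com/hajimehoshi/ebiten/v2"
)

// Zoomer folds mouse wheel and two-finger pinch into one zoom factor.
type Zoomer struct {
    prevPinchDist float64
}

// Delta returns a multiplicative zoom factor for the current tick
// (1.0 means no change). Call it once per Update.
func (z *Zoomer) Delta() float64 {
    // Mouse wheel: each wheel unit scales by an arbitrary 10%.
    if _, wy := ebiten.Wheel(); wy != 0 {
        return math.Pow(1.1, wy)
    }

    // Touch pinch: compare the distance between the first two active
    // touches with its value from the previous tick.
    ids := ebiten.AppendTouchIDs(nil)
    if len(ids) < 2 {
        z.prevPinchDist = 0
        return 1.0
    }
    x0, y0 := ebiten.TouchPosition(ids[0])
    x1, y1 := ebiten.TouchPosition(ids[1])
    dist := math.Hypot(float64(x1-x0), float64(y1-y0))
    if z.prevPinchDist == 0 || dist == 0 {
        z.prevPinchDist = dist
        return 1.0
    }
    factor := dist / z.prevPinchDist
    z.prevPinchDist = dist
    return factor
}

The sensitivity constants are exactly the part that depends on platform/user configuration, which is why exposing that configuration (if it can be obtained at all) matters for any such library.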

hajimehoshi commented 10 months ago

We need more investigation.