sedyh opened 1 year ago
Btw, can we upload the touch example to ebitengine.org? Or does it need to be adapted to both touch and mouse?
> Btw, can we upload the touch example to ebitengine.org?

Hmm, this would work only with mobiles, so I would not like to do so.

> Or does it need to be adapted to both touch and mouse?

Yes.

> Hmm, this would work only with mobiles, so I would not like to do so.
What about the case where it works with any input (i.e., we could probably adapt it to use both mouse and touch)? I'm not sure, but the current situation can be confusing for beginners, in the sense that there are many more examples in the repository than on the site.
IMO, having a wrapper for mouse and touch would cause another kind of confusion. I'd like to keep Ebitengine's API as primitive as possible.
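For reference, the kind of wrapper discussed above can already be written in user code on top of the existing public API. Below is a minimal, hypothetical sketch (nothing here is a proposed Ebitengine API): a left mouse click and a new touch are funneled through the same code path, and the `Game` is just a stub that logs presses.

```go
package main

import (
	"fmt"
	"log"

	"github.com/hajimehoshi/ebiten/v2"
	"github.com/hajimehoshi/ebiten/v2/inpututil"
)

// Game demonstrates a tiny "pointer" wrapper: a left mouse click and a new
// touch are reported through the same code path.
type Game struct {
	touchIDs []ebiten.TouchID
}

func (g *Game) Update() error {
	// Mouse: a just-pressed left button counts as a pointer press.
	if inpututil.IsMouseButtonJustPressed(ebiten.MouseButtonLeft) {
		x, y := ebiten.CursorPosition()
		fmt.Println("pointer pressed at", x, y)
	}
	// Touch: every touch that started this frame counts as a pointer press.
	g.touchIDs = inpututil.AppendJustPressedTouchIDs(g.touchIDs[:0])
	for _, id := range g.touchIDs {
		x, y := ebiten.TouchPosition(id)
		fmt.Println("pointer pressed at", x, y)
	}
	return nil
}

func (g *Game) Draw(screen *ebiten.Image) {}

func (g *Game) Layout(outsideWidth, outsideHeight int) (int, int) { return 320, 240 }

func main() {
	if err := ebiten.RunGame(&Game{}); err != nil {
		log.Fatal(err)
	}
}
```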
I think a lot of work and study is required before attempting to move this forward. Here are some concerns:
- Android's GestureDetector and iOS's UIGestureRecognizer (I find the docs of this last one clearer). Any implementation may use those under the hood anyway, so this must be examined in more detail.
- A possible `gestures` subpackage. I am tempted to say that this would be better explored in an existing, third-party input library first, but we still don't know which primitives may be missing for this (e.g. the previously mentioned delay and responsivity configurations).
- Intuitively, it seems to me that any practical API would be likely to unify mouse and touch (probably even different parts of both, like having both mouse wheel and touch pinch be linked to a zooming action) and take into account platform/user configuration (sensitivity and delays). But even the simplest practical models seem to derive into the higher abstraction of actions, which kinda falls outside the scope of Ebitengine (see the sketch after this list).
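To make the "actions" point concrete, here is a rough sketch of such a layer written in user code: a single zoom delta fed by both the mouse wheel and a two-finger pinch. The `zoomAction` type, its method names, and the sensitivity constants are all invented for illustration; they are not existing or proposed Ebitengine API.

```go
package main

import (
	"fmt"
	"log"
	"math"

	"github.com/hajimehoshi/ebiten/v2"
)

// zoomAction turns two very different inputs, the mouse wheel and a
// two-finger pinch, into a single per-frame zoom delta.
type zoomAction struct {
	touchIDs      []ebiten.TouchID
	lastPinchDist float64
}

// delta returns how much to zoom this frame (positive means zoom in).
func (z *zoomAction) delta() float64 {
	// Mouse wheel: the vertical wheel offset maps directly to zoom.
	_, wheelY := ebiten.Wheel()
	d := wheelY * 0.1 // arbitrary sensitivity constant

	// Touch pinch: with exactly two active touches, zoom by the change in
	// the distance between them since the previous frame.
	z.touchIDs = ebiten.AppendTouchIDs(z.touchIDs[:0])
	if len(z.touchIDs) == 2 {
		x0, y0 := ebiten.TouchPosition(z.touchIDs[0])
		x1, y1 := ebiten.TouchPosition(z.touchIDs[1])
		dist := math.Hypot(float64(x1-x0), float64(y1-y0))
		if z.lastPinchDist != 0 {
			d += (dist - z.lastPinchDist) * 0.01 // arbitrary sensitivity constant
		}
		z.lastPinchDist = dist
	} else {
		z.lastPinchDist = 0
	}
	return d
}

type Game struct {
	zoom   zoomAction
	factor float64
}

func (g *Game) Update() error {
	if g.factor == 0 {
		g.factor = 1
	}
	if d := g.zoom.delta(); d != 0 {
		g.factor *= 1 + d
		fmt.Println("zoom factor:", g.factor)
	}
	return nil
}

func (g *Game) Draw(screen *ebiten.Image) {}

func (g *Game) Layout(outsideWidth, outsideHeight int) (int, int) { return 320, 240 }

func main() {
	if err := ebiten.RunGame(&Game{}); err != nil {
		log.Fatal(err)
	}
}
```

Even this toy version already has to make policy decisions (sensitivity constants, what to do when a third finger lands), which is exactly the kind of configuration the comment above says falls outside Ebitengine's scope.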
If user configuration can be obtained and exposed independently, that may be all Ebitengine needs to expose. Otherwise, it may be impossible to provide a third-party, high-quality gesture detection library that doesn't fundamentally duplicate most of what Ebitengine is already doing. So we should start by looking into that first, so we can determine what limitations a third-party implementation would have.
We need more investigation
Operating System
What feature would you like to be added?
At the moment, touch support is fairly low-level. Because of this, most applications released as wasm end up without touch support.
I suggest discussing the API for gesture detection based on the example we already have: https://github.com/hajimehoshi/ebiten/blob/main/examples/touch/main.go
It would probably be great to call it in the same format as we currently check pressed keys. I'll try to put together something like this later:
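Purely as an illustration of "the same format as pressed keys", here is one possible user-level shape of such a check, built only on existing Ebitengine APIs. The `tapDetector` type and `justTapped` name are invented for this sketch and are not part of any proposal: a tap is detected when a touch ends shortly after it began, and is reported in the same "just happened this frame" style as `inpututil.IsKeyJustPressed`.

```go
package main

import (
	"fmt"
	"log"

	"github.com/hajimehoshi/ebiten/v2"
	"github.com/hajimehoshi/ebiten/v2/inpututil"
)

// touchInfo remembers where and when a touch started.
type touchInfo struct {
	x, y  int
	start int // frame counter value when the touch began
}

// tapDetector is a user-level sketch of the kind of gesture check being
// proposed: it reports taps in a "just happened this frame" style.
type tapDetector struct {
	frame    int
	touchIDs []ebiten.TouchID
	active   map[ebiten.TouchID]touchInfo
}

// justTapped reports whether a short touch ended this frame, and where it began.
func (t *tapDetector) justTapped() (x, y int, ok bool) {
	t.frame++
	if t.active == nil {
		t.active = map[ebiten.TouchID]touchInfo{}
	}
	// Record every touch that started this frame, since a touch's position
	// is no longer available once it has been released.
	t.touchIDs = inpututil.AppendJustPressedTouchIDs(t.touchIDs[:0])
	for _, id := range t.touchIDs {
		px, py := ebiten.TouchPosition(id)
		t.active[id] = touchInfo{x: px, y: py, start: t.frame}
	}
	// A touch released after a short press counts as a tap.
	for id, info := range t.active {
		if inpututil.IsTouchJustReleased(id) {
			delete(t.active, id)
			if t.frame-info.start < 30 { // ~0.5s at the default 60 TPS
				return info.x, info.y, true
			}
		}
	}
	return 0, 0, false
}

type Game struct {
	taps tapDetector
}

func (g *Game) Update() error {
	if x, y, ok := g.taps.justTapped(); ok {
		fmt.Println("tap at", x, y)
	}
	return nil
}

func (g *Game) Draw(screen *ebiten.Image) {}

func (g *Game) Layout(outsideWidth, outsideHeight int) (int, int) { return 320, 240 }

func main() {
	if err := ebiten.RunGame(&Game{}); err != nil {
		log.Fatal(err)
	}
}
```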
Why is this needed?
This will contribute to the support of the touch in a much larger number of games than now.