richardanaya / conifer

Apache License 2.0
37 stars 4 forks

Supporting double tap, long tap, drag, etc. #3

Open richardanaya opened 4 years ago

richardanaya commented 4 years ago

@nbrr I know we still have auto detection going on, but I wanted to start a new thread of discussion on supporting more mobile-style interactions.

nbrr commented 4 years ago

The first step is to get cleaner data from the input. I've worked in this direction today: I tried to abstract over the fact that points arrive by fragment. The loop seems simpler now. I removed the notion of touch being active/inactive and feed the callback with a point only when a full point has been read, but I expect the information to be retrieved in another form: I plan to give a similar implementation for swipes (a vec of points + a notion of the swipe being in progress or done). From there we should be able to interpret a swipe as a long tap (e.g. movement is limited and duration is long) or a drag (displacement vector between the first and last point of the swipe). If we keep a few swipes in memory we can get double taps. What do you think of this approach? I'm sure there must be some existing crate implementing something similar, but I don't really know what to look for.
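
To make that concrete, here is a rough sketch of a swipe as a vec of points plus an in-progress/done flag, interpreted as either a tap or a drag. All names here (`Point`, `Swipe`, `Gesture`, the `move_threshold` parameter) are placeholders for the sake of discussion, not the actual conifer types:

```rust
use std::time::Duration;

// Hypothetical names for illustration only, not the actual conifer types.
#[derive(Debug, Clone, Copy)]
struct Point {
    x: i32,
    y: i32,
    time: Duration, // time since the start of the swipe
}

// A swipe is just its points plus a notion of being in progress or done.
#[derive(Debug, Clone)]
struct Swipe {
    points: Vec<Point>,
    finished: bool,
}

#[derive(Debug)]
enum Gesture {
    Tap { at: Point, held_for: Duration }, // limited movement; long vs short is left to the caller
    Drag { from: Point, to: Point },       // displacement between first and last point
    InProgress,
}

impl Swipe {
    fn interpret(&self, move_threshold: i32) -> Gesture {
        if !self.finished || self.points.is_empty() {
            return Gesture::InProgress;
        }
        let first = self.points[0];
        let last = *self.points.last().unwrap();
        let (dx, dy) = (last.x - first.x, last.y - first.y);
        if dx.abs() <= move_threshold && dy.abs() <= move_threshold {
            Gesture::Tap { at: first, held_for: last.time - first.time }
        } else {
            Gesture::Drag { from: first, to: last }
        }
    }
}

fn main() {
    let swipe = Swipe {
        points: vec![
            Point { x: 10, y: 10, time: Duration::from_millis(0) },
            Point { x: 12, y: 11, time: Duration::from_millis(600) },
        ],
        finished: true,
    };
    println!("{:?}", swipe.interpret(5));
}
```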

I've had a hard time settling on this being a move. I meant it to be &mut self but the borrow checker won. It didn't bother me since I was writing FP-style anyway, but I think it might be relevant for this to be &mut?
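
For illustration only (these are not the signatures in the repo), the trade-off is roughly whether the event loop consumes its input source or only borrows it mutably:

```rust
// Illustrative only. The question is whether the loop takes the
// input source by value (move) or by mutable borrow.
struct Input {
    events: Vec<u32>,
}

impl Input {
    // Variant 1: the loop takes `self` by value; the caller gives it up.
    fn run_owned<F: FnMut(u32)>(self, mut callback: F) {
        for e in self.events {
            callback(e);
        }
    }

    // Variant 2: the loop only borrows; the caller keeps the value afterwards,
    // but every borrow inside the loop now has to satisfy the borrow checker.
    fn run_borrowed<F: FnMut(u32)>(&mut self, mut callback: F) {
        for &e in &self.events {
            callback(e);
        }
    }
}

fn main() {
    let mut input = Input { events: vec![1, 2, 3] };
    input.run_borrowed(|e| println!("borrowed: {}", e));
    input.run_owned(|e| println!("owned: {}", e)); // `input` is gone after this call
}
```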

Note that with this approach, multi-touch should be easy to add, I think.

richardanaya commented 4 years ago

Everything you said sounds logical. I like your point that all of these can be derived from swipes.

nbrr commented 4 years ago

Input events now stream into swipes. I added color in my test to show the difference between an ongoing and a finished swipe. This is still a bit messy; I think I need to settle on move/ref/mut. What do you think so far? Since we've got a swipe, we can implement a few gestures already.

Note that for some reason, the color order is wrong on my RPi. It seems to be BGR instead of the RGB you wrote in Frame. I don't know whether the order might change depending on the framebuffer or whether it is a mistake.
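
For what it's worth, Linux framebuffers describe their channel layout (the bit offset and length of each color component) in the variable screen info returned by FBIOGET_VSCREENINFO, so a BGR layout can be a legitimate device difference rather than a mistake. A small sketch, with made-up offset values, of packing a pixel from the reported offsets instead of assuming RGB:

```rust
/// Channel offsets in bits, as reported by the framebuffer driver.
/// On a 32bpp RGB device this might be (16, 8, 0); on a BGR device (0, 8, 16).
struct ChannelOffsets {
    red: u32,
    green: u32,
    blue: u32,
}

/// Pack an (r, g, b) triple into a 32-bit pixel using the reported offsets,
/// so the same code works whether the device is RGB or BGR.
fn pack_pixel(offsets: &ChannelOffsets, r: u8, g: u8, b: u8) -> u32 {
    (u32::from(r) << offsets.red)
        | (u32::from(g) << offsets.green)
        | (u32::from(b) << offsets.blue)
}

fn main() {
    let rgb = ChannelOffsets { red: 16, green: 8, blue: 0 };
    let bgr = ChannelOffsets { red: 0, green: 8, blue: 16 };
    println!("{:#010x}", pack_pixel(&rgb, 0xff, 0x00, 0x00)); // 0x00ff0000
    println!("{:#010x}", pack_pixel(&bgr, 0xff, 0x00, 0x00)); // 0x000000ff
}
```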

richardanaya commented 4 years ago

Super awesome updates! I really like how you are building up points and using enums.

nbrr commented 4 years ago

From a single swipe we have so far a simple tap and a drag. What other interesting moves can we get from there?

richardanaya commented 4 years ago

I'd love for us to add the concept of swipe direction. Most of the swipes, taps, and drags seem like some statistics computed over a starting point and the points that follow up until the end.

In my head, I think the most challenging one we're going to have to think about is double tap. Double tap seems unique in that, depending on how fast two swipes are back to back, it might be an entirely different event.

I feel like there must certainly be some research out there on what timing feels good :)

Also, maybe a long hold event?

Anyhow, just some ideas to ponder!

nbrr commented 4 years ago

> I'd love for us to add the concept of swipe direction. Most of the swipes, taps, and drags seem like some statistics computed over a starting point and the points that follow up until the end.

I feel like this is already what Drag is, for now: the origin and the latest point of the swipe (+ duration).
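
Since Drag already carries the origin and the latest point, direction can be read off the same displacement vector. A minimal sketch (hypothetical names) that quantizes it into four directions:

```rust
#[derive(Debug, PartialEq)]
enum Direction {
    Up,
    Down,
    Left,
    Right,
}

/// Derive a coarse direction from the displacement between the swipe's
/// origin and its latest point (screen coordinates: y grows downward).
fn direction(from: (i32, i32), to: (i32, i32)) -> Direction {
    let dx = to.0 - from.0;
    let dy = to.1 - from.1;
    if dx.abs() >= dy.abs() {
        if dx >= 0 { Direction::Right } else { Direction::Left }
    } else if dy >= 0 {
        Direction::Down
    } else {
        Direction::Up
    }
}

fn main() {
    assert_eq!(direction((10, 10), (100, 20)), Direction::Right);
    assert_eq!(direction((10, 100), (12, 10)), Direction::Up);
}
```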

> Also, maybe a long hold event?

Tap already bears the duration it has been held for. Long or short can be determined by the user once they get the event. I used a Point in the Tap, so I put the duration in the time field, but this feels wrong: when we get a Point, we don't know what kind of time it is supposed to bear.
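
One possible shape for that, purely as a sketch with hypothetical names, is to give Tap its own duration field so Point never has to carry a second kind of time:

```rust
use std::time::Duration;

// Hypothetical shapes for the sake of discussion, not the actual conifer types.
struct Point {
    x: i32,
    y: i32,
}

/// The tap carries its own duration, so `Point` never has to answer
/// the question of which time it is supposed to bear.
struct Tap {
    at: Point,
    held_for: Duration,
}

fn main() {
    let tap = Tap { at: Point { x: 42, y: 7 }, held_for: Duration::from_millis(450) };
    // Long vs short is the caller's call once they have the event.
    let is_long = tap.held_for >= Duration::from_millis(400);
    println!("long tap at ({}, {})? {}", tap.at.x, tap.at.y, is_long);
}
```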

> In my head, I think the most challenging one we're going to have to think about is double tap. Double tap seems unique in that, depending on how fast two swipes are back to back, it might be an entirely different event.

> I feel like there must certainly be some research out there on what timing feels good :)

Complex movements composed of more than one contact do need some more work indeed! As for how long the window should be, I think that can be something held in a parameter and given by the user, although a sensible default value can be suggested.
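
As a sketch of that idea (all names are hypothetical and the default values are only placeholders, not research-backed numbers): the thresholds live in a user-settable config with suggested defaults, and a double tap is two taps that land close together within the configured window.

```rust
use std::time::Duration;

/// User-tunable gesture thresholds; the defaults are only suggestions.
struct GestureConfig {
    double_tap_window: Duration,
    double_tap_slop: i32, // max distance in pixels between the two taps
}

impl Default for GestureConfig {
    fn default() -> Self {
        GestureConfig {
            double_tap_window: Duration::from_millis(300),
            double_tap_slop: 40,
        }
    }
}

/// A finished tap: where it happened and when it ended, relative to some epoch.
struct Tap {
    x: i32,
    y: i32,
    ended_at: Duration,
}

/// Two consecutive taps become a double tap if they are close in space and time.
fn is_double_tap(config: &GestureConfig, previous: &Tap, current: &Tap) -> bool {
    let close_in_time =
        current.ended_at.saturating_sub(previous.ended_at) <= config.double_tap_window;
    let close_in_space = (current.x - previous.x).abs() <= config.double_tap_slop
        && (current.y - previous.y).abs() <= config.double_tap_slop;
    close_in_time && close_in_space
}

fn main() {
    let config = GestureConfig::default();
    let first = Tap { x: 100, y: 100, ended_at: Duration::from_millis(1000) };
    let second = Tap { x: 105, y: 98, ended_at: Duration::from_millis(1200) };
    println!("double tap: {}", is_double_tap(&config, &first, &second));
}
```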