1j01 / tracky-mouse

Mouse control via head tracking, as a cross-platform desktop app and JS library. eViacam alternative.
https://trackymouse.js.org/
MIT License

Fire click action with blinking #1

Open wederfabricio opened 3 years ago

wederfabricio commented 3 years ago

Hello @1j01 !

I was playing with your demo available at: https://1j01.github.io/tracky-mouse/

Congratulations on your research into this topic, and for bringing this knowledge together.

Is there any way to click with eyes or head movements?

1j01 commented 3 years ago

I have implemented dwell clicking already (click by hovering in one spot), although it's not released in that demo yet. Other clicking options like blink detection are planned, but I don't have any timeline for this project. Unfortunately this project is hard for me to work on due to serious health issues I have with my neck.

wederfabricio commented 3 years ago

I'm sorry for your health issues.

I would like to implement this functionality, clicking by blinking. If you can give me more information, like a library for blink detection, I can implement this and submit a PR for your approval.

What do you think about this?

1j01 commented 2 years ago

I don't think there are any good blink detection libraries out there, not with accurate detection. I think I would try using Teachable Machine, with webcam images cropped around the head region (we don't want the crop to be too narrow, in case the head tracking is laggy or inaccurate), possibly normalized in other ways.

I had some good initial success with it, using their web application where you can train directly from your webcam. I got it to detect me blinking pretty well in under five minutes. Then I discovered cases where it didn't work, like moving significantly away from my starting head position. I tried training it more deliberately, giving it samples with different head positions and facial expressions. I don't think it got that much better...? But I think cropping could help a lot, so it focuses on the important parts (the eyes) and less on irrelevant background details.

If given a proper dataset of lots of open and closed eyes, it could be immediately usable for most people, and then the really interesting part is the user could be allowed to teach it new examples at runtime, based on their specific conditions (e.g. wearing novelty glasses, having a rare eye disease, rare eye color, and weird lighting all at once? it could happen! even while Black)
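To illustrate the cropping idea: here's a minimal sketch of how a padded crop rectangle around the tracked head might be computed before feeding frames to a classifier. The function name, the face-box shape, and the padding factor are all illustrative assumptions, not part of tracky-mouse's actual API.

```javascript
// Compute a crop rectangle around the detected head, padded on each side and
// clamped to the frame bounds, so that laggy or slightly inaccurate head
// tracking still keeps the eyes inside the crop.
// `face` is an assumed bounding box: { x, y, width, height } in pixels.
function headCropRect(face, frameWidth, frameHeight, padding = 0.5) {
	const padX = face.width * padding;
	const padY = face.height * padding;
	const x = Math.max(0, face.x - padX);
	const y = Math.max(0, face.y - padY);
	// Clamp so the rectangle never extends past the frame.
	const width = Math.min(frameWidth - x, face.width + 2 * padX);
	const height = Math.min(frameHeight - y, face.height + 2 * padY);
	return { x, y, width, height };
}
```

In a browser, the resulting rectangle could then be passed to the nine-argument form of `CanvasRenderingContext2D.drawImage()` to extract and rescale the region into a small canvas for the model.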

wederfabricio commented 2 years ago

Awesome! I'm finishing some college projects now; I plan to resume this soon.

Thank you so much, you are making excellent progress.

1j01 commented 1 year ago

An update: I've integrated dwell clicking into the desktop app (previously it was only part of the library). I'm narrowing this issue down to focus on eye blink clicking support, and I'm opening an issue for mouth gestures: https://github.com/1j01/tracky-mouse/issues/25

1j01 commented 2 months ago

The demo here looks like it tracks blinking well, so we may be able to update FaceMesh and detect blinks easily.

One minor detail: sometimes a fully closed eye isn't detected as fully closed, and an open eye can be detected at a similar squinty level. However, if one eye is detected as fully closed and the other eye is at that squinty level, I think the squinty eye can be assumed to be open; and otherwise, if neither eye is detected as fully closed, a squinty level can be assumed to be closed. So it might make sense to bias the blink detection by taking both eyes into account. (When you blink one eye, you naturally squint the other a bit, but not necessarily as much as the model reports. I think this physical phenomenon may have biased the model, since blinking one eye and squinting the opposite eye are correlated.)
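The two-eye biasing idea above could be sketched roughly like this. The function name, the 0-to-1 "openness" scores, and the threshold values are all assumptions for illustration; real thresholds would need tuning against the model's actual outputs.

```javascript
// Per-eye openness scores from the model: 0 = fully closed, 1 = fully open.
// Thresholds below are illustrative guesses, not tuned values.
const CLOSED_THRESHOLD = 0.2; // at or below this: confidently closed
const OPEN_THRESHOLD = 0.6;   // at or above this: confidently open

// Classify both eyes as "open" or "closed", resolving the ambiguous
// "squinty" middle range by looking at the other eye:
// - If the other eye is fully closed, the squint is probably just the
//   natural co-squint of a one-eyed wink, so treat the squinty eye as open.
// - If neither eye is confidently closed, treat a squinty eye as closed
//   (e.g. a blink the model under-reports).
function classifyEyes(leftOpenness, rightOpenness) {
	const classify = (mine, other) => {
		if (mine <= CLOSED_THRESHOLD) return "closed";
		if (mine >= OPEN_THRESHOLD) return "open";
		return other <= CLOSED_THRESHOLD ? "open" : "closed";
	};
	return {
		left: classify(leftOpenness, rightOpenness),
		right: classify(rightOpenness, leftOpenness),
	};
}
```

For example, scores of (0.1, 0.4) would read as a left-eye wink (left closed, right open), while (0.4, 0.4) would read as a full blink with both eyes closed.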