auduno / headtrackr

Javascript library for headtracking via webcam and WebRTC/getUserMedia

On the robustness of the headtrackr #29

Open nicholaswmin opened 10 years ago

nicholaswmin commented 10 years ago

Hi,

I am trying to use headtrackr to get ONLY the X position of the head. I am using it in a small game I've built, which looks pretty nice so far.
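For context, my current setup is roughly the sketch below. The event name and fields are what I understood from the README, and movePlayer() is just a stand-in for my game code, so treat this as a sketch rather than exact code:

```javascript
// Rough sketch of my setup - the game only cares about the horizontal position.
var videoInput = document.getElementById('inputVideo');
var canvasInput = document.getElementById('inputCanvas');

var htracker = new headtrackr.Tracker();
htracker.init(videoInput, canvasInput);
htracker.start();

// facetrackingEvent should fire with the tracked face box on each iteration.
document.addEventListener('facetrackingEvent', function (event) {
  movePlayer(event.x); // movePlayer() is a placeholder for my game logic
});
```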

At the moment I ask my user to present his full face so that headtrackr can lock onto his head, and then to tilt the screen a bit so that only the chin and up is visible (I want to hide the neck, since low-cut tops create color interference).

I would like to ask 2 things:

What is the best advice I should give my users in order to make the game as robust/accurate/fast as possible? What calibration recommendations would you suggest, and what are the 'perfect' conditions for headtrackr to work at its best?

The goal is to make the head tracking as robust as possible across different lighting environments. I also need the head-position detection to be as predictable as humanly possible (sometimes the head tracker goes all nuts on me and starts swinging right and left, losing its center).

At the moment I only advise that, during head tracking, the user keep both sides of his face evenly and brightly illuminated. As a second calibration step, I advise the user to tilt his laptop screen up to the point where the neck is no longer in the frame.

Second thing:

Of course, any advice on parameters I might pass when starting headtrackr is welcome. (Should I use the facetracking x position or the headtracking x position? Should I calculate angles, etc.?)

Thanks in advance man and thanks for the work

auduno commented 10 years ago

Hi, glad you like headtrackr!

headtrackr relies a lot on facial colors to track the face, so to get good results, it's most important that light is even, and that there are no skin-colored objects in the background. I usually get best results if I face a window or another light source, so that there are no shadows on my face.

Regarding precision, have you had a look at clmtrackr which I've also made? It has much more precise facial tracking than headtrackr, but might be slower on some systems.

nicholaswmin commented 10 years ago

Yep, I definitely did, but clmtrackr is nowhere near the tracking speed of the camshift algorithm.

My game relies on players moving their heads rapidly right and left, and clmtrackr loses track in that case.

On the other hand, headtrackr keeps up with the speed just fine, but it loses its 'focus' somewhat easily.

Is there any possibility of mixing the two algorithms? I am aware that anything other than camshift isn't feasible in real time, but maybe accuracy could be improved with on-the-fly corrections by re-running the Viola-Jones detection at more regular intervals.


auduno commented 10 years ago

Yeah, it might help to stop and start headtrackr at regular intervals (via stop() and start()) if it loses focus often. Unfortunately, I don't really know of any other algorithms that are as fast as camshift but more precise.
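Something along these lines would be a minimal sketch (the interval length is arbitrary, and the setup lines are just the usual init):

```javascript
// Minimal sketch: periodically throw away the current camshift window and
// let headtrackr run face detection again. 5000 ms is an arbitrary choice.
var htracker = new headtrackr.Tracker();
htracker.init(videoInput, canvasInput);
htracker.start();

setInterval(function () {
  htracker.stop();   // stop the current tracking
  htracker.start();  // start again, which re-runs the initial face detection
}, 5000);
```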

nicholaswmin commented 10 years ago

I already do this, but I have no way of automatically detecting whether the focus was truly lost. Plus, the delay while the face is re-detected is jarring.

Maybe a simple algorithm could be built using the events already emitted by headtrackr as input (the head width/height changing rapidly might do), and then a web worker could re-run the Viola-Jones detection in the background and pass the new detection back to the original headtrackr thread as a message (rough sketch at the end of this comment).

I am neither a scientist nor experienced enough to suggest something authoritative, but I am trying to find ways to improve this.

I don't really think anything other than camshift can be used, as you said yourself; I just have a hunch that there is room for improvement.
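A rough sketch of the detection part of that idea, just to make it concrete. The thresholds are made up, htracker is assumed to be the running tracker instance, the width/height fields are what I understand facetrackingEvent exposes, and whether the Viola-Jones step can actually live in a worker is exactly what I'd need to test:

```javascript
// Watch the face box emitted by headtrackr and treat a sudden jump in its
// size as a probable loss of focus. The 40% threshold is a made-up number.
var lastWidth = null;
var lastHeight = null;
var SIZE_JUMP = 0.4;

document.addEventListener('facetrackingEvent', function (event) {
  if (lastWidth !== null) {
    var dw = Math.abs(event.width - lastWidth) / lastWidth;
    var dh = Math.abs(event.height - lastHeight) / lastHeight;
    if (dw > SIZE_JUMP || dh > SIZE_JUMP) {
      // Probable loss of focus. For now just restart the tracker; the
      // web-worker version would re-run Viola-Jones in the background and
      // postMessage() the new box back instead - untested idea.
      htracker.stop();
      htracker.start();
    }
  }
  lastWidth = event.width;
  lastHeight = event.height;
});
```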


Neon22 commented 10 years ago

Side note: clmtrackr seems to be using an old fork of numeric.js, which has since gotten faster: http://numericjs.com/wordpress/?p=79

auduno commented 10 years ago

I actually thought about looking at the dimensions of the box in order to detect when tracking fails, but in my experiments this restarted the tracking too often when it shouldn't. If you run the detection in the background, though, the slowdown/lag on face detection might not be an issue, so it's certainly worth a try.

Neon22: where did you find an old version of numeric.js in clmtrackr? I thought I was already using version 1.2.6, but maybe I forgot to remove it somewhere.

Neon22 commented 10 years ago

Actually, I think it's just the link pointing to sloisel rather than spirozh (https://github.com/spirozh/numeric). Sorry for the misdirect.

nicholaswmin commented 10 years ago

In general, there should be some telltale signs when the tracking has failed.

Some time ago I remember seeing you mention somewhere that hue/saturation are not implemented, even though the camshift algorithm does take those into account.

Is this still the case?


auduno commented 10 years ago

The original camshift paper mentions using only hue and saturation to track the face, but in my experiments I found that this didn't work as well as just using RGB, so I ended up using only RGB information to track the face. So hue and saturation are not implemented.
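(For completeness, the hue/saturation the paper refers to is just the standard conversion from RGB values, roughly like this:)

```javascript
// Standard RGB -> hue/saturation conversion, i.e. the color space the
// original camshift paper builds its histogram in. r, g, b are in [0, 255].
function rgbToHueSat(r, g, b) {
  var max = Math.max(r, g, b);
  var min = Math.min(r, g, b);
  var delta = max - min;

  var saturation = (max === 0) ? 0 : delta / max;

  var hue;
  if (delta === 0) {
    hue = 0; // hue is undefined for grays, conventionally set to 0
  } else if (max === r) {
    hue = 60 * (((g - b) / delta) % 6);
  } else if (max === g) {
    hue = 60 * ((b - r) / delta + 2);
  } else {
    hue = 60 * ((r - g) / delta + 4);
  }
  if (hue < 0) hue += 360;

  return { hue: hue, saturation: saturation };
}
```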

nicholaswmin commented 10 years ago

Very well,

If I get some time off I'll try to test what happens with web workers doing async face detection in the background and let you know how it goes. I have an idea in mind, but I doubt I'll find the time and interest to do it.

Until then thanks a lot for the info and the lib of course.
