elucideye / drishti

Real time eye tracking for embedded and mobile devices.
BSD 3-Clause "New" or "Revised" License

How to set the doEyeFlow parameter? How to make the eye tracker more robust? #743

Open universewill opened 5 years ago

universewill commented 5 years ago

I notice that there is a doEyeFlow flag parameter. How do I set it in hci.cpp? Will this parameter make the eye prediction more robust?

I use a video to do eye tracking with the hci.cpp demo. However, I notice that the result is wrong for some frames (for example, the algorithm detects wrong eye positions). How can I make the eye tracker more robust?

Besides, I can tolerate frames with no detection results, but I cannot tolerate wrong eye detections. So how should I set the parameters to favor missed detections over wrong ones?

headupinclouds commented 5 years ago

I notice that there is a doEyeFlow flag parameter

That won't help. It was for relative motion estimates.

However, I notice that the result is wrong for some frames (for example, the algorithm detects wrong eye positions). How can I make the eye tracker more robust?

The pipeline basically runs:

face detection -> low resolution refinement -> eye model fitting

You can start by eliminating false positives in the detection stage.
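For the "missing detections are okay, wrong detections are not" case, even a coarse sanity check on the detector output in your own application code helps. The sketch below is not part of drishti; FaceRect, rejectImplausibleFaces, and the pixel thresholds are made-up names and values used only to illustrate the idea of discarding detections whose size is implausible for your camera setup.

```cpp
#include <algorithm>
#include <iterator>
#include <vector>

// Plain stand-in for a detected face rectangle (not a drishti type).
struct FaceRect
{
    float x, y, width, height; // pixels
};

// Drop detections whose size is implausible for the expected camera geometry.
// Pick the thresholds from the face sizes you actually observe in your
// true-positive frames; the values passed in are purely illustrative.
std::vector<FaceRect> rejectImplausibleFaces(const std::vector<FaceRect>& faces,
                                             float minWidthPx,
                                             float maxWidthPx)
{
    std::vector<FaceRect> kept;
    std::copy_if(faces.begin(), faces.end(), std::back_inserter(kept),
                 [&](const FaceRect& f) {
                     return f.width >= minWidthPx && f.width <= maxWidthPx;
                 });
    return kept;
}
```

A frame with zero surviving detections is then simply reported as "no result", which matches your stated preference for missed detections over wrong ones.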

The ACF gradient boosting models are extremely fast for the mobile use case, but they are limited compared to SOTA CNN models and can require more tuning. This will probably be updated in the future, or the ACF detector may be used to enumerate detection hypotheses and be paired with a fast/shallow CNN.

There are some simple things you can try.

1) Tuning (reducing) the detection search volume so that only plausible solutions of an appropriate scale for your use case are allowed is the first step. I.e., set the maximum search distance so that you avoid spending lots of time searching for tiny faces that are the size (in pixels) of an eye or nose in your typical true positive cases.

See: https://github.com/elucideye/drishti/blob/c88f05991bc41bfc25017ada5f82d1cff945a3ea/src/examples/facefilter/lib/facefilter/renderer/Renderer.cpp#L108

2) The ACF detector has a nice formulation with a calibration term that lets you set your classification/detection operating point at runtime with existing models. You can try decreasing this term in steps with a 0.75x multiplier or similar until the false positives go away.

See: https://github.com/elucideye/drishti/blob/c88f05991bc41bfc25017ada5f82d1cff945a3ea/src/examples/facefilter/lib/facefilter/renderer/Renderer.cpp#L110

These params could be exposed to the top-level app for tuning; a rough sketch of both adjustments follows below.
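As a sketch only: the struct below is a stand-in for the detector settings configured around the linked Renderer.cpp lines, the field names (minDetectionDistance, maxDetectionDistance, acfCalibration) are written from memory and should be checked against your drishti version, and the numeric values are placeholders to tune from, not recommendations.

```cpp
// Stand-in for the relevant detector settings (field names and defaults are
// assumptions; verify them against the FaceFinder settings in Renderer.cpp).
struct DetectorSettings
{
    float minDetectionDistance = 0.0f; // nearest search distance (meters)
    float maxDetectionDistance = 2.0f; // farthest search distance (meters)
    float acfCalibration = 0.01f;      // ACF operating point shift (assumed nonzero start)
};

// Example tuning pass:
// 1) shrink the search volume to the distances that actually occur in your
//    footage, so the detector stops hunting for implausibly small faces;
// 2) step the calibration term down (the 0.75x multiplier is just the
//    starting step suggested above) until the false positives disappear.
void tuneForFewerFalsePositives(DetectorSettings& s)
{
    s.minDetectionDistance = 0.3f; // illustrative values, not measured ones
    s.maxDetectionDistance = 1.0f;

    s.acfCalibration *= 0.75f;     // repeat/adjust per run until clean
}
```

In practice you would re-run your problem video after each calibration step and stop decreasing once the wrong eye detections no longer appear.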