Eye gaze angles #834

Open · LinaJunc opened 4 years ago

LinaJunc commented 4 years ago

Hello Tadas, we are very grateful that you published this tool!

We have two questions concerning the gaze angles. In our project we are analyzing the eye movements of participants relative to the centre of the screen in front of them, to find out whether certain visual patterns occur under different conditions. Our camera is mounted on top of the screen and is therefore above the participants' eye level; the participants rarely look directly into the camera. We think that in our case the origin (0, 0, 0) should be the point the participants fixate at the beginning of the task (the centre of the screen) rather than the camera. That way we could better compare eye gaze between participants. Otherwise, the radians describing the same gaze relative to a point on the screen (e.g. the top left) could differ between subjects, because the distance between the camera and the eyes depends on the participant's height. Do you know how to perform this shift? Do you have another idea for making the gaze more comparable?

Also, we noticed that the lines representing the eye gaze in the visualization video follow the eyelids downward when the participant blinks, even though in reality the gaze stays fixed. Is this expected behaviour, and does it affect the gaze calculations?

Thank you very much!

TadasBaltrusaitis commented 4 years ago

This is a fairly tricky problem, because you need to calibrate both for the camera-to-screen geometry and for the personal idiosyncrasies of each person you are tracking (there will be a per-person error bias).

There are certain ways you can tackle this:

  1. [Possibly most accurate] Have a calibration stage for each participant that learns how to map from the predicted radians (and possibly the head pose) to the gaze location on the screen. There are numerous ways to do this; one of the simplest is training a regressor from OpenFace values to screen coordinates (see the first sketch after this list).
  2. Calibrate only the screen to the camera: using some "simple" trigonometry, and knowing where the screen is relative to the camera and where the participant's eyes are (you can get the latter from OpenFace), you can work out the gaze location on the screen (this may of course be noisy if any of those estimates is inaccurate; see the second sketch below).
  3. [Easiest, but probably not as accurate] Make sure the participant starts by looking at the centre of the screen, so that you know what the centre is in radians and can subtract it as a per-participant offset.
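A minimal sketch of option 1, assuming a short calibration stage in which the participant fixates a handful of known on-screen targets. The gaze and pose column names follow the standard OpenFace CSV output, but the file names and the `target_x`/`target_y` columns are hypothetical, something your experiment would have to log itself:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical calibration recording: the participant fixates known screen
# targets while OpenFace logs gaze angles and head pose for each frame.
calib = pd.read_csv("calibration.csv")
calib.columns = calib.columns.str.strip()  # OpenFace headers can have leading spaces

features = ["gaze_angle_x", "gaze_angle_y", "pose_Rx", "pose_Ry"]
X = calib[features].values
# Known screen coordinates (pixels) of the fixated targets; these two
# columns are hypothetical and would be logged by your experiment code.
y = calib[["target_x", "target_y"]].values

reg = LinearRegression().fit(X, y)

# Map a recording from the actual task to screen coordinates.
task = pd.read_csv("task.csv")
task.columns = task.columns.str.strip()
screen_xy = reg.predict(task[features].values)
```

A linear model is only the simplest choice; a small polynomial feature expansion or a k-NN regressor may capture the nonlinearity of the angle-to-screen mapping better.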

Of course there are many more ways you can tackle this.
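And a rough sketch of the geometric approach from option 2, assuming the OpenFace camera coordinate convention (x right, y down, z pointing from the camera towards the participant, units in mm) and approximating the screen as lying in the camera's z = 0 plane. The camera-to-screen offset is a placeholder you would measure for your own setup:

```python
import numpy as np
import pandas as pd

# Measured once for the setup: offset (mm) from the camera to the screen
# centre in the camera's x/y plane. Placeholder values: here the camera
# sits 150 mm above the screen centre.
SCREEN_CENTRE_OFFSET = np.array([0.0, 150.0])

df = pd.read_csv("task.csv")  # hypothetical OpenFace output file
df.columns = df.columns.str.strip()

# Ray origin: mean of the 56 3D eye landmarks (mm, camera coordinates).
eye_x = df[[f"eye_lmk_X_{i}" for i in range(56)]].mean(axis=1)
eye_y = df[[f"eye_lmk_Y_{i}" for i in range(56)]].mean(axis=1)
eye_z = df[[f"eye_lmk_Z_{i}" for i in range(56)]].mean(axis=1)

# Ray direction: average of the two per-eye gaze vectors.
dx = (df["gaze_0_x"] + df["gaze_1_x"]) / 2
dy = (df["gaze_0_y"] + df["gaze_1_y"]) / 2
dz = (df["gaze_0_z"] + df["gaze_1_z"]) / 2

# Intersect the ray with the screen plane, approximated as z = 0
# (screen flush with the camera); dz should be negative when the
# participant looks towards the screen, making t positive.
t = -eye_z / dz
hit_x = eye_x + t * dx - SCREEN_CENTRE_OFFSET[0]
hit_y = eye_y + t * dy - SCREEN_CENTRE_OFFSET[1]
# hit_x / hit_y are now in mm relative to the screen centre.
```

Because this relies on metric eye positions estimated from a single camera, any depth error propagates directly into the screen coordinates, which is part of why the per-participant regression in option 1 tends to be more accurate.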

With regards to blinks: yes, the gaze goes down during them. You can filter this out using the blink events that OpenFace detects.

LinaJunc commented 4 years ago

Thank you for those ideas!

Unfortunately, I am not sure what you mean by blink events. In the OpenFace output I couldn't find any values relating to that. Could you elaborate on your suggestion?

TadasBaltrusaitis commented 4 years ago

OpenFace outputs Action Units (https://github.com/TadasBaltrusaitis/OpenFace/wiki/Action-Units); one of them corresponds to a blink (AU43 or AU45, I don't remember which one exactly).
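For instance, a minimal filtering sketch, assuming the standard CSV output: AU45_c is OpenFace's binary blink-presence prediction, and the masking-plus-interpolation step is just one illustrative way to bridge the gap:

```python
import pandas as pd

df = pd.read_csv("task.csv")  # hypothetical OpenFace output file
df.columns = df.columns.str.strip()  # headers can contain leading spaces

# AU45_c is the binary blink-presence prediction (0.0 or 1.0).
blinking = df["AU45_c"] > 0.5

gaze_cols = ["gaze_angle_x", "gaze_angle_y"]
# Mask the gaze angles during blinks so the downward dip does not
# contaminate the analysis, then bridge the gaps by interpolation.
df.loc[blinking, gaze_cols] = float("nan")
df[gaze_cols] = df[gaze_cols].interpolate()
```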

LinaJunc commented 4 years ago

Perfect, thank you! (it is AU45)