pupil-labs / pupil

Open source eye tracking
https://pupil-labs.com
GNU Lesser General Public License v3.0

Does surface tracker take account of camera distortion? #1260

Closed OscartGiles closed 5 years ago

OscartGiles commented 6 years ago

We're trying to figure out where someone is looking on a screen in pixel coordinates. To do this we are using the surface tracker which returns a 'norm_pos'.

Does this currently take account of the camera's fisheye distortion? I can't see any correction for it in the source code.
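For context, we currently turn the surface `norm_pos` into screen pixels roughly like this. A minimal sketch, assuming the surface's normalized coordinates have their origin at the bottom-left (while screen pixels start at the top-left, so the y axis is flipped); the helper name is ours:

```python
def norm_to_pixels(norm_pos, screen_w, screen_h):
    """Map a surface norm_pos (origin bottom-left, range 0..1)
    to screen pixel coordinates (origin top-left)."""
    x, y = norm_pos
    # Flip the y axis when going from surface coords to pixel coords.
    return int(round(x * screen_w)), int(round((1.0 - y) * screen_h))
```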

All the best, Oscar

marc-tonsen commented 6 years ago

Hi Oscar! The surface tracker currently takes distortion into account only partially. Distortion is considered e.g. in the localisation of the surface corners (here in the code), but the mapping of the gaze point onto the located surface does not compensate for it. Therefore there can be an error in the gaze point in surface coordinates if the surface is in a highly distorted region of the camera's FOV. We ran into a few problems when trying to compensate for distortion in the gaze mapping as well, but this is certainly still on our to-do list!
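To illustrate what "compensating for distortion" means here (this is not Pupil's actual camera model, which is more involved): with a simple one-parameter radial model, a distorted point relates to the undistorted one via x_d = x_u · (1 + k1 · r_u²), and inverting that model requires an iterative solve, sketched below:

```python
def undistort_radial(xd, yd, k1, iters=10):
    """Invert the one-parameter radial distortion model
        x_d = x_u * (1 + k1 * r_u**2)
    by fixed-point iteration, starting from the distorted point.
    Coordinates are normalized image coordinates; k1 is the radial coefficient."""
    xu, yu = xd, yd
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        factor = 1.0 + k1 * r2
        xu, yu = xd / factor, yd / factor
    return xu, yu
```

Mapping gaze onto the surface without a step like this leaves the gaze point in distorted image space, which is where the error Marc describes comes from.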

marc-tonsen commented 6 years ago

I started fixing this issue in #1268. The current version of the PR should now correctly compensate for camera distortion when mapping gaze onto a surface, but it requires more testing. The current surface tracking code is quite messy, and I will take this opportunity to clean it up!

PanBartosz commented 6 years ago

Hello! If I understand correctly, this fix will enable precise gaze tracking on a flat surface (e.g. a screen). I tried to implement simple gaze tracking on a screen in the following manner:

  1. Register a surface using markers in Pupil Capture
  2. Fetch the gaze and fixation data (`norm_pos`) from the Pupil eye tracker
  3. Map these values to the screen and display a marker
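Step 2 above also involves filtering the gaze samples reported for the surface, since not every sample lands on it or is reliable. A minimal sketch of that filtering; the field names (`gaze_on_surfaces`, `on_srf`, `confidence`, `norm_pos`) follow the surface datum format as we understand it and should be verified against your Pupil version:

```python
def usable_gaze(surface_datum, min_confidence=0.8):
    """Return (x, y) norm_pos tuples for confident, on-surface gaze samples
    from a single surface datum (a dict as delivered over the network API)."""
    points = []
    for g in surface_datum.get("gaze_on_surfaces", []):
        # Keep only samples that actually hit the surface and are confident.
        if g.get("on_srf") and g.get("confidence", 0.0) >= min_confidence:
            points.append(tuple(g["norm_pos"]))
    return points
```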

This workflow currently works pretty badly because there is an offset between my marker and the actual gaze point (depending on the part of the screen, it can be quite large). My questions are:

  1. Is your fix going to help?
  2. Are there other techniques for mapping gaze to the screen which can be used by pupil?
  3. If the answer to 1) is positive, can I help test it somehow?

marc-tonsen commented 6 years ago

@PanBartosz Your workflow sounds correct. When you register the surface using the markers, do you edit the surface to make sure that its corners exactly match the display (the display panel rather than the full frame of the monitor!)? Besides that, an accurate calibration is of course necessary.

The fix will improve accuracy only very slightly, and only if the surface is in a highly distorted part of the image. In most cases the improvement will not be noticeable.

Note that a small offset is normal. If your gaze accuracy is ~1-2 degrees, then, depending on your distance to the display, you can calculate how large the offset is. When sitting right in front of the display, an offset of up to 1-2 cm is what you can expect at 1-2 degrees of gaze accuracy.
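The rule of thumb above is just trigonometry: an angular error θ at viewing distance d subtends an on-screen offset of roughly d · tan(θ). A quick sketch (the helper name is ours):

```python
import math

def gaze_offset_cm(distance_cm, accuracy_deg):
    """On-screen offset (cm) subtended by an angular error of
    accuracy_deg at viewing distance distance_cm."""
    return distance_cm * math.tan(math.radians(accuracy_deg))
```

For example, at a typical 60 cm viewing distance, 1 degree of accuracy corresponds to an offset of about 1 cm, and 2 degrees to about 2.1 cm.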

OscartGiles commented 6 years ago

Wow, this is why we are loving having an open source eye tracker! Thanks for giving your time for this. 👍

papr commented 5 years ago

Fixed with #1268