Closed OscartGiles closed 5 years ago
Hi Oscar! The surface tracker currently takes distortion into account only partially. Distortion is considered e.g. in the localisation of the surface corners (here in the code), but the mapping of the gaze point into the located surface does not compensate for the distortion. Therefore there can be an error in the gaze point in surface coordinates if the surface is in a highly distorted region of the camera's FOV. We ran into a few problems when trying to compensate for distortion in the gaze mapping as well, but this is certainly still on our todo-list!
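To make the missing step concrete, here is a minimal sketch of what compensating for distortion in the gaze-mapping step could look like. The intrinsics, the one-coefficient radial model, and both function names are illustrative assumptions, not Pupil's actual implementation (which would use the full calibrated coefficient set, e.g. via `cv2.undistortPoints`):

```python
import numpy as np

# Hypothetical intrinsics for illustration; real values come from the
# camera calibration shipped with the world camera.
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
dist_k1 = -0.3  # single radial coefficient, simplified model


def undistort_point(px, py):
    """Undo radial distortion before mapping gaze onto the surface.

    Inverts the one-coefficient radial model x_d = x_u * (1 + k1 * r_u^2)
    by fixed-point iteration; real code would use cv2.undistortPoints
    with the full coefficient set.
    """
    # pixel -> normalized camera coordinates
    x = (px - K[0, 2]) / K[0, 0]
    y = (py - K[1, 2]) / K[1, 1]
    xu, yu = x, y
    for _ in range(5):  # fixed-point iteration
        r2 = xu * xu + yu * yu
        factor = 1.0 + dist_k1 * r2
        xu, yu = x / factor, y / factor
    return xu, yu


def map_gaze_to_surface(px, py, H):
    """Map an undistorted gaze point into surface coordinates via the
    surface homography H (3x3, image -> surface), which would itself be
    estimated from undistorted marker corners."""
    xu, yu = undistort_point(px, py)
    v = H @ np.array([xu, yu, 1.0])
    return v[:2] / v[2]
```

The key point is simply that undistortion has to happen before the homography is applied, on both the marker corners and the gaze point, so that both live in the same (undistorted) image space.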
I started fixing this issue in #1268. The current version of the PR should now correctly compensate for camera distortion when mapping gaze onto a surface, but it requires more testing. The current version of the surface tracking code is quite messy and I will take this opportunity to clean it up!
Hello! If I understand correctly, this fix will make it possible to precisely track gaze on a flat surface (e.g. a screen). I tried to implement simple gaze tracking on a screen in the following manner:
This workflow currently works pretty badly because there is an offset to my marker (depending on the part of the screen, the offset can be quite large). My question is:
@PanBartosz Your workflow sounds correct. When you register the surface using the markers, do you edit the surface to make sure that the surface corners exactly match the display (the display panel rather than the full frame of the monitor!)? Besides that of course an accurate calibration is necessary.
The fix will improve accuracy only slightly, and only when the surface is in a highly distorted part of the image. In most cases the improvement will not be noticeable.
Note that a small offset is normal. If your gaze accuracy is ~1-2 degrees then, depending on your distance from the display, you can calculate how large your offset is. An offset of up to 1-2 cm when sitting right in front of the display is what you can expect at 1-2 degrees gaze accuracy.
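The geometry behind this estimate is just distance times the tangent of the angular error; the 60 cm viewing distance below is an assumed example:

```python
import math


def gaze_offset_cm(accuracy_deg, distance_cm):
    """On-screen offset produced by a given angular gaze error
    at a given viewing distance (small-angle geometry)."""
    return distance_cm * math.tan(math.radians(accuracy_deg))


# At an assumed 60 cm viewing distance:
# 1 degree of error -> ~1.0 cm, 2 degrees -> ~2.1 cm on screen
```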
Wow, this is why we are loving having an open source eye tracker! Thanks for giving your time for this. 👍
Fixed with #1268
We're trying to figure out where someone is looking on a screen in pixel coordinates. To do this we are using the surface tracker which returns a 'norm_pos'.
Does this currently take account of the camera's fisheye distortion? I can't see any correction for it in the source code.
All the best, Oscar
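As a side note on the `norm_pos` mentioned above: converting it to screen pixels is a one-liner once the surface corners match the display. This sketch assumes Pupil's bottom-left, y-up convention for surface-normalized coordinates; the function name is hypothetical:

```python
def norm_pos_to_pixels(norm_pos, screen_w, screen_h):
    """Convert a surface-normalized gaze position to screen pixels.

    Assumes the surface norm_pos origin is bottom-left with y up
    (Pupil convention), while screen pixels have origin top-left
    with y down, hence the vertical flip.
    """
    nx, ny = norm_pos
    return int(nx * screen_w), int((1.0 - ny) * screen_h)
```

For example, a gaze at the surface centre `(0.5, 0.5)` on a 1920x1080 display maps to pixel `(960, 540)`.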