Closed Zeleni9 closed 4 years ago
Hi @Zeleni9
Have a look at the 3D gaze data. You can normalize the `gaze_point_3d` values to get a cyclopean 3D gaze unit vector originating at the origin of the scene camera coordinate system. Alternatively, you can use the `gaze_normal0`/`gaze_normal1` gaze vectors, which originate in `eye_center0`/`eye_center1` (also in scene camera coordinates).
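The normalization step above is straightforward; a minimal sketch (function name is mine, not part of the Pupil API):

```python
import numpy as np

def to_unit_vector(gaze_point_3d):
    """Normalize a gaze_point_3d (scene camera coordinates) into a unit
    direction vector originating at the camera origin."""
    v = np.asarray(gaze_point_3d, dtype=float)
    norm = np.linalg.norm(v)
    if norm == 0:
        raise ValueError("gaze_point_3d has zero length")
    return v / norm
```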
These will not be immediately comparable as these values are all relative to the scene camera. I suggest using the head pose tracker feature to track the scene camera within the room and map the scene-cam-relative gaze into the room. The goal would be to build a common coordinate system in which you can compare the output of both eye-tracking systems.
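Mapping scene-cam-relative gaze into the room boils down to applying the head pose tracker's camera pose. A sketch, assuming you have extracted a 4x4 camera-to-world matrix from the head pose export (the matrix name and shape here are my assumption):

```python
import numpy as np

def gaze_to_world(camera_pose_4x4, gaze_dir_cam):
    """Rotate a scene-camera-relative gaze direction into room/world
    coordinates. Directions only need the rotation part of the pose;
    the translation would apply to points (e.g. eye centers), not
    direction vectors."""
    R = np.asarray(camera_pose_4x4, dtype=float)[:3, :3]
    d = R @ np.asarray(gaze_dir_cam, dtype=float)
    return d / np.linalg.norm(d)
```

With both systems' gaze expressed in the same room frame, the angular error is just `arccos` of the dot product of the two unit vectors.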
@Zeleni9 when comparing with a remote coordinate system, you might also try using the marker-based surface tracker to define a surface on your remote setup. Pupil will give you transformation matrices between camera coordinates and surface coordinates that might help with comparing directions.
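For the 2D case, the surface tracker's transform can be applied as a homography. A hedged sketch, assuming a 3x3 image-to-surface matrix like the one in Pupil's surface export (the exact field name in your export may differ):

```python
import numpy as np

def image_to_surface(img_to_surf_trans, gaze_img_norm):
    """Map a 2D gaze point (normalized image coordinates) onto the
    tracked surface via the 3x3 homography from the surface tracker.
    Homogeneous coordinates: append 1, transform, divide by w."""
    H = np.asarray(img_to_surf_trans, dtype=float)
    p = np.array([gaze_img_norm[0], gaze_img_norm[1], 1.0])
    q = H @ p
    return q[:2] / q[2]
```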
Thank you @pfaion and @papr for fast respone. I will see what is a better approach then. Cheers.
Hi,
is it possible to get the gaze angle or 3D gaze unit vector? The idea is to compare the output of a remote gaze estimation model (which gives gaze angles that are then converted to a 3D gaze unit vector) with the Pupil Labs gaze angle/vector as ground truth. Our remote gaze model outputs gaze angles similar to https://ait.ethz.ch/projects/2018/landmarks-gaze/.
Thanks.
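For reference, the angle-to-vector conversion mentioned above might look like the following sketch. Angle conventions (axis order, sign, degrees vs. radians) vary between gaze estimation systems, so this particular yaw/pitch convention is an assumption to verify against your remote model:

```python
import math

def angles_to_vector(yaw, pitch):
    """Convert gaze yaw/pitch (radians) into a 3D unit gaze vector,
    assuming yaw rotates about the vertical axis, pitch about the
    horizontal, and the camera looks down +z at yaw = pitch = 0."""
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    return (x, y, z)  # unit length by construction
```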