Closed PatrikPluchino closed 7 years ago
No. You should convert from normalized coordinates to pixels using the size (width, height) of the world camera frame.
To clarify what @cpicanco means: the normalized coordinates are related to the world frame, not to the calibrated area.
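The scaling itself is a one-liner. A minimal sketch, assuming a 1280x720 world camera resolution (substitute your own):

```python
# Sketch: scale normalized coordinates (relative to the whole world
# frame, NOT the calibrated area) into world-frame pixels.
# The 1280x720 resolution is an assumption for illustration.

def norm_to_world_px(norm_x, norm_y, width=1280, height=720):
    # In Pupil's convention (0,0) is bottom-left and (1,1) top-right;
    # plain scaling keeps that orientation.
    return norm_x * width, norm_y * height

print(norm_to_world_px(0.5, 0.5))  # -> (640.0, 360.0), frame center
```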
Also, if you want the gaze position relative to the screen of your smartphone, you have two options: detect the screen using a custom plugin, or track the screen using markers and the marker tracker plugin. Either way, you will then be able to export the screen of your mobile device as a "surface". You can configure the surface to match the mobile screen size in pixels.
Also, I am not sure if HMD_EYES was designed for mobile use as well. (Would that be a third option??)
@papr - has summed it up. I would also like to chime in with a link to the docs on data format for others reading this issue who may have the same question.
And for fun, how about an ASCII diagram 😄
```
(0,1)              (1,1)
  +----world-frame---+
  |                  |
  |      +---+       |
  |      | s |       |
  |      +---+       |
  +------------------+
(0,0)              (1,0)
```
From the docs:
Origin 0,0 at the bottom left and 1,1 at the top right. This is the OpenGL convention and what we find to be an intuitive representation. This is the coordinate system we use most in Pupil. Vectors in this coordinate system are specified by a norm prefix or suffix in their variable name.
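Since many screen and image APIs instead put the origin at the top left, converting between the two conventions only requires flipping the y axis. A minimal sketch:

```python
# Sketch: convert between Pupil's bottom-left-origin normalized
# coordinates and the top-left-origin convention common for screens
# and images. Only y needs flipping; x is unchanged.

def flip_y(norm_x, norm_y):
    return norm_x, 1.0 - norm_y

print(flip_y(0.0, 1.0))  # (0.0, 0.0): Pupil's top-left becomes the
                         # screen convention's origin
```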
If you want to export gaze and fixation data relative to a surface (e.g. `s` in the ASCII diagram above), then you need to be able to locate it within the space of the world frame.
One option (as @cpicanco mentioned) is to use physical fiducial markers on or around the phone screen. See docs here. In Pupil Player you can use the offline surface tracker and fixation detector plugin(s) and export gaze and fixation data relative to the surfaces in your scene. Surfaces also use a normalized coordinate system (0,0) bottom left and (1,1) top right.
@cpicanco - hmd-eyes is designed to be used with VR or AR headsets (or more generally HMDs).
This would not be an option unless the participant's head is immobilized and the phone is also fixed in place (or the phone is mounted on your face like an HMD).
Ok, ok! Sorry about that. It would be great to have a solution for screen tracking with head-free, head-mounted Pupil devices.
@PatrikPluchino - I will close this issue now, because it looks like questions have been answered. Please feel free to re-open this issue or add comments to this issue if you think otherwise 😄
Dear All, thank you very much for the information provided and for your availability.
Best Regards,
Patrik Pluchino
Dear All,
I have a small question regarding the data in the columns "norm_pos_x" and "norm_pos_y" inside the fixations.csv file that I have exported.
I assume that I have to scale these measures, which range between 0 and 1 (e.g. 0.4056405), to the real size of the smartphone screen (which I have calibrated with the "manual marker calibration") that I am using to show an application.
Just to be sure, my question is the following: do "norm_pos_x" = 0 and "norm_pos_y" = 0 correspond to the top-left corner of the smartphone screen, while "norm_pos_x" = 1 and "norm_pos_y" = 1 correspond to the bottom-right corner of the screen?
Best Regards,
Patrik Pluchino