Open ritko opened 4 years ago
This would be super cool, except that after installing everything and trying to run the example file, it returned an "Abort trap: 6" error.
It would also be nice to be able to do a calibration via webcam and then do gaze detection from a different video source.
Or even just do screen coordinates without calibration. That would be ideal. You might check this out: https://stackoverflow.com/a/52963879/11792607 (I'm a noob and don't know how to implement this)
Added mapping of gaze ratios to screen coordinates, so that it now tracks eye point of gaze (EPOG) on the screen.
This is done by asking the user to fixate on calibration points with predetermined coordinates. These points, plus knowledge of the screen size, are used to map subsequent gaze ratios to screen coordinates. Finally, a short test is run in which the user fixates on test points and the EPOG error is calculated.
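To illustrate the idea, here is a minimal sketch of mapping gaze ratios to screen coordinates with a per-axis linear fit from calibration samples. The function names (`calibrate`, `ratios_to_screen`), the ratio values, and the screen size are hypothetical and not taken from the actual `gaze_calibration.py` implementation:

```python
# Hypothetical sketch: map horizontal/vertical gaze ratios to screen
# coordinates via a linear fit learned from calibration fixations.
import numpy as np

def calibrate(ratios, points):
    """Fit per-axis linear maps (ratio -> pixel) from calibration data.

    ratios: list of (h_ratio, v_ratio) measured while the user fixates
    points: list of (x, y) screen coordinates of the calibration targets
    Returns (ax, bx, ay, by) such that x ~= ax*h + bx and y ~= ay*v + by.
    """
    r = np.asarray(ratios, dtype=float)
    p = np.asarray(points, dtype=float)
    ax, bx = np.polyfit(r[:, 0], p[:, 0], 1)  # least-squares line, x-axis
    ay, by = np.polyfit(r[:, 1], p[:, 1], 1)  # least-squares line, y-axis
    return ax, bx, ay, by

def ratios_to_screen(h, v, coeffs, size=(1920, 1080)):
    """Map one gaze-ratio pair to clamped integer screen coordinates."""
    ax, bx, ay, by = coeffs
    x = min(max(ax * h + bx, 0), size[0] - 1)
    y = min(max(ay * v + by, 0), size[1] - 1)
    return int(x), int(y)

# Example: four calibration targets near the screen corners
coeffs = calibrate(
    ratios=[(0.2, 0.2), (0.8, 0.2), (0.2, 0.8), (0.8, 0.8)],
    points=[(0, 0), (1919, 0), (0, 1079), (1919, 1079)],
)
print(ratios_to_screen(0.5, 0.5, coeffs))  # roughly the screen center
```

The EPOG error from the test phase would then just be the pixel distance between `ratios_to_screen(...)` output and the known test-point coordinates.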
Mainly, this adds a gaze_calibration.py file and completely reworks the previous example.py file.