Shiru99 / AR-Eye-Tracker

Eye Tracking with ARKit: SwiftUI iOS App
https://youtu.be/cjR-mREOJCQ
MIT License

Eye Gaze Tracking with ARKit and ARFaceAnchor

This project demonstrates eye-gaze tracking on a mobile screen using the front camera and ARKit's ARFaceAnchor. By combining the face anchor's transform with its lookAtPoint property, the app estimates where on the device's screen the user is looking.
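As a minimal sketch of how that value can be read, a face-tracking session can report lookAtPoint on every face-anchor update (the GazeTracker type below is an illustrative name, not the project's actual API):

```swift
import ARKit

// Minimal sketch: run a face-tracking session and read lookAtPoint on each
// face-anchor update. GazeTracker is an illustrative name, not the project's.
final class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking needs hardware support; bail out otherwise.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // lookAtPoint is a position in the face anchor's local coordinate
            // space, estimating where the gaze of both eyes converges.
            let localGaze = faceAnchor.lookAtPoint   // simd_float3
            print("lookAtPoint (face space):", localGaze)
        }
    }
}
```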

Prerequisites
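- An iPhone or iPad with a TrueDepth front camera (ARKit face tracking does not run in the Simulator)
- Xcode with an iOS SDK that includes ARKit face tracking
- A runtime check of ARFaceTrackingConfiguration.isSupported before starting the session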

Logic Explanation

The code uses the face anchor's transform to convert the gaze point (lookAtPoint) from the face's local coordinate space into the world coordinate system. Applying the inverse of the camera transform then expresses that point in camera space, whose x/y plane is parallel to the screen. Finally, the resulting coordinates are scaled to screen points and clamped to the screen bounds, yielding the focus point: where the user is looking on the mobile screen.
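A hedged sketch of that mapping in Swift. The focusPoint helper and the physical screen dimensions (screenWidthM, screenHeightM) are assumptions for illustration, not values taken from the project:

```swift
import ARKit
import UIKit

// Hedged sketch of the local -> world -> camera -> screen mapping described
// above. The helper name and the physical screen size are assumptions.
func focusPoint(for faceAnchor: ARFaceAnchor,
                camera: ARCamera,
                viewSize: CGSize) -> CGPoint {
    // 1. Lift lookAtPoint from the face anchor's local space to world space.
    let world = faceAnchor.transform * simd_float4(faceAnchor.lookAtPoint, 1)

    // 2. The inverse camera transform expresses the point in camera space,
    //    whose x/y plane is parallel to the device screen.
    let inCamera = camera.transform.inverse * world

    // 3. Scale metres to screen points. ARKit's camera space is landscape-
    //    oriented, so for a portrait UI its y axis maps to the screen's x
    //    axis (hence the swap). Screen dimensions here are assumed values.
    let screenWidthM: Float = 0.07   // assumed physical screen width (m)
    let screenHeightM: Float = 0.15  // assumed physical screen height (m)
    let x = CGFloat(inCamera.y / (screenWidthM / 2)) * viewSize.width / 2 + viewSize.width / 2
    let y = CGFloat(inCamera.x / (screenHeightM / 2)) * viewSize.height / 2 + viewSize.height / 2

    // 4. Clamp so the focus point stays within the screen bounds.
    return CGPoint(x: min(max(x, 0), viewSize.width),
                   y: min(max(y, 0), viewSize.height))
}
```

On each frame, the returned CGPoint can drive the position of a SwiftUI overlay; smoothing it over the last few samples reduces jitter.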

Limitations of Eye Tracking with ARKit
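lookAtPoint is an uncalibrated estimate: it jitters from frame to frame (most apps smooth it over several recent samples), and accuracy degrades with distance, poor lighting, and eyewear. Face tracking also runs only on devices that support ARFaceTrackingConfiguration.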

Acknowledgements

Contact

For any inquiries or suggestions, please feel free to reach out to Shiru99.


YT Demo: https://youtu.be/cjR-mREOJCQ