Shiru99 / AR-Eye-Tracker

Eye Tracking with ARKit : SwiftUI iOS App
https://youtu.be/cjR-mREOJCQ
MIT License
21 stars 2 forks

Asking for the a_ratio & b_ratio #1

Open zxcheergo opened 2 months ago

zxcheergo commented 2 months ago

Thanks for your wonderful project! I am curious about how you convert the physical size to the screen size. I noticed that you use some special ratios, and the PPI should not be the same as the iPhone XR's. Could you please provide more details about how you calculated these parameters? Thank you in advance!

Shiru99 commented 2 months ago

Please refer to this:

zxcheergo commented 2 months ago

Thanks for your reply! I have a further question. I noticed that you compute screenX with the following code:

let screenX = transformedLookAtPoint.y / (Float(Device.screenSize.width) / 2) * Float(Device.frameSize.width)

Why is screenX calculated from lookAtPoint.y?

Also, what are the origins of screenX and lookAtPoint in their respective coordinate systems? And why do you use half of screenSize.width?
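
For reference, here is the quoted line restated as a self-contained sketch, with the unit conversion spelled out in comments. The Device values and types below are illustrative assumptions, not the project's actual numbers:

import simd
import UIKit

// Illustrative placeholders: physical screen size in metres and logical size in points.
// The project derives its own values per device; these numbers are not the repo's.
enum Device {
    static let screenSize = CGSize(width: 0.064, height: 0.139) // metres
    static let frameSize  = UIScreen.main.bounds.size           // points
}

// transformedLookAtPoint: the gaze point expressed in the camera's coordinate space, in metres.
// The component is divided by half the physical screen width, giving a value in roughly [-1, 1],
// and then scaled by the frame width to obtain screen points.
// (Why the y component is used for the horizontal direction is exactly what this thread asks.)
func screenX(for transformedLookAtPoint: simd_float3) -> Float {
    transformedLookAtPoint.y
        / (Float(Device.screenSize.width) / 2)
        * Float(Device.frameSize.width)
}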

Shiru99 commented 2 months ago

Since the camera is located at the horizontal center of the screen (in portrait orientation), when dealing with the width we only need to consider half of it.
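
As a quick sanity check of that point (with illustrative numbers, not the project's actual values): a gaze offset equal to half the physical width, i.e. at the screen's edge, maps to the full frame width in points.

// Illustrative numbers only: physical screen width 0.064 m, frame width 414 pt.
let physicalWidth: Float = 0.064
let frameWidth: Float = 414

// With the camera at the horizontal centre, the gaze offset along the width
// spans roughly -physicalWidth/2 ... +physicalWidth/2.
let offsetAtEdge = physicalWidth / 2

// Applying the quoted formula: offset / (width / 2) * frameWidth
let screenXAtEdge = offsetAtEdge / (physicalWidth / 2) * frameWidth
// screenXAtEdge == 414, i.e. a gaze point at the physical edge lands at the edge of the frame.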

For the other questions, I will go through the code once I get some free time and add answers here.

zxcheergo commented 2 months ago

Thanks for your reply! About half of the width: can I assume that the origin of the camera coordinate system is at the camera, and that the screen coordinate system's origin is at the top-left corner of the screen?

Following this assumption, if we divide by half of the screen width, should we also multiply by half of the frame size width? Afterwards, we can compensate screenX by adding the offset from the screen coordinate system's origin to the camera. I tried to draw an illustration for this. I think we could calculate screenX from the following equation:

screenX = fix point X / (screen size width / 2) * (frame size width / 2) + compensation

[illustration attached]
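
A sketch of the mapping proposed above, with the compensation term kept as an explicit parameter. All names and numbers here are illustrative, and whether this matches the project's intended convention is exactly what is being asked:

import CoreGraphics

// Proposed formula: screenX = fixPointX / (screenWidth / 2) * (frameWidth / 2) + compensation
// fixPointX          : horizontal gaze offset from the camera, in metres
// screenWidthMeters  : physical screen width, in metres
// frameWidthPoints   : logical screen width, in points
// compensation       : offset (in points) from the screen origin (top-left) to the camera
func proposedScreenX(fixPointX: CGFloat,
                     screenWidthMeters: CGFloat,
                     frameWidthPoints: CGFloat,
                     compensation: CGFloat) -> CGFloat {
    fixPointX / (screenWidthMeters / 2) * (frameWidthPoints / 2) + compensation
}

// Example with illustrative numbers: with the camera at the horizontal centre,
// the compensation along the width would be half the frame width.
let x = proposedScreenX(fixPointX: 0.016,        // 1.6 cm to one side of the camera
                        screenWidthMeters: 0.064,
                        frameWidthPoints: 414,
                        compensation: 414 / 2)
// x == 0.5 * 207 + 207 = 310.5 points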