virakri / eye-tracking-ios-prototype


iPhone 6s doesn't work with it #1

Closed pjcau closed 6 years ago

pjcau commented 6 years ago

Hi,

I'm curious about your example, so I cloned it. Obviously, I'm using iOS 12. I see this log:

```
2018-06-13 10:37:48.224221+0200 Eyes Tracking[383:14693] [DYMTLInitPlatform] platform initialization successful
2018-06-13 10:37:48.297564+0200 Eyes Tracking[383:14672] Metal GPU Frame Capture Enabled
2018-06-13 10:37:48.298064+0200 Eyes Tracking[383:14672] Metal API Validation Enabled
```

and that's OK. Maybe the problem is in the declaration of the iPhone X screen dimensions:

```swift
// actual physical size of iPhoneX screen
let phoneScreenSize = CGSize(width: 0.0623908297, height: 0.135096943231532)

// actual point size of iPhoneX screen
let phoneScreenPointSize = CGSize(width: 375, height: 812)
```

I understand the phoneScreenPointSize variable, but I can't figure out how phoneScreenSize is calculated. Can you write out the algorithm, so I can calculate it for the iPhone 6s?

J

ravichokshi commented 6 years ago

@pjcau : it will only work with the iPhone X.

pjcau commented 6 years ago

But is there a limitation in the API? Can you tell me what it is?

indevizible commented 6 years ago

@pjcau

> Augmented Reality with the Front Camera
> On iPhone X, ARFaceTrackingConfiguration uses the front-facing TrueDepth camera to provide real-time information about the pose and expression of the user's face for you to use in rendering virtual content. For example, you might show the user's face in a camera view and provide realistic virtual masks. You can also omit the camera view and use ARKit facial expression data to animate virtual characters, as seen in the Animoji app for iMessage.

https://developer.apple.com/documentation/arkit


So phoneScreenSize is calculated from the actual screen size in meters. You can also calculate it yourself if you have the "points per inch" or "points per centimeter" from this extension; then I think you know what to do next.
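To make that concrete, here is a minimal sketch of the conversion (the helper name is mine, and the per-device numbers are Apple's published panel specs, not values from this repo):

```swift
import UIKit

// Physical screen size in meters, derived from point size, scale, and PPI.
// iOS has no public API that reports PPI, so it must be supplied per device.
func physicalScreenSize(pointSize: CGSize, scale: CGFloat, ppi: CGFloat) -> CGSize {
    let metersPerInch: CGFloat = 0.0254
    return CGSize(width: pointSize.width * scale / ppi * metersPerInch,
                  height: pointSize.height * scale / ppi * metersPerInch)
}

// iPhone X: 375 x 812 points, @3x, 458 ppi
// -> (0.0623908..., 0.1350969...), which reproduces the constants in the project.
let iPhoneXSize = physicalScreenSize(pointSize: CGSize(width: 375, height: 812),
                                     scale: 3, ppi: 458)

// iPhone 6s: 375 x 667 points, @2x, 326 ppi -> (0.0584..., 0.1039...).
// (The 6s still lacks a TrueDepth camera, so face tracking won't run on it.)
let iPhone6sSize = physicalScreenSize(pointSize: CGSize(width: 375, height: 667),
                                      scale: 2, ppi: 326)
```

Plugging in the iPhone X numbers reproduces the repo's constants exactly, which suggests this is how they were derived.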

virakri commented 6 years ago

Hi @pjcau ,

Thank you for the comment and for trying out this prototype.

This prototype uses the face-based ARKit API to track the position and orientation of the face and eyeballs. Unlike other OpenCV-style face-tracking frameworks, this API requires a device with a TrueDepth camera to enable ARFaceTrackingConfiguration, and as of now (June 14, 2018) only the iPhone X is equipped with that camera.
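In code, the usual guard for this is the isSupported check; a minimal sketch (the helper and its sceneView parameter are illustrative, not the repo's exact code):

```swift
import ARKit

// Guard on device support before starting face tracking.
func startFaceTracking(on sceneView: ARSCNView) {
    // isSupported is false on devices without a TrueDepth camera
    // (e.g., iPhone 6s), regardless of the iOS version installed.
    guard ARFaceTrackingConfiguration.isSupported else {
        print("Face tracking is not supported on this device.")
        return
    }
    let configuration = ARFaceTrackingConfiguration()
    configuration.isLightEstimationEnabled = true
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```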

Further information about ARFaceTrackingConfiguration

I will be working on this repository's README to make this prototype clearer for everyone.

Best, V

kavishjuneja11 commented 5 years ago

@virakri : How did you come up with : width: 0.0623908297, height: 0.135096943231532 ?

RolandColored commented 5 years ago

> @virakri : How did you come up with : width: 0.0623908297, height: 0.135096943231532 ?

It's the physical screen size in meters.
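Assuming Apple's published iPhone X panel specs (1125 × 2436 pixels at 458 ppi), the arithmetic works out exactly: width = 1125 / 458 × 0.0254 ≈ 0.0623908 m and height = 2436 / 458 × 0.0254 ≈ 0.1350969 m, matching the constants above.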

Atinder1989 commented 5 years ago

eyePositionIndicatorView is showing at the wrong position on iPad Pro with iOS 12.
I am using this:

```swift
let phoneScreenSize = CGSize(width: UIScreen.main.bounds.width * (0.0166 / 100),
                             height: UIScreen.main.bounds.height * (0.0166 / 100))

let phoneScreenPointSize = CGSize(width: UIScreen.main.bounds.width,
                                  height: UIScreen.main.bounds.height)
```

Any help would be greatly appreciated.
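One likely cause, assuming the published iPad Pro specs: 0.0166 / 100 ≈ 0.000166 m is the iPhone X's physical size per point (0.0623908297 / 375), and iPad points are physically larger because the panel density is lower (264 ppi at @2x versus 458 ppi at @3x). A sketch of the per-device value for a 12.9-inch iPad Pro:

```swift
import UIKit

// 12.9" iPad Pro (published specs, assumed here): 1024 x 1366 points, @2x, 264 ppi.
let padPoints = CGSize(width: 1024, height: 1366)
let padMetersPerPoint: CGFloat = 2 / 264 * 0.0254      // ≈ 0.000192 m, vs ≈ 0.000166 m on iPhone X
let padScreenSize = CGSize(width: padPoints.width * padMetersPerPoint,   // ≈ 0.197 m
                           height: padPoints.height * padMetersPerPoint) // ≈ 0.263 m
```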