MinjingLin opened this issue 5 years ago
The ELG_i6036_f6036_n32_m2 model is trained on the UnityEyes dataset. If you want to evaluate on the MPIIGaze dataset, see the cross-dataset evaluation section in the paper.
Also, estimate_gaze_from_landmarks is not used for inference; maybe you can try this gaze output to calculate the real gaze.
@MinjingLin Have you solved the problem? I want to test the model for gaze estimation too, but I can't find the code. Besides, I guess the "angular error" used in the paper is the "mean angular error", which would be 8.8 / 2 = 4.4? I'm not sure about that.
Update: I found how to compute the angular error (gaze estimation error) in this paper: He, Z., Spurr, A., Zhang, X., & Hilliges, O. (2019). Photo-Realistic Monocular Gaze Redirection Using Generative Adversarial Networks. In Proc. ICCV, pp. 6932–6941 (see page 5).
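For reference, the angular error described there is just the angle between the ground-truth and predicted 3D gaze vectors. A minimal sketch (the function name `angular_error` is my own, not from the GazeML codebase):

```python
import numpy as np

def angular_error(g_true, g_pred):
    """Angle in degrees between two 3D gaze direction vectors."""
    g_true = np.asarray(g_true, dtype=float)
    g_pred = np.asarray(g_pred, dtype=float)
    cos_sim = np.dot(g_true, g_pred) / (np.linalg.norm(g_true) * np.linalg.norm(g_pred))
    # Clip to guard against floating-point values slightly outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos_sim, -1.0, 1.0)))
```

For example, `angular_error([0, 0, 1], [0, 1, 0])` gives 90 degrees; averaging this over all test samples yields the mean angular error reported in such papers.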
Hello, I used the ELG_i6036_f6036_n32_m2 model to get landmarks and the method estimate_gaze_from_landmarks in models.elg to get the gaze angle. Then I tested on MPIIGaze data selected from the Evaluation Set; the ground truth and eye images are in the normalized space. I got a mean angular error of 9.56 degrees. It seems the model-based method does not perform well on MPIIGaze. Does the code only include model-based gaze estimation? I can't find the SVR model trained on MPIIGaze. Maybe I need to train an SVR model on MPIIGaze data, which would perform better than the model-based one? Did you manually annotate eyelid and iris landmarks in MPIIGaze to train the SVR?
In case I made a wrong evaluation, I post one MPIIGaze normalized sample tested with GazeML. For the pitch angle, up is positive; for the yaw angle, left is positive.
img_name: p00/day14/0326.jpg
eye_side: left
3D gaze_vector: -0.01114 0.25135 0.9678
true gaze angle (degrees) [pitch, yaw]: [-14.5578 0.6596]
GazeML predicted gaze angle (degrees) [pitch, yaw]: [-5.7414 0.9097]
angular error (degrees): 8.8198