hysts / pytorch_mpiigaze

An unofficial PyTorch implementation of MPIIGaze and MPIIFaceGaze
MIT License
346 stars 86 forks

Can you tell me how to use the model? #6

Closed BlairLeng closed 4 years ago

BlairLeng commented 5 years ago

After I trained the model, how can I use it? Do I just capture video and feed it to the model? I read your paper, which says that both the image and the head pose angle are required as input. Could you please tell me more about that? After training the model, what should I do next?

Thank you very much!

WuZhuoran commented 5 years ago

+1

lucaskyle commented 5 years ago

Plot the results of your trained network.

lucaskyle commented 5 years ago

Actually, they feed the head pose into the model as an extra input to the final layer:

self.fc2 = nn.Linear(502, 2)

So when you want to plot the results, you need to feed both the face images and the head pose into the network.
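To illustrate the point above, here is a minimal sketch (not the repo's actual code; the class and variable names are hypothetical) of how a 2-dim head pose vector can be concatenated with a 500-dim image feature, which is why the final layer is `nn.Linear(502, 2)`:

```python
import torch
import torch.nn as nn


class GazeHead(nn.Module):
    """Hypothetical final stage: image features + head pose -> gaze angles."""

    def __init__(self, feature_dim: int = 500):
        super().__init__()
        # 2 extra inputs for the head pose angles, hence 502 -> 2
        self.fc2 = nn.Linear(feature_dim + 2, 2)

    def forward(self, features: torch.Tensor, head_pose: torch.Tensor) -> torch.Tensor:
        # features: (N, 500) image features from the conv backbone
        # head_pose: (N, 2) head pose angles
        x = torch.cat([features, head_pose], dim=1)  # (N, 502)
        return self.fc2(x)  # predicted gaze as (pitch, yaw)


# Dummy inputs standing in for a real image feature and head pose
features = torch.randn(1, 500)
head_pose = torch.randn(1, 2)
gaze = GazeHead()(features, head_pose)
print(tuple(gaze.shape))  # (1, 2)
```

At inference time this means a cropped eye/face image alone is not enough: the head pose estimated for that frame has to be passed alongside the image features.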

hysts commented 4 years ago

Sorry for the late reply. Now that I've added the demo program, you can use it to visualize the gaze estimation results. See https://github.com/hysts/pytorch_mpiigaze#demo

This is also a stand-alone demo program. https://github.com/hysts/pytorch_mpiigaze_demo