Closed pjl1995 closed 3 years ago
Thank you. The tracker predicts a 2D location on a display, so it is hard to visualize the output in a different context. You could, for example, draw a rectangle representing the phone screen and place circles at the XY positions the tracker outputs. It is also possible that you are actually looking for a 3D gaze tracker; with that you can draw an arrow over the person's face. We have one such tracker here: http://gaze360.csail.mit.edu/ with a demo here: https://colab.research.google.com/drive/1AUvmhpHklM9BNt0Mn5DjSo3JRuqKkU4y
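The rectangle-and-circles idea above could be sketched with matplotlib roughly as follows. This is only an illustration, not part of the repo: the screen dimensions and the sample gaze points are made up, and the coordinate convention (cm from the screen's top-left corner) is an assumption you should adjust to match the tracker's actual output.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt

# Assumed phone screen size in cm (hypothetical values).
phone_w, phone_h = 7.0, 14.0

# Made-up sample points standing in for real tracker XY output,
# assumed to be in cm relative to the top-left of the screen.
gaze_xy = [(2.1, 3.4), (3.0, 5.2), (4.5, 9.8), (2.8, 11.0)]

fig, ax = plt.subplots(figsize=(3, 6))
# Rectangle representing the phone screen.
ax.add_patch(plt.Rectangle((0, 0), phone_w, phone_h, fill=False, linewidth=2))
# One circle per predicted gaze point.
for x, y in gaze_xy:
    ax.add_patch(plt.Circle((x, y), radius=0.3, alpha=0.6))
ax.set_xlim(-1, phone_w + 1)
ax.set_ylim(phone_h + 1, -1)  # flip y so the origin sits at the top-left
ax.set_aspect("equal")
ax.set_title("Predicted gaze points on screen")
fig.savefig("gaze_on_phone.png")
```

Feeding the tracker's per-frame predictions into `gaze_xy` instead of the hard-coded list would give a quick qualitative view of where on the screen the model thinks the user is looking.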
This work is great and interesting! I have run the PyTorch code on Linux, but I didn't find any visualization. Is there any code or app for Windows/Linux/iOS/Android so that I can get an intuitive feel for the output? Thanks!