-
When Ctrl+F is pressed, we need to detect which part of the code editor is occluded. These events should be written to the gaze output so that we can use this information during post-processing. We…
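A minimal sketch of what writing such events into the gaze output could look like. The record shape, field names, and the `(x, y, width, height)` region format are assumptions for illustration, not the project's actual schema:

```python
import json
import time

def make_event(kind, occluded_region, timestamp=None):
    """Build a UI-event record for the gaze stream (hypothetical schema).

    `occluded_region` is the editor rectangle hidden by the overlay,
    as (x, y, width, height) in editor pixel coordinates, or None.
    """
    return {
        "type": "ui_event",
        "kind": kind,  # e.g. "find_opened", "find_closed" (assumed names)
        "occluded_region": occluded_region,
        "timestamp": timestamp if timestamp is not None else time.time(),
    }

def write_event(stream, event):
    """Append one event as a JSON line to the gaze output stream."""
    stream.write(json.dumps(event) + "\n")
```

During post-processing, gaze samples whose timestamps fall between a `find_opened` and `find_closed` event and land inside `occluded_region` could then be flagged as occluded.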
-
We need to write our GitBook for the physiological data. For this deliverable, we need a step-by-step guide covering these items:
* What is the system?
* What is it for?
* What are the dependencie…
-
### Original issue opened by:
@camnewnham
___
## Describe the bug
When the app has gaze permission enabled but the user is uncalibrated (declines the prompt to calibrate), the Gaze controller remains …
-
Main Tasks:
- [x] Read basic emotions
- [x] Basic eye tracking
- [x] Determine whether or not the user is looking at the screen from eye-tracking data
- [ ] Add new emotion data sets for confusion, bor…
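The "is the user looking at the screen" check above could be sketched as a bounds test on normalized gaze coordinates. The function names, the [0, 1] normalization, and the jitter `margin` are assumptions for illustration, not the project's actual implementation:

```python
def is_looking_at_screen(gaze_x, gaze_y, margin=0.05):
    """True when a gaze sample lands on the display.

    gaze_x / gaze_y are assumed normalized to [0, 1] over the display;
    `margin` widens the bounds slightly to absorb tracker jitter.
    """
    return (-margin <= gaze_x <= 1.0 + margin
            and -margin <= gaze_y <= 1.0 + margin)

def fraction_on_screen(samples, margin=0.05):
    """Share of (x, y) gaze samples that land on the screen."""
    if not samples:
        return 0.0
    hits = sum(is_looking_at_screen(x, y, margin) for x, y in samples)
    return hits / len(samples)
```

A per-sample test like this is noisy; in practice one would likely smooth over a short window (e.g. require most samples in the last second to be on-screen) before deciding.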
-
Thank you for your Open Source contribution. We are working on a similar solution and would like to ask how to obtain the license. Thank you.
-
- [ ] **Pangram viewer:** View pangrams in selected fonts. Possibly identify characters that aren't included in the font's glyphset. [reddit thread](https://www.reddit.com/r/typography/comments/4r9h7p…
-
For people on Google Cardboard, or other viewers, who want to force-enable stereoscopic mode.
-
![image001](https://user-images.githubusercontent.com/1855315/96175897-6717ac80-0ee0-11eb-90c0-1aa6bcfc1443.png)
Every video created with the MVR software has tracking information in the first frame (…
-
**Is your feature request related to a problem? Please describe.**
Eye contact and other non-verbal language is very important for holding smooth conversations. Spatial audio helps with some non-verbal cues …
-
Would it be possible to commit your model eye_model.hd5 and synsets.pkl, along with the dataset that you used? I would like to reproduce what you have done for a project.
Many thanks