-
Hi! I am trying to figure out if my participants are looking at the screen or not. For this, I will need to manually decide where the screen actually is (the camera may be above/below, etc.), which can…
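For what it's worth, a common way to handle this is to measure the screen's corners once in the camera's coordinate frame, then intersect each gaze ray with that rectangle. Below is a minimal C++ sketch of the geometry; the function and variable names are mine, and the corner measurements are assumed input, not anything from this post:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

// Returns true if the gaze ray (eye + t * dir, camera coordinates) hits the
// screen rectangle with corner c0 and perpendicular edge vectors eu, ev.
bool gazeOnScreen(Vec3 eye, Vec3 dir, Vec3 c0, Vec3 eu, Vec3 ev) {
    Vec3 n = cross(eu, ev);                     // screen plane normal
    double denom = dot(n, dir);
    if (std::fabs(denom) < 1e-9) return false;  // ray parallel to the screen
    double t = dot(n, sub(c0, eye)) / denom;
    if (t <= 0) return false;                   // screen is behind the eye
    Vec3 p = {eye.x + t * dir.x, eye.y + t * dir.y, eye.z + t * dir.z};
    Vec3 d = sub(p, c0);
    double u = dot(d, eu) / dot(eu, eu);        // normalized position along width
    double v = dot(d, ev) / dot(ev, ev);        // normalized position along height
    return u >= 0 && u <= 1 && v >= 0 && v <= 1;
}
```

Here `c0` is one screen corner and `eu`, `ev` are the edge vectors to the two adjacent corners, all measured in the same coordinate frame as the gaze ray.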
-
Hello!
I have just started working with `openvr`. My system is as follows:
* HTC Vive Pro Eye
* framework: NVidia OWL-OptiX-CUDA
* C++ with Visual Studio 2019
For my C++ application, I need th…
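Since the request is cut off above, here is only a generic sketch of bringing up `openvr` in C++ and reading the HMD pose; the choice of `VRApplication_Background` (attach to a running SteamVR session without rendering) is my assumption, and a full rendering app would use `VRApplication_Scene` instead:

```cpp
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem *sys = vr::VR_Init(&err, vr::VRApplication_Background);
    if (err != vr::VRInitError_None) {
        std::printf("VR_Init failed: %s\n",
                    vr::VR_GetVRInitErrorAsEnglishDescription(err));
        return 1;
    }
    // Query the current pose of every tracked device, with no prediction.
    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    sys->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseStanding, 0.0f,
                                         poses, vr::k_unMaxTrackedDeviceCount);
    const vr::TrackedDevicePose_t &hmd = poses[vr::k_unTrackedDeviceIndex_Hmd];
    if (hmd.bPoseIsValid) {
        // The last column of the 3x4 matrix is the HMD position in meters.
        const vr::HmdMatrix34_t &m = hmd.mDeviceToAbsoluteTracking;
        std::printf("HMD position: %.3f %.3f %.3f\n",
                    m.m[0][3], m.m[1][3], m.m[2][3]);
    }
    vr::VR_Shutdown();
    return 0;
}
```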
-
I tried to access the Panoptes web browser today, and found that it is being decommissioned.
I gazed upon its ruin, and I wept.
I gazed upon it some more, and I did wail, and beat my chest, and…
-
The landing position of the last saccade of each epoch is incorrect. This seems to be due to a bug in the function `detecteyemovements.m` at line 360:
`if endsmp(end) > size(gazexy,2), endsmp(end) = si…
-
The gaze implementation is incorrect.
How it works (see the sketch after this list):
* On Applied Gaze: Snapshot the threat and position of the gazed person.
* On Gaze End: If the threat is higher than the snapshot or the position is further …
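If it helps, this is how I read that description; a minimal C++ sketch of the snapshot-and-compare logic, where all names and the distance threshold are made up for illustration:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

struct GazeState {
    float snapshotThreat = 0.0f;  // threat at the moment gaze was applied
    Vec3  snapshotPos{};          // position at the moment gaze was applied
};

static float distance(Vec3 a, Vec3 b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// On applied gaze: remember the gazed person's threat and position.
void applyGaze(GazeState &s, float currentThreat, Vec3 currentPos) {
    s.snapshotThreat = currentThreat;
    s.snapshotPos = currentPos;
}

// On gaze end: fire only if threat rose above the snapshot or the person
// moved further than some threshold (the 5.0f default is an assumption).
bool endGazeTriggers(const GazeState &s, float currentThreat, Vec3 currentPos,
                     float maxDistance = 5.0f) {
    return currentThreat > s.snapshotThreat ||
           distance(s.snapshotPos, currentPos) > maxDistance;
}
```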
-
Listing datasets that contain eyetracking data that could either be used as examples for the BEP or as input test data for an eventual eyetracking converter.
- Sleep study: https://openneuro.org/da…
-
We built a 3D scene in Unity and then used the HTC VIVE Pro Eye, a virtual reality headset, to obtain eye movement data as follows.
![QQ图片20230405144401](https://user-images.githubusercontent.…
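For context, eye data on the Vive Pro Eye is usually read through HTC's SRanipal SDK rather than plain OpenVR. A rough C++ sketch of the polling loop, modeled on the SDK's sample code; the exact header names and struct fields should be double-checked against your SDK version, as they are recalled here rather than taken from this post:

```cpp
#include <SRanipal.h>
#include <SRanipal_Eye.h>
#include <SRanipal_Enums.h>
#include <cstdio>

int main() {
    // Start the eye-tracking runtime (SR_Runtime must already be running).
    int err = ViveSR::anipal::Initial(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE, NULL);
    if (err != ViveSR::Error::WORK) {
        std::printf("SRanipal init failed: %d\n", err);
        return 1;
    }
    ViveSR::anipal::Eye::EyeData eye_data;
    for (int i = 0; i < 100; ++i) {
        // Poll one eye-data sample; WORK means a valid frame was returned.
        if (ViveSR::anipal::Eye::GetEyeData(&eye_data) == ViveSR::Error::WORK) {
            const auto &left = eye_data.verbose_data.left;
            std::printf("left gaze dir: %.3f %.3f %.3f  openness: %.2f\n",
                        left.gaze_direction_normalized.x,
                        left.gaze_direction_normalized.y,
                        left.gaze_direction_normalized.z,
                        left.eye_openness);
        }
    }
    ViveSR::anipal::Release(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE);
    return 0;
}
```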
-
Thank you for the great work!
I have two questions about the data.
1. In both "fixation.csv" and "gaze.csv" there are columns named "xmin_shown_from_image / ymin_shown_from_image / xmax_shown_f…
-
Hi! As the FAQ page is down, I hope it is OK to ask a question here. I want to train my own model but don't have 10 calibration coordinates with fMRI scans; we unfortunately only started capturing fMRI…
-
Hi Tadas, thank you for developing such an important and useful toolkit.
I have a few questions about what exactly the "world coordinates" that OpenFace outputs in are, and how to modify this to gaze rel…
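As far as I understand, OpenFace reports gaze in the camera's coordinate frame (OpenCV convention: x right, y down, z away from the camera), so mapping it into another frame, e.g. one attached to the screen, is a rigid transform. A minimal sketch, where the rotation `R` and offset `t` from camera to screen are assumptions you would measure or calibrate yourself:

```cpp
struct Vec3 { double x, y, z; };
struct Mat3 { double m[3][3]; };

static Vec3 mul(const Mat3 &R, Vec3 v) {
    return {R.m[0][0] * v.x + R.m[0][1] * v.y + R.m[0][2] * v.z,
            R.m[1][0] * v.x + R.m[1][1] * v.y + R.m[1][2] * v.z,
            R.m[2][0] * v.x + R.m[2][1] * v.y + R.m[2][2] * v.z};
}

// R rotates camera-frame vectors into the screen frame; t is the camera
// origin expressed in the screen frame (hypothetical calibration values).
// Directions (gaze vectors) only rotate; points (eye positions) also translate.
Vec3 gazeDirToScreen(const Mat3 &R, Vec3 gazeDirCam) {
    return mul(R, gazeDirCam);
}
Vec3 pointToScreen(const Mat3 &R, Vec3 t, Vec3 pCam) {
    Vec3 r = mul(R, pCam);
    return {r.x + t.x, r.y + t.y, r.z + t.z};
}
```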