I have implemented a different thing for now. Any interval between two saccades is low-pass filtered (cutoff 20 Hz). That should kill any high-frequency noise and leave only a smooth curve. For that curve, I compute the maximum amplitude between the start coordinate and any coordinate in the interval. If that amplitude exceeds a threshold of 0.7 deg, the interval is considered a pursuit event, otherwise it is a fixation.
That looks good for the data I have tried so far. The 0.7 deg threshold is the calibration uncertainty for the in-scanner acquisition that we reported in the paper -- so it is not completely conjured out of thin air.
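To make this concrete, here is a rough sketch of the decision rule. The function and parameter names are made up for illustration, and the filter details (4th-order Butterworth via filtfilt) are just one way to do the low-pass -- this is not the actual code.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def classify_intersaccadic_interval(x, y, sampling_rate=1000.0,
                                    cutoff=20.0, amp_threshold=0.7):
    """Label the samples between two saccades as pursuit or fixation.

    x, y: gaze coordinates in degrees; cutoff in Hz; amp_threshold in degrees
    (0.7 deg = the calibration uncertainty reported for the in-scanner data).
    """
    # low-pass filter to kill high-frequency noise and keep a smooth curve
    b, a = butter(4, cutoff / (sampling_rate / 2.0), btype='low')
    xs = filtfilt(b, a, x)
    ys = filtfilt(b, a, y)
    # maximum amplitude between the start coordinate and any coordinate
    # in the interval
    amp = np.max(np.sqrt((xs - xs[0]) ** 2 + (ys - ys[0]) ** 2))
    return 'pursuit' if amp > amp_threshold else 'fixation'
```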
I set out to look for ideas on how to detect smooth pursuit. I found a paper by Larsson (also together with Nyström) that describes a Matlab-based algorithm: https://ac.els-cdn.com/S1746809414002031/1-s2.0-S1746809414002031-main.pdf?_tid=2f59fb52-3fa6-4579-ac3b-c590f3f10abe&acdnat=1535098835_0894ae76c22c47c2e79d3b67844b8f8f
I also found a paper by Agtzidis et al. (http://delivery.acm.org/10.1145/2860000/2857521/p303-agtzidis.pdf?ip=141.44.98.70&id=2857521&acc=ACTIVE%20SERVICE&key=2BA2C432AB83DA15%2E88D216EC9FFA262E%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35&__acm__=1535102822_e5cd2e4bf831a77c5d3f6d152378af8a). Their algorithm uses data from multiple subjects watching dynamic stimuli to detect similar gaze patterns that are neither saccades nor fixations. If several people show movements that are neither saccades nor fixations, these movements are likely to be pursuit. Their implementation is publicly available in Python here: michaeldorr.de/smoothpursuit/sp_tool.zip (I'm not sure whether their approach of combining data from several subjects after saccade and fixation detection could be easily integrated into the current way the algorithm works.)
Larsson developed an algorithm to classify fixations and smooth pursuit in eye tracking data when dynamic stimuli are used. A reimplementation of this algorithm in Matlab has been made publicly available by Agtzidis & Startsev here: michaeldorr.de/smoothpursuit/larsson_reimplementation.zip
The Matlab code has the following steps:
The four parameters are compared to "individual thresholds", resulting in one criterion per parameter (I haven't yet understood where these thresholds derive from; in the implementation, default values are given). If none of the criteria are satisfied, the segment is classed as a fixation. If 1-3 criteria are satisfied, the segment is labeled "uncertain". If all 4 criteria are satisfied, the segment is classed as smooth pursuit.
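For my own understanding, this is roughly what that labelling amounts to (the function name and the boolean list are my own shorthand, not the reimplementation's data structures):

```python
def label_segment(criteria):
    """criteria: four booleans, one per parameter/threshold comparison."""
    n = sum(criteria)
    if n == 0:
        return 'fixation'
    if n == 4:
        return 'smooth_pursuit'
    return 'uncertain'  # 1-3 criteria satisfied
```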
All segments in the "uncertain" category are evaluated again on criterion 3 (relating to positional displacement, "the most typical feature of a smooth pursuit movement compared to a fixation"). If it is satisfied, the spatial range is recalculated by adding the spatial ranges of other smooth pursuit segments in the intersaccadic interval whose direction is comparable to that of the uncertain segment (based on a threshold phi). If the range is larger than a threshold (default given in the reimplementation), the segment is classified as smooth pursuit, otherwise as a fixation. If criterion 3 is not satisfied, criterion 4 is evaluated: if criterion 4 is not satisfied, the segment is classified as a fixation.
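And the re-evaluation of the uncertain segments, again only as a sketch: the helpers (direction as the angle of the overall displacement, spatial range as the bounding-box diagonal of the pooled samples) and the default values for phi and the range threshold are my guesses, and the final branch (criterion 4 satisfied -> smooth pursuit) is an assumption that isn't spelled out above.

```python
import numpy as np

def direction(samples):
    """Angle (rad) of the overall displacement of an Nx2 array of gaze samples."""
    dx, dy = samples[-1] - samples[0]
    return np.arctan2(dy, dx)

def spatial_range(segments):
    """Bounding-box diagonal (deg) of all samples pooled across segments."""
    pooled = np.vstack(segments)
    return np.linalg.norm(pooled.max(axis=0) - pooled.min(axis=0))

def reevaluate_uncertain(segment, pursuit_segments, criterion3, criterion4,
                         phi=np.deg2rad(45), range_threshold=1.0):
    """Resolve an 'uncertain' segment (Nx2 array, deg) into fixation or pursuit."""
    if criterion3:
        # pool with pursuit segments of comparable direction
        # (ignores angle wrap-around for brevity)
        similar = [s for s in pursuit_segments
                   if abs(direction(s) - direction(segment)) < phi]
        merged_range = spatial_range([segment] + similar)
        return 'smooth_pursuit' if merged_range > range_threshold else 'fixation'
    if not criterion4:
        return 'fixation'
    return 'smooth_pursuit'  # assumption: the remaining case ends up as pursuit
```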