An adaptive algorithm for fast and reliable online saccade detection
To investigate visual perception around the time of eye movements, vision scientists manipulate stimuli contingent upon the onset of a saccade. For these experimental paradigms, timing is especially crucial, because saccade offset imposes a deadline on the display change. Although efficient online saccade detection can greatly improve timing, most algorithms rely on spatial-boundary techniques or absolute-velocity thresholds, which both suffer from weaknesses: late detections and false alarms, respectively. We propose an adaptive, velocity-based algorithm for online saccade detection that surpasses both standard techniques in speed and accuracy and allows the user to freely define the detection criteria. Inspired by the Engbert–Kliegl algorithm for microsaccade detection, our algorithm computes two-dimensional velocity thresholds from variance in the preceding fixation samples, while compensating for noisy or missing data samples. An optional direction criterion limits detection to the instructed saccade direction, further increasing robustness. We validated the algorithm by simulating its performance on a large saccade dataset and found that high detection accuracy (false-alarm rates of < 1%) could be achieved with detection latencies of only 3 ms. High accuracy was maintained even under simulated high-noise conditions. To demonstrate that purely intrasaccadic presentations are technically feasible, we devised an experimental test in which a Gabor patch drifted at saccadic peak velocities. Whereas this stimulus was invisible when presented during fixation, observers reliably detected it during saccades. Photodiode measurements verified that—including all system delays—the stimuli were physically displayed on average 20 ms after saccade onset. Thus, the proposed algorithm provides a valuable tool for gaze-contingent paradigms.
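The core idea described above (an elliptic velocity threshold scaled by the noise of fixation samples, in the spirit of the Engbert–Kliegl microsaccade algorithm) can be sketched as follows. This is a hypothetical minimal illustration, not the reference implementation: the function name, parameter defaults, and the consecutive-sample criterion are illustrative, and for brevity the thresholds are estimated from the whole trace at once, whereas the online algorithm updates them from the preceding fixation samples.

```python
import numpy as np

def detect_saccade(x, y, dt, lam=5.0, min_consecutive=3):
    """Illustrative sketch of adaptive velocity-based saccade detection.

    x, y : gaze position samples (deg); dt : sampling interval (s).
    lam  : threshold multiplier in units of the velocity noise level.
    """
    # two-point central difference: velocity at sample i+1 from samples i and i+2
    vx = (x[2:] - x[:-2]) / (2.0 * dt)
    vy = (y[2:] - y[:-2]) / (2.0 * dt)
    # median-based (outlier-robust) estimate of the velocity noise level
    sx = np.sqrt(np.median(vx**2) - np.median(vx)**2)
    sy = np.sqrt(np.median(vy**2) - np.median(vy)**2)
    # elliptic threshold: lam noise units in each dimension
    above = (vx / (lam * sx))**2 + (vy / (lam * sy))**2 > 1.0
    # require several consecutive supra-threshold samples to suppress false alarms
    run = 0
    for i, hit in enumerate(above):
        run = run + 1 if hit else 0
        if run >= min_consecutive:
            return i + 1  # position-sample index at which the saccade is flagged
    return None
```

Because the threshold is derived from the data's own noise rather than fixed in absolute units, the same criterion adapts to quiet and noisy recordings alike, which is what distinguishes this approach from absolute-velocity thresholds.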
Keywords: Saccade detection, Eye movements, Intrasaccadic perception, Gaze-contingent presentation
We thank Ralf Engbert, Konstantin Mergenthaler, Petra Sinn, and Hans Trukenbrod for making the code of their microsaccade detection toolbox publicly available, and Nicolas Devillard for his excellent ANSI C implementations and comparisons of different median search algorithms (http://ndevilla.free.fr/median/median/index.html). R.S. was supported by the Studienstiftung des deutschen Volkes and the Berlin School of Mind and Brain. M.R. was supported by the Deutsche Forschungsgemeinschaft (DFG, grants RO3579/2-1, RO3579/8-1, and RO3579/10-1).
Implementations of the proposed algorithm in C, Python, and Matlab are available on GitHub: https://github.com/richardschweitzer/OnlineSaccadeDetection.
Open practices statement
The saccade data and code used for simulations, data collected throughout the experimental test, experimental code, and data analysis scripts are available on the Open Science Framework: https://osf.io/3pck5/. The experimental test was not preregistered.
R.S. implemented the algorithm and ran the simulations. The validation procedure was conceived by R.S. and M.R. The experimental test was designed, run, and analyzed by R.S. under M.R.’s supervision. R.S. drafted the manuscript, and M.R. provided critical revisions.
- Castet, E. (2010). Perception of intra-saccadic motion. In U. J. Ilg & G. S. Masson (Eds.), Dynamics of visual motion processing (pp. 213–238). Berlin, Germany: Springer.
- Engbert, R., Rothkegel, L., Backhaus, D., & Trukenbrod, H. A. (2016). Evaluation of velocity-based saccade detection in the SMI-ETG 2W system [Technical Report]. Retrieved from http://read.psych.uni-potsdam.de/attachments/article/156/TechRep-16-1-Engbert.pdf
- Hollingworth, A., Richard, A. M., & Luck, S. J. (2008). Understanding the function of visual short-term memory: Transsaccadic memory, object correspondence, and gaze correction. Journal of Experimental Psychology: General, 137, 163–181. doi: https://doi.org/10.1037/0096-3445.137.1.163
- Kleiner, M., Brainard, D., & Pelli, D. (2007). What’s new in Psychtoolbox-3? Perception, 36(ECVP Abstract Suppl), 14.
- Panouillères, M. T., Gaveau, V., Debatisse, J., Jacquin, P., LeBlond, M., & Pélisson, D. (2016). Oculomotor adaptation elicited by intra-saccadic visual stimulation: Time-course of efficient visual target perturbation. Frontiers in Human Neuroscience, 10, 91. doi: https://doi.org/10.3389/fnhum.2016.00091
- Press, W. H., Teukolsky, S. A., Vetterling, W. T., & Flannery, B. P. (2007). Numerical recipes: The art of scientific computing (3rd ed.). Cambridge, UK: Cambridge University Press.
- SR Research. (2005). EyeLink II user manual, version 2.14. Mississauga, ON: SR Research Ltd.
- SR Research. (2010). EyeLink 1000 user manual, version 1.5.2. Mississauga, ON: SR Research Ltd.
- SR Research. (2013). EyeLink 1000 plus user manual, version 1.0.12. Mississauga, ON: SR Research Ltd.
- Tobii Technology AB. (2010). Timing guide for Tobii eye trackers and eye tracking software [Technical Report]. Retrieved from https://www.tobiipro.com/siteassets/tobii-pro/learn-and-support/design/eye-tracker-timing-performance/tobii-eye-tracking-timing.pdf
- Townsend, J. T., & Ashby, F. G. (1978). Methods of modeling capacity in simple processing systems. In N. J. Castellan Jr. & F. Restle (Eds.), Cognitive theory (Vol. 3, pp. 199–239). New York, NY: Erlbaum.
- Townsend, J. T., & Ashby, F. G. (1983). Stochastic modeling of elementary psychological processes. Cambridge, UK: Cambridge University Press.
- VPixx Technologies. (2017). TRACKPIXX3 hardware manual version 1.0. Saint-Bruno, QC: VPixx Technologies Inc.