
Towards Pupil-Assisted Target Selection in Natural Settings: Introducing an On-Screen Keyboard

  • Christoph Strauch
  • Lukas Greiter
  • Anke Huckauf
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10515)

Abstract

Preliminary reports have shown that input commands in HCI can be assisted via pupil dilation. The applicability of these findings requires further investigation, however, since changes in pupil diameter have low specificity: the pupil also responds, for example, to variations in brightness. To investigate the usability and shape of pupil size dynamics outside a strictly controlled laboratory, we implemented an emulated selection via an integrated mechanism of pupil dilation and constriction that could shorten a dwell time of 1.5 s. Operating an on-screen keyboard, 21 subjects were able to type via this mechanism, needing 1 s per keystroke on average and producing only slightly more than 1% false positive selections. Pupil dynamics were recorded throughout. More than 90% of keystrokes could be accelerated with the assistance of pupil variations. As suggested by basic research, the pupil dilated when subjects fixated keys they subsequently selected and constricted shortly afterwards. This pattern was consistent across all subjects; however, the timing and amplitude of the diameter changes varied between individuals. Pupil-Assisted Target Selection thus shows potential for computer input in less strictly controlled environments and may be further improved on the basis of these data. This might culminate in an integrated gaze-based object selection mechanism that could go beyond the benchmark dwell-time performance.
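
To make the selection mechanism concrete: the approach lets a pupil dilation followed by a constriction trigger a keystroke before the full 1.5 s dwell time elapses, with the dwell acting as a fall-back. The following minimal Python sketch illustrates one way such an integrated dwell/pupil mechanism could work; the sampling rate, baseline handling, and the relative thresholds are illustrative assumptions, not the parameters used in the study.

    # Minimal sketch (not the authors' implementation) of pupil-assisted
    # dwell selection: a key is selected either when the full 1.5 s dwell
    # time elapses, or earlier when a dilation followed by a constriction
    # is detected during the fixation. Sample rate and both thresholds
    # are illustrative assumptions.

    DWELL_TIME = 1.5               # fall-back dwell time in seconds
    SAMPLE_RATE = 60               # eye-tracker sampling rate in Hz (assumed)
    DILATION_THRESHOLD = 0.02      # relative increase over baseline (assumed)
    CONSTRICTION_THRESHOLD = 0.01  # relative decrease after the peak (assumed)


    def pupil_assisted_dwell(pupil_samples):
        """Return the sample index at which the fixated key is selected.

        `pupil_samples` holds pupil diameters recorded while the gaze
        rests on a single key, in temporal order. Returns None if the
        fixation ends before any selection criterion is met.
        """
        baseline = pupil_samples[0]
        peak = baseline
        dilated = False
        for i, diameter in enumerate(pupil_samples):
            # Full dwell time reached: select regardless of the pupil signal.
            if i / SAMPLE_RATE >= DWELL_TIME:
                return i
            # Phase 1: wait for a dilation relative to the fixation baseline.
            if not dilated and diameter >= baseline * (1 + DILATION_THRESHOLD):
                dilated = True
            if dilated:
                peak = max(peak, diameter)
                # Phase 2: a constriction after the peak triggers early
                # selection, shortening the effective dwell time.
                if diameter <= peak * (1 - CONSTRICTION_THRESHOLD):
                    return i
        return None


    # Example: a fixation where the pupil dilates and then constricts.
    samples = [3.00] * 10 + [3.00 + 0.01 * k for k in range(12)] + [3.08, 3.04, 3.00]
    print(pupil_assisted_dwell(samples))  # selects well before the 90-sample dwell

In this sketch the dwell timer is a guaranteed fall-back, so the pupil signal can only shorten a selection, never block it; this is consistent with the reported result that more than 90% of keystrokes were accelerated by pupil variations.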

Keywords

Eye typing · Gaze-based interaction · Physiological computing


Copyright information

© IFIP International Federation for Information Processing 2017

Authors and Affiliations

  • Christoph Strauch
  • Lukas Greiter
  • Anke Huckauf
  1. Department of General Psychology, Ulm University, Ulm, Germany