Priority Order of Single Gaze Gestures in Eye Control System

  • Yating Zhang
  • Yafeng Niu
  • Chengqi Xue
  • Yi Xie
  • Bingzheng Shi
  • Bo Li
  • Lingcun Qiu
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1026)


Abstract

The eye-control system uses eye movements to achieve human-computer dialogue. To address the high complexity and low interaction efficiency of Complex Gaze Gestures (CGGs) in eye-control systems, this paper designs an ergonomics experiment. By exploring the ergonomic differences between individual eye movements, the experiment derives a priority order for Single Gaze Gestures (SGGs). This priority order provides a scientific and theoretical basis for the design of CGGs.


Keywords: Eye-control system · Eye movements · Single Gaze Gesture (SGG) · Complex Gaze Gesture (CGG) · Priority order



This work was supported jointly by the National Natural Science Foundation of China (Nos. 71801037, 71871056, 71471037), the Science and Technology on Electro-Optic Control Laboratory and Aerospace Science Foundation of China (No. 20165169017), the SAST Foundation of China (SAST No. 2016010), the Equipment Pre-research & Ministry of Education of China Joint Fund, and the Fundamental Research Funds for the Central Universities of China (No. 2242019K1G023).



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Yating Zhang (1)
  • Yafeng Niu (1, email author)
  • Chengqi Xue (1)
  • Yi Xie (2)
  • Bingzheng Shi (3)
  • Bo Li (2)
  • Lingcun Qiu (3)
  1. School of Mechanical Engineering, Southeast University, Nanjing, China
  2. Science and Technology on Electro-Optic Control Laboratory, Luoyang, China
  3. Shanghai Academy of Spaceflight Technology, Shanghai, China