Advanced eye-gaze input system with two types of voluntary blinks

  • Hironobu Sato
  • Kiyohiko Abe
  • Shogo Matsuno
  • Minoru Ohyama
Original Article


Several eye-gaze input systems have been developed in recent years. Some of these systems treat the eye-blinking action as additional input information. A primary purpose of eye-gaze input systems is to serve as a communication aid for severely disabled people. An input system that employs eye blinks as command inputs must identify voluntary (conscious) blinks. Previously, we developed an eye-gaze input system for creating Japanese text. This system employed an indicator selection method for command inputs and was able to identify two types of voluntary blinks, which serve as functions for indicator selection and error correction, respectively. In the evaluation experiment of the previous system, errors were occasionally observed in estimating the number of the indicator at which the user was gazing. In this study, we propose a new input system that employs a selection method based on a novel indicator estimation algorithm. We conducted an experiment to evaluate the performance of Japanese text creation using the new system, and we report that it improves text input speed. In addition, we present a comparison of related eye-gaze input systems.


Keywords: Eye blink input · Eye-gaze input · Image analysis · Input interface · Voluntary blink



Copyright information

© International Society of Artificial Life and Robotics (ISAROB) 2018

Authors and Affiliations

  • Hironobu Sato (1) (corresponding author)
  • Kiyohiko Abe (2)
  • Shogo Matsuno (3)
  • Minoru Ohyama (2)
  1. College of Science and Engineering, Kanto Gakuin University, Yokohama, Japan
  2. Tokyo Denki University, Tokyo, Japan
  3. Hottolink Inc., Tokyo, Japan
