CamType: assistive text entry using gaze with an off-the-shelf webcam

  • Yi Liu
  • Bu-Sung Lee
  • Deepu Rajan
  • Andrzej Sluzek
  • Martin J. McKeown
Original Paper


As assistive technology advances, eye-based text entry systems have been developed to help physically challenged people improve their ability to communicate. However, text entry speed in early eye-typing systems tends to be relatively slow because of dwell time. Recently, dwell-free methods have been proposed that outperform dwell-based systems in both speed and robustness, but a dedicated eye tracker remains indispensable. In this article, we propose a prototype eye-typing system that uses an off-the-shelf webcam instead of a dedicated eye tracker: an appearance-based method estimates the user's gaze coordinates on the screen from frontal face images captured by the webcam. We also investigate several critical issues of the appearance-based method, which helps improve estimation accuracy and reduce computational complexity in practice. Performance evaluation shows that eye typing with a webcam using the proposed method is comparable to an eye tracker under a small degree of head movement.
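The appearance-based approach outlined above can be illustrated with a minimal sketch. This is not the paper's exact method: it assumes a simple ridge regression that maps flattened eye-patch intensity vectors, collected during a calibration phase, to 2-D screen coordinates. The patch resolution, regularization weight, and synthetic calibration data below are all illustrative assumptions.

```python
import numpy as np

IMG_DIM = 15 * 9  # assumed eye-patch resolution (15x9), flattened


def fit_gaze_regressor(appearances, screen_points, lam=1e-3):
    """Closed-form ridge regression W minimising ||A W - Y||^2 + lam ||W||^2.

    appearances:   (n, IMG_DIM) flattened eye patches from calibration frames
    screen_points: (n, 2) known on-screen gaze targets shown during calibration
    """
    # Append a constant column so the mapping includes a bias term.
    A = np.hstack([appearances, np.ones((appearances.shape[0], 1))])
    W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ screen_points)
    return W


def predict_gaze(W, appearance):
    """Estimate an (x, y) screen coordinate from one flattened eye patch."""
    a = np.append(appearance, 1.0)  # add bias term
    return a @ W


# Synthetic calibration data: random "eye images" generated by a hidden
# linear gaze law plus noise, standing in for real webcam captures.
rng = np.random.default_rng(0)
true_W = rng.normal(size=(IMG_DIM, 2))
X = rng.normal(size=(40, IMG_DIM))                     # 40 calibration samples
Y = X @ true_W + rng.normal(scale=0.01, size=(40, 2))  # gaze targets

W = fit_gaze_regressor(X, Y)
pred = predict_gaze(W, X[0])
```

A real system would first detect the face and crop the eye regions from each webcam frame before flattening them; the regression step itself stays the same.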


Keywords: Assistive technology · Eye-typing system · Dwell-free methods · Appearance-based method



This work is a collaboration with the Joint NTU-UBC Research Centre of Excellence in Active Living for the Elderly (LILY).



Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. Nanyang Institute of Technology in Health and Medicine, Interdisciplinary Graduate School, Nanyang Technological University, Singapore
  2. School of Computer Science and Engineering, Nanyang Technological University, Singapore
  3. Department of Electrical and Computer Engineering, Khalifa University, Abu Dhabi, UAE
  4. Department of Medicine, The University of British Columbia, Vancouver, Canada
