Abstract
Head-operated interfaces offer touchless interaction with electronic devices for physically challenged people who are unable to operate standard input equipment. The great challenge for such interfaces is text entry. Most existing approaches are based on a camera mouse, where an on-screen keyboard is operated by a pointer controlled with head movements. Head movements are also employed to cycle through keys, or groups of keys, to access the intended letter. While direct selection requires substantial precision, the traversal procedure is time-consuming. The main contribution of this paper is the proposal of the Two-Letters-Key Keyboard for touchless typing with head movements. The solution substantially accelerates access to the desired keys: typing proceeds with directional head movements, and only two consecutive moves are required to reach the expected key. No additional mechanisms (such as an eye blink or mouth opening) are required for head typing.
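The two-move selection scheme described in the abstract can be sketched as follows. Note that the direction set, the 13-key two-letter layout, and the code assignment below are illustrative assumptions for the sketch, not the paper's actual design; the paper's predictive mechanism for disambiguating the two letters on a key is likewise not reproduced here.

```python
from itertools import product

# Assumed set of recognizable head-move directions (illustrative).
DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def build_layout():
    """Map each ordered pair of head moves to one two-letter key.

    The 26 letters are packed two-per-key into 13 keys; with 8 directions
    there are 8 * 8 = 64 possible two-move codes, so every key gets a
    unique (move1, move2) address with room to spare.
    """
    letters = "abcdefghijklmnopqrstuvwxyz"
    keys = [letters[i:i + 2] for i in range(0, 26, 2)]   # 13 two-letter keys
    codes = product(DIRECTIONS, repeat=2)                # 64 two-move codes
    return dict(zip(codes, keys))                        # zip stops at 13

def select_key(layout, move1, move2):
    """Return the two-letter key reached by two consecutive head moves."""
    return layout.get((move1, move2))

layout = build_layout()
print(select_key(layout, "N", "N"))   # first key in this sketch: "ab"
```

The point of the sketch is the addressing property claimed in the abstract: any key is reachable in exactly two directional moves, so typing speed no longer depends on pointer precision or on cycling through long key sequences.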
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Nowosielski, A. (2017). Two-Letters-Key Keyboard for Predictive Touchless Typing with Head Movements. In: Felsberg, M., Heyden, A., Krüger, N. (eds) Computer Analysis of Images and Patterns. CAIP 2017. Lecture Notes in Computer Science(), vol 10424. Springer, Cham. https://doi.org/10.1007/978-3-319-64689-3_6
DOI: https://doi.org/10.1007/978-3-319-64689-3_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-64688-6
Online ISBN: 978-3-319-64689-3
eBook Packages: Computer Science (R0)