Abstract
Gesture control has been a topic of research and development since the 1970s. Because hands are humans' native tools for interaction, it is natural to consider gesture control when discussing next-generation human-computer interaction. Throughout history, many have tried to exploit the potential of gesture control, without great success. Gesture-based interaction for consumer electronics has been widely explored, with gestures used to control TVs, mobile phones, and drones, yet it has never become a preferred interaction pattern. It is hard to argue that screen-based media are a suitable application for gesture-based interaction, given the lack of clear benefits in changing a familiar interaction pattern. Digital objects without screen interfaces show potential for gesture control but lack the means of giving the proper feedback to make it intuitive. Considering design principles such as visibility, feedback, and constraints, VR and AR stand out as the most promising media for gesture-based interaction. The potential is determined not only by the medium, but also by the domain in which it is applied. Domains such as education, healthcare, robotics, heavy industry, and space show clear benefits. When designing for the global consumer market (B2C), social acceptance, cultural differences, and timing are complications to consider. As for any case applying this technology: gesture-based interaction should not be used because it is possible, but because it is needed.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Rise, K., Alsos, O.A. (2020). The Potential of Gesture-Based Interaction. In: Kurosu, M. (eds) Human-Computer Interaction. Multimodal and Natural Interaction. HCII 2020. Lecture Notes in Computer Science(), vol 12182. Springer, Cham. https://doi.org/10.1007/978-3-030-49062-1_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-49061-4
Online ISBN: 978-3-030-49062-1