Abstract
Gesture-based interaction allows users to interact with computers, machines, and robots in an intuitive way without direct physical contact. The challenge is that there are no agreed-upon interaction patterns for gesture-based interaction in VR and AR environments. In this paper we develop a set of 10 gestures and corresponding visualizations in the following categories: (1) directional movement, (2) flow control, (3) spatial orientation, (4) multifunctional gestures, and (5) tactile gestures. One of the multifunctional gestures and its visualization were selected for usability testing (N = 18) in a 3D car track simulator. We found that the visualization made the interaction faster, easier to understand, and more precise. Further, we learned that the visualization worked well as guidance for learning to control the car, but could be removed once the user had learned the interaction. By combining gestures from the library, gesture-based interaction can be used to control advanced machines, robots, and drones in an intuitive and non-strenuous way.
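The five-category gesture library described above can be pictured as a lookup from recognized gestures to simulator commands. The sketch below is a minimal, hypothetical illustration: the gesture names and commands are assumptions for a car-track simulator, not the paper's actual gesture set.

```python
from enum import Enum, auto


class GestureCategory(Enum):
    """The five gesture categories from the library."""
    DIRECTIONAL_MOVEMENT = auto()
    FLOW_CONTROL = auto()
    SPATIAL_ORIENTATION = auto()
    MULTIFUNCTIONAL = auto()
    TACTILE = auto()


# Hypothetical gesture-to-command mapping; one example per category.
GESTURE_LIBRARY = {
    "point_forward": (GestureCategory.DIRECTIONAL_MOVEMENT, "drive_forward"),
    "open_palm":     (GestureCategory.FLOW_CONTROL,         "stop"),
    "wrist_tilt":    (GestureCategory.SPATIAL_ORIENTATION,  "steer"),
    "pinch_drag":    (GestureCategory.MULTIFUNCTIONAL,      "steer_and_throttle"),
    "tap_surface":   (GestureCategory.TACTILE,              "confirm"),
}


def dispatch(gesture: str) -> str:
    """Return the simulator command for a recognized gesture name."""
    _category, command = GESTURE_LIBRARY[gesture]
    return command
```

Combining gestures from such a library, as the paper suggests, amounts to composing entries from several categories (e.g. a flow-control gesture gating a directional one) into a single control scheme.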
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Rise, K., Alsos, O.A. (2020). Gesture-Based Interaction: Visual Gesture Mapping. In: Kurosu, M. (eds) Human-Computer Interaction. Multimodal and Natural Interaction. HCII 2020. Lecture Notes in Computer Science, vol 12182. Springer, Cham. https://doi.org/10.1007/978-3-030-49062-1_7
DOI: https://doi.org/10.1007/978-3-030-49062-1_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-49061-4
Online ISBN: 978-3-030-49062-1
eBook Packages: Computer Science (R0)