
A Study for the Identification of a Full-Body Gesture Language for Enabling Natural User Interaction

  • Conference paper
  • In: Human-Computer Interaction (HCI-COLLAB 2019)

Abstract

Most proposals addressing gesture recognition involve two main stages: identifying gestures in a source and associating gestures with meanings. Regarding the semantic meaning of gestures, it is important to consider that, owing to their cultural and linguistic specificity, a single gesture may map to many concepts and vice versa, making gesture vocabularies ambiguous and incompletely defined. From the HCI perspective, the literature reports elicitation studies in which researchers derive gesture sets that allow users to interact with tailored applications in specific contexts. In this paper, we present a full-body gesture language for enabling gesture-based interaction across different contexts. For this purpose, 70 users were asked to provide the gestures they would use as commands within different existing applications, while remaining aware of the relationship between the tasks they were asked to perform in the applications and abstract tasks. This yielded 980 gestures, which were compared to produce a reduced set of 68 gestures, each specified by a graphic representation, a textual description, an anthropometric characterization, and a generic label. Interpretations and insights obtained during the experiment are also reported.
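
The abstract implies that each of the 70 participants proposed one gesture per task (980 / 70 = 14 tasks) and that matching proposals were merged to reach the reduced set of 68. This page does not spell out the reduction procedure; the Python sketch below only illustrates the Wobbrock-style agreement analysis commonly used in gesture elicitation studies, with hypothetical gesture labels.

    from collections import Counter

    # Illustrative only: the paper's exact comparison procedure is not given
    # here. This computes the Wobbrock-style agreement score for one referent
    # (abstract task): A(r) = sum, over groups Pi of identical proposals, of
    # (|Pi| / |P|)^2, where P is the set of all proposals for that referent.

    def agreement(proposals):
        """Agreement score for one referent, given the list of gesture
        labels elicited from the participants for that task."""
        total = len(proposals)
        groups = Counter(proposals)
        return sum((size / total) ** 2 for size in groups.values())

    # Hypothetical data for one task: 70 participants, two distinct proposals.
    elicited = ["wave_right_hand"] * 50 + ["raise_both_arms"] * 20
    print(f"A = {agreement(elicited):.2f}")  # prints A = 0.59

Under this reading, a high agreement score for a task justifies keeping a single representative gesture for it, which is consistent with reducing 980 elicited gestures to a vocabulary of 68.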



Author information

Corresponding author: David Céspedes-Hernández.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Céspedes-Hernández, D., González-Calleros, J.M. (2019). A Study for the Identification of a Full-Body Gesture Language for Enabling Natural User Interaction. In: Ruiz, P., Agredo-Delgado, V. (eds) Human-Computer Interaction. HCI-COLLAB 2019. Communications in Computer and Information Science, vol 1114. Springer, Cham. https://doi.org/10.1007/978-3-030-37386-3_4

  • DOI: https://doi.org/10.1007/978-3-030-37386-3_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-37385-6

  • Online ISBN: 978-3-030-37386-3

  • eBook Packages: Computer Science, Computer Science (R0)
