Head and Shoulders Gestures: Exploring User-Defined Gestures with Upper Body

  • Jean Vanderdonckt
  • Nathan Magrofuoco
  • Suzanne Kieffer
  • Jorge Pérez
  • Ysabelle Rase
  • Paolo Roselli
  • Santiago Villarreal
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11584)

Abstract

This paper presents empirical results on user-defined gestures for the head and shoulders, obtained by analyzing 308 gestures elicited from 22 participants for 14 referents materializing 14 different types of tasks in an IoT context of use. We report an overall medium consensus between participants' gesture proposals, albeit with medium variance (mean: .263, min: .138, max: .390 on the unit scale), while their thinking times varied more widely (min: 2.45 s, max: 22.50 s), which suggests that head and shoulders gestures are not all equally easy to imagine and to produce. We point to the challenges of deciding which head and shoulders gestures should form the consensus set, based on four criteria: their agreement rate, their individual frequency, their associative frequency, and their unicity.
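
To make the reported consensus figures concrete, the sketch below computes the agreement rate AR(r) for a single referent, following the formalization by Vatavu and Wobbrock [28]. This is a minimal illustration, not the study's analysis pipeline; the gesture labels in the usage example are hypothetical and not taken from the study's data.

    # Agreement rate AR(r) for one referent, after Vatavu and Wobbrock [28]:
    #   AR(r) = |P|/(|P|-1) * sum_i (|P_i|/|P|)^2 - 1/(|P|-1),
    # where P holds one gesture proposal per participant and the groups P_i
    # partition P into sets of identical proposals.
    from collections import Counter

    def agreement_rate(proposals):
        n = len(proposals)
        if n < 2:
            raise ValueError("AR(r) needs at least two proposals")
        squares = sum((size / n) ** 2 for size in Counter(proposals).values())
        return n / (n - 1) * squares - 1 / (n - 1)

    # Hypothetical proposals from six participants for one referent:
    print(agreement_rate(["nod", "nod", "nod", "shake", "shake", "tilt"]))
    # 0.267 -> a medium agreement, close to the mean of .263 reported above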

Keywords

Gesture elicitation study · Gesture interaction

Acknowledgements

The first author would like to thank Dr. Teodora Voicu for helping him with anatomy, and Thibaut Jacob, Gilles Bailly, and Eric Lecolinet for providing the images from [8], from which the design space was drawn.

References

  1. Bostan, I., et al.: Hands as a controller: user preferences for hand specific on-skin gestures. In: Proceedings of the ACM International Conference on Designing Interactive Systems (DIS 2017), pp. 1123–1134. ACM, New York (2017). https://doi.org/10.1145/3064663.3064766
  2. Bressem, J., Ladewig, S.H.: Rethinking gesture phases: articulatory features of gestural movement? Semiotica 184, 53–91 (2011). https://doi.org/10.1515/semi.2011.022
  3. Chen, Z., et al.: User-defined gestures for gestural interaction: extending from hands to other body parts. Int. J. Hum. Comput. Interact. 34(3), 238–250 (2018). https://doi.org/10.1080/10447318.2017.1342943
  4. Drake, R., Vogl, W., Mitchell, A.W.M.: Gray’s Anatomy for Students, 4th edn. Elsevier, Amsterdam (2019)
  5. Felberbaum, Y., Lanir, J.: Step by step: investigating foot gesture interaction. In: Proceedings of the ACM International Working Conference on Advanced Visual Interfaces (AVI 2016), pp. 306–307. ACM, New York (2016)
  6. Havlucu, H., Ergin, M.Y., Bostan, İ., Buruk, O.T., Göksun, T., Özcan, O.: It made more sense: comparison of user-elicited on-skin touch and freehand gesture sets. In: Streitz, N., Markopoulos, P. (eds.) DAPI 2017. LNCS, vol. 10291, pp. 159–171. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-58697-7_11
  7. Hirsch, M., Cheng, J., Reiss, A., Sundholm, M., Lukowicz, P., Amft, O.: Hands-free gesture control with a capacitive textile neckband. In: Proceedings of the ACM International Symposium on Wearable Computers (ISWC 2014), pp. 55–58. ACM, New York (2014). https://doi.org/10.1145/2634317.2634328
  8. Jacob, T., Bailly, G., Lecolinet, E.: A study on 3D viewpoint control through head and shoulders motion. In: Proceedings of the 27th International Conference on Interaction Homme-Machine (IHM 2015), Article 25. ACM, New York (2015). https://doi.org/10.1145/2820619.2825005
  9. Kendall, M.G., Smith, B.B.: The problem of m rankings. Ann. Math. Stat. 10(3), 275–287 (1939). http://www.jstor.org/stable/2235668
  10. Korkman, M., Kirk, U., Kemp, S.: NEPSY: A Developmental Neuropsychological Assessment. Psychological Corporation, San Antonio (1998)
  11. Kühnel, C., Westermann, T., Hemmert, F., Kratz, S., Müller, A., Möller, S.: I’m home: defining and evaluating a gesture set for smart-home control. Int. J. Hum. Comput. Stud. 69(11), 693–704 (2011). https://doi.org/10.1016/j.ijhcs.2011.04.005
  12. Lee, D.Y., Oakley, I.R., Lee, Y.R.: Bodily input for wearables: an elicitation study. In: Extended Abstracts of the International Conference on Human-Computer Interaction Korea 2016 (HCI Korea 2016), pp. 283–285 (2016). https://www.dbpia.co.kr/Journal/ArticleDetail/NODE06645483
  13. Lee, J., Yeo, H.-S., Starner, T., Quigley, A., Kunze, K., Woo, W.: Automated data gathering and training tool for personalized “Itchy Nose”. In: Proceedings of the 9th Augmented Human International Conference (AH 2018), Article 43. ACM, New York (2018)
  14. Lewis, J.R.: IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int. J. Hum. Comput. Interact. 7(1), 57–78 (1995). https://doi.org/10.1080/10447319509526110
  15. Liu, M., Nancel, M., Vogel, D.: Gunslinger: subtle arms-down mid-air interaction. In: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST 2015), pp. 63–71. ACM, New York (2015). https://doi.org/10.1145/2807442.2807489
  16. Malu, M., Chundury, P., Findlater, L.: Exploring accessible smartwatch interactions for people with upper body motor impairments. In: Proceedings of the ACM International Conference on Human Factors in Computing Systems (CHI 2018), Paper 488. ACM, New York (2018). https://doi.org/10.1145/3173574.3174062
  17. Mardanbegi, D., Hansen, D.W., Pederson, T.: Eye-based head gestures. In: Proceedings of the ACM Symposium on Eye Tracking Research and Applications (ETRA 2012), pp. 139–146. ACM, New York (2012)
  18. McClave, E.Z.: Linguistic functions of head movements in the context of speech. J. Pragmatics 32(7), 855–878 (2000). https://doi.org/10.1016/S0378-2166(99)00079-X
  19. Piumsomboon, T., Clark, A., Billinghurst, M., Cockburn, A.: User-defined gestures for augmented reality. In: Kotzé, P., Marsden, G., Lindgaard, G., Wesson, J., Winckler, M. (eds.) INTERACT 2013. LNCS, vol. 8118, pp. 282–299. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-40480-1_18
  20. Ruiz, J., Li, Y., Lank, E.: User-defined motion gestures for mobile interaction. In: Proceedings of the ACM International Conference on Human Factors in Computing Systems (CHI 2011), pp. 197–206. ACM, New York (2011). https://doi.org/10.1145/1978942.1978971
  21. Saffer, D.: Designing Gestural Interfaces. O’Reilly Media, Sebastopol (2008)
  22. Serrano, M., Ens, B.M., Irani, P.P.: Exploring the use of hand-to-face input for interacting with head-worn displays. In: Proceedings of the 32nd ACM International Conference on Human Factors in Computing Systems (CHI 2014), pp. 3181–3190. ACM, New York (2014). https://doi.org/10.1145/2556288.2556984
  23. Špakov, O., Majaranta, P.: Enhanced gaze interaction using simple head gestures. In: Proceedings of the ACM International Conference on Ubiquitous Computing (UbiComp 2012), pp. 705–710. ACM, New York (2012). https://doi.org/10.1145/2370216.2370369
  24. Vanderdonckt, J.: Accessing guidelines information with Sierra. In: Proceedings of the IFIP International Conference on Human-Computer Interaction (INTERACT 1995), pp. 311–316. IFIP (1995). https://doi.org/10.1007/978-1-5041-2896-4_52
  25. Vanderdonckt, J.: A MDA-compliant environment for developing user interfaces of information systems. In: Pastor, O., Falcão e Cunha, J. (eds.) CAiSE 2005. LNCS, vol. 3520, pp. 16–31. Springer, Heidelberg (2005). https://doi.org/10.1007/11431855_2
  26. Vanderdonckt, J., Roselli, P., Pérez-Medina, J.L.: !FTL, an articulation-invariant stroke gesture recognizer with controllable position, scale, and rotation invariances. In: Proceedings of the ACM International Conference on Multimodal Interaction (ICMI 2018), pp. 125–134. ACM, New York (2018)
  27. Vatavu, R.-D.: User-defined gestures for free-hand TV control. In: Proceedings of the 10th European Conference on Interactive TV and Video (EuroITV 2012), pp. 45–48. ACM, New York (2012). https://doi.org/10.1145/2325616.2325626
  28. Vatavu, R.-D., Wobbrock, J.O.: Formalizing agreement analysis for elicitation studies: new measures, significance test, and toolkit. In: Proceedings of the 33rd ACM Conference on Human Factors in Computing Systems (CHI 2015), pp. 1325–1334. ACM, New York (2015)
  29. Vatavu, R.-D., Wobbrock, J.O.: Between-subjects elicitation studies: formalization and tool support. In: Proceedings of the 34th ACM Conference on Human Factors in Computing Systems (CHI 2016), pp. 3390–3402. ACM, New York (2016)
  30. Wobbrock, J.O., Morris, M.R., Wilson, A.D.: User-defined gestures for surface computing. In: Proceedings of the ACM International Conference on Human Factors in Computing Systems (CHI 2009), pp. 1083–1092. ACM, New York (2009)
  31. Zaiţi, I.-A., Pentiuc, Ş.-G., Vatavu, R.-D.: On free-hand TV control: experimental results on user-elicited gestures with leap motion. Pers. Ubiquit. Comput. 19(5–6), 821–838 (2015). https://doi.org/10.1007/s00779-015-0863-y

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Jean Vanderdonckt (1), email author
  • Nathan Magrofuoco (1)
  • Suzanne Kieffer (1)
  • Jorge Pérez (1, 2)
  • Ysabelle Rase (1)
  • Paolo Roselli (1, 3)
  • Santiago Villarreal (1)

  1. Université Catholique de Louvain, Louvain-la-Neuve, Belgium
  2. Universidad de las Américas, Intelligent & Interactive Systems Lab (SI² Lab), Quito, Ecuador
  3. Dipartimento di Matematica, Università degli Studi di Roma “Tor Vergata”, Rome, Italy
