Comparing head gesture, hand gesture and gamepad interfaces for answering Yes/No questions in virtual environments

  • Jingbo Zhao
  • Robert S. Allison
Original Article


A potential application of gesture recognition algorithms is as interfaces for interacting with virtual environments. However, the performance of and user preference for such interfaces in the context of virtual reality (VR) have rarely been studied. In the present paper, we focused on a typical VR interaction scenario, answering Yes/No questions in VR systems, to compare the performance of and user preference for three types of interface: a head gesture interface, a hand gesture interface and a conventional gamepad interface. We designed a memorization task in which participants were asked to memorize several everyday objects presented in a virtual room and, later, when these objects were absent, to respond through the given interfaces to questions about whether they had seen a specific object. The performance of the interfaces was evaluated in terms of real-time recognition accuracy and response time. A user interface questionnaire was also administered to assess user preference for the interfaces. The results showed that head gesture is a very promising interface, which can easily be added to existing VR systems for answering Yes/No questions and giving other binary responses in virtual environments.
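To illustrate the kind of interaction the head gesture interface supports, the sketch below classifies a short head-orientation trace from an HMD as a nod ("yes", pitch oscillation) or a shake ("no", yaw oscillation) by comparing per-axis motion energy. This is a minimal, hypothetical sketch, not the recognition method used in the study (head gesture recognizers in the literature typically use more robust models such as hidden Markov models); the function name, the energy measure and the threshold are all illustrative assumptions.

```python
# Illustrative sketch only (not the method evaluated in the paper):
# classify a windowed head-orientation trace as a nod ("yes") or a
# shake ("no") by comparing motion energy on the pitch and yaw axes.

def classify_head_gesture(pitch, yaw, threshold=0.5):
    """pitch, yaw: lists of head angles (degrees) sampled over the
    gesture window. Returns "yes", "no", or None if neither axis
    shows enough movement to count as a deliberate gesture."""
    def energy(series):
        # Total absolute frame-to-frame angular change on one axis.
        return sum(abs(b - a) for a, b in zip(series, series[1:]))

    p, y = energy(pitch), energy(yaw)
    if max(p, y) < threshold:
        return None  # head essentially still: no gesture
    return "yes" if p > y else "no"
```

In a real system the window would be segmented from a continuous tracker stream and the classifier would need to reject incidental head movement; a fixed energy threshold, as used here, is only a starting point.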


Head gesture · Hand gesture · Virtual reality · Usability




Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2019

Authors and Affiliations

  1. College of Information and Electrical Engineering, China Agricultural University, Beijing, China
  2. Department of Electrical Engineering and Computer Science, York University, Toronto, Canada
