Body Gestures for Office Desk Scenarios

  • Radu-Daniel Vatavu
  • Ovidiu-Ciprian Ungurean
  • Stefan-Gheorghe Pentiuc
Part of the Human–Computer Interaction Series (HCIS)


Gestures have been used in interfaces across a wide variety of scenarios: from mobile users interacting with their smartphones through touch gestures to recent game technology that captures 3D movements of the player's whole body. Different contexts lead to different acquisition technologies, gesture vocabularies, and applications. In this chapter we discuss gesture-based interfaces for office desk scenarios, taking into account the constraints of the workspace that limit the available range of body motions. The focus is therefore on hand and head movements captured with non-invasive computer vision techniques. We review existing work in order to identify common findings and designs for this working scenario, and to understand how gestures can fit into the everyday office desk environment. Several application scenarios are discussed against criteria such as intuitiveness, ease of use, and the similarity of the proposed interactions to real-world actions.
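Vision-based hand tracking of the kind surveyed here typically begins with skin-color segmentation of the camera frame, after which the hand's position can be estimated from the segmented region. The following is a minimal illustrative sketch, not the chapter's own method: it uses a simple rule-based RGB skin classifier (the thresholds are common illustrative values, not taken from the chapter) and reports the centroid of the skin pixels as a rough hand position.

```python
import numpy as np

def skin_mask(img):
    """Classify pixels as skin with a simple rule-based RGB test.

    An illustrative stand-in for statistical skin-color models;
    the thresholds below are assumptions for demonstration only.
    """
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r - g > 15) & (r > b)

def hand_centroid(img):
    """Return the (row, col) centroid of skin pixels, or None if none."""
    ys, xs = np.nonzero(skin_mask(img))
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Synthetic frame: dark background with a skin-toned patch
# covering rows 10-19 and columns 30-39.
frame = np.zeros((64, 64, 3), dtype=np.uint8)
frame[10:20, 30:40] = (200, 120, 90)  # approximate skin tone
print(hand_centroid(frame))  # → (14.5, 34.5)
```

In practice such per-pixel classifiers are only a first step: lighting changes and skin-like background colors require the more robust statistical models and multi-cue tracking discussed in the surveyed literature.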


Keywords: Motion Sickness · Interaction Technique · Head Tracking · Gesture Command · Interaction Metaphor



This paper was supported by the project "Progress and development through post-doctoral research and innovation in engineering and applied sciences – PRiDE – Contract no. POSDRU/89/1.5/S/57083", a project co-funded by the European Social Fund through the Sectoral Operational Programme Human Resources Development 2007–2013.



Copyright information

© Springer-Verlag London Limited 2011

Authors and Affiliations

  1. University Stefan cel Mare of Suceava, Suceava, Romania
