
Gesture-Based Interfaces: Practical Applications of Gestures in Real World Mobile Settings

  • Julie Rico
  • Andrew Crossan
  • Stephen Brewster
Chapter
Part of the Human-Computer Interaction Series book series (HCIS)

Abstract

In the past, the design of gesture-based interfaces has focused on gesture recognition, while the social and practical factors that affect users' ability to perform gestures on the go have been largely overlooked. This work describes two important aspects of gesture design for mobile and body-based interaction. First, it discusses the social acceptability of using gesture-based interfaces in the variety of locations where mobile interfaces are used, including a range of methods that can be used to evaluate social acceptability early in the development process. Second, it discusses the practical implications of building gesture recognition on accelerometer-based sensing given the challenges of gesturing in mobile situations, including body-based interactions and the scenarios where these might be used successfully.

Keywords

Gesture recognition · Interaction technique · Social acceptability · Multimodal interface · Menu item


Copyright information

© Springer-Verlag London Limited 2011

Authors and Affiliations

  1. Glasgow Interactive Systems Group, School of Computing Science, University of Glasgow, Glasgow, UK
