The Study of the Full Cycle of Gesture Interaction, The Continuum between 2D and 3D

  • Mohamed-Ikbel Boulabiar
  • Gilles Coppin
  • Franck Poirier
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8511)

Abstract

The goal of HCI researchers is to make interaction with computer interfaces simpler, more efficient, and more natural. In the context of object manipulation, we argue that reaching this goal requires the ability to predict and recognize how humans grasp and then manipulate objects, building on studies of human vision, reach, grasp taxonomies, and manipulation. In this paper, we study the full cycle of gesture interaction from different points of view, organize these views using Norman’s theory of human action, link the psychology of object sensing to HCI goals, and propose a simplification of gesture classes into four principal families. This simplification still allows the expression of more detailed subclasses, differentiated by gesture properties.

Keywords

Gesture · 3D Interaction · Hand · Grasping


References

  1. Aglioti, S., DeSouza, J.F., Goodale, M.A.: Size-contrast illusions deceive the eye but not the hand. Current Biology 5(6), 679–685 (1995)
  2. Beaudouin-Lafon, M.: Instrumental interaction: An interaction model for designing post-WIMP user interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 446–453. ACM (2000)
  3. Bérard, F., Ip, J., Benovoy, M., El-Shimy, D.: Did Minority Report get it wrong? Superiority of the mouse over 3D input devices in a 3D placement task. IFIP (2009)
  4. Bertranne, D.: Praxies idéomotrices corporelles: Création d’un test d’imitation de postures asymboliques (2007)
  5. Bruno, N., Bernardis, P.: Dissociating perception and action in Kanizsa’s compression illusion. Psychonomic Bulletin & Review 9(4), 723–730 (2002)
  6. Bullock, I., Ma, R., Dollar, A.: A hand-centric classification of human and robot dexterous manipulation, 1–16 (2012)
  7. Bullock, I.M., Dollar, A.M.: Classifying human manipulation behavior. In: 2011 IEEE International Conference on Rehabilitation Robotics (ICORR), pp. 1–6. IEEE (2011)
  8. Cutkosky, M.: On grasp choice, grasp models, and the design of hands for manufacturing tasks. IEEE Transactions on Robotics and Automation 5(3), 269–279 (1989)
  9. Ehrsson, H.H., Fagergren, A., Jonsson, T., Westling, G., Johansson, R.S., Forssberg, H.: Cortical activity in precision- versus power-grip tasks: An fMRI study. Journal of Neurophysiology 83(1), 528–536 (2000)
  10. Feix, T., Pawlik, R., Schmiedmayer, H.-B., Romero, J., Kragic, D.: A comprehensive grasp taxonomy. In: Robotics, Science and Systems: Workshop on Understanding the Human Hand for Advancing Robotic Manipulation, pp. 2–3 (2009)
  11. Frohlich, B., Tramberend, H., Beers, A., Agrawala, M., Baraff, D.: Physically-based manipulation on the responsive workbench. In: Proceedings of the IEEE Virtual Reality, pp. 5–11. IEEE (2000)
  12. Gamberini, L., Spagnolli, A., Prontu, L., Furlan, S., Martino, F., Solaz, B.R., Alcañiz, M., Lozano, J.A.: How natural is a natural interface? An evaluation procedure based on action breakdowns. Personal and Ubiquitous Computing (2011)
  13. Gustafson, S., Bierwirth, D., Baudisch, P.: Imaginary interfaces: Spatial interaction with empty hands and without visual feedback. In: Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology, pp. 3–12. ACM (2010)
  14. Harrison, C., Schwarz, J., Hudson, S.E.: TapSense: Enhancing finger interaction on touch surfaces. In: Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, pp. 627–636. ACM (2011)
  15. Hilliges, O., Izadi, S., Wilson, A., Hodges, S., Garcia-Mendoza, A., Butz, A.: Interactions in the air: Adding further depth to interactive tabletops. In: Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, pp. 139–148. ACM (2009)
  16. Iberall, T., Bingham, G., Arbib, M.: Opposition space as a structuring concept for the analysis of skilled hand movements. Experimental Brain Research Series (1986)
  17. MacKenzie, C.L., Iberall, T.: The grasping hand. Elsevier (1994)
  18. Malacria, S., Lecolinet, E., Guiard, Y.: Clutch-free panning and integrated pan-zoom control on touch-sensitive surfaces: The CycloStar approach. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 2010, pp. 2615–2624. ACM, New York (2010)
  19. Marzke, M.W., Wullstein, K.L.: Chimpanzee and human grips: A new classification with a focus on evolutionary morphology. International Journal of Primatology 17(1), 117–139 (1996)
  20. McNeill, D.: Gesture and thought. University of Chicago Press (2008)
  21. Mine, M.R., Brooks Jr., F.P., Sequin, C.H.: Moving objects in space: Exploiting proprioception in virtual-environment interaction. In: Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 1997, pp. 19–26. ACM Press/Addison-Wesley, New York (1997)
  22. Nacenta, M.A., Kamber, Y., Qiang, Y., Kristensson, P.O.: Memorability of pre-designed and user-defined gesture sets. In: CHI, pp. 1099–1108 (2013)
  23. Napier, J., Tuttle, R.: Hands. Natural Science. Princeton University Press (1993)
  24. Napier, J.J.: The prehensile movements of the human hand. Surgery 38(4), 902–913 (1956)
  25. Norman, D.: The design of everyday things (2002)
  26. Norman, D.: Natural user interfaces are not natural. Interactions, 6–10 (2010)
  27. Norman, D.A., Draper, S.W.: User Centered System Design: New Perspectives on Human-Computer Interaction. L. Erlbaum Associates Inc., Hillsdale (1986)
  28. Nowak, D., Hermsdörfer, J.: Sensorimotor Control of Grasping: Physiology and Pathophysiology. Cambridge University Press (2009)
  29. Paulignan, Y., MacKenzie, C., Marteniuk, R., Jeannerod, M.: Selective perturbation of visual input during prehension movements. Experimental Brain Research 83(3), 502–512 (1991)
  30. Reisman, J.L., Davidson, P.L., Han, J.Y.: A screen-space formulation for 2D and 3D direct manipulation. In: Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, UIST 2009, p. 69 (2009)
  31. Roberts, A.I., Vick, S.-J., Roberts, S.G.B., Menzel, C.R.: Chimpanzees modify intentional gestures to coordinate a search for hidden food. Nature Communications 5 (2014)
  32. Roffman, I., Savage-Rumbaugh, S., Rubert-Pugh, E., Ronen, A., Nevo, E.: Stone tool production and utilization by bonobo-chimpanzees (Pan paniscus). Proceedings of the National Academy of Sciences 109(36), 14500–14503 (2012)
  33. Sève-Ferrieu, N.: Neuropsychologie corporelle, visuelle et gestuelle: Du trouble à la rééducation. Elsevier Masson (2005)
  34. Valkov, D.: Interscopic multi-touch environments, 339–342 (2010)
  35. Victor, B.: A Brief Rant on the Future of Interaction Design (2011), http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/
  36. Vinayavekhin, P.: Dexterous manipulation planning from human demonstration. PhD thesis, University of Tokyo (2009)
  37. Wimmer, R.: Grasp sensing for human-computer interaction. In: Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction, pp. 221–228. ACM (2011)
  38. Wing, A., Haggard, P.: Hand and brain: The neurophysiology and psychology of hand movements
  39. Wobbrock, J.O., Morris, M.R., Wilson, A.D.: User-defined gestures for surface computing. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 2009, pp. 1083–1092. ACM, New York (2009)
  40. Xu, J., Gannon, P.J., Emmorey, K., Smith, J.F., Braun, A.R.: Symbolic gestures and spoken language are processed by a common neural system. Proceedings of the National Academy of Sciences 106(49), 20664–20669 (2009)
  41. Yee, W.: Potential limitations of multi-touch gesture vocabulary: Differentiation, adoption, fatigue. In: Jacko, J.A. (ed.) HCI International 2009, Part II. LNCS, vol. 5611, pp. 291–300. Springer, Heidelberg (2009)

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Mohamed-Ikbel Boulabiar (1)
  • Gilles Coppin (1)
  • Franck Poirier (2)

  1. Lab-STICC, Telecom Bretagne, France
  2. Lab-STICC, University of Bretagne-Sud, France