
“Will Use It, Because I Want to Look Cool” A Comparative Study of Simple Computer Interactions Using Touchscreen and In-Air Hand Gestures

  • Vidya Vaidyanathan
  • Daniel Rosenberg
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8511)

Abstract

The Xbox Kinect and, more recently, the Leap Motion Controller have brought about a paradigm shift in the way we interact with computers by making the recognition of 3D gestures affordable. Interfaces are moving toward natural user interfaces that integrate gestures, voice, and other kinds of multi-modal input simultaneously. In this paper we attempt to better understand in-air gesturing. The purpose of the study was to understand the differences between touchscreen and in-air gesturing for simple human-computer interactions. The gestures were compared in terms of muscle effort/fatigue, frustration, satisfaction, and enjoyment. We also studied the learnability of in-air gesturing. In our research we found that in-air gesturing was significantly superior to touchscreens with respect to muscle effort and fatigue. We also found that in-air gesturing was more fun and was preferred because of its “coolness factor”. Lastly, in-air gesturing had a rapid learning curve.
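Because EMG appears among the keywords and muscle effort/fatigue is a central measure, the sketch below illustrates, in Python, two classic surface-EMG fatigue indicators: windowed RMS amplitude and median power frequency, whose downward drift during a sustained contraction is a widely used sign of localized muscle fatigue. This is a minimal illustration only; the sampling rate, window length, and synthetic signal are assumptions and do not represent the authors' analysis pipeline.

    # Illustrative only: windowed RMS amplitude and median power frequency,
    # two standard surface-EMG fatigue indicators. The sampling rate, window
    # length, and synthetic "EMG" signal are assumptions for this sketch,
    # not values or methods taken from the paper.
    import numpy as np

    FS = 1000.0        # assumed sampling rate (Hz)
    WINDOW_S = 1.0     # assumed analysis window length (s)

    def rms(window):
        """Root-mean-square amplitude of one window."""
        return float(np.sqrt(np.mean(window ** 2)))

    def median_frequency(window, fs):
        """Frequency below which half of the window's spectral power lies.
        A downward drift of this value during a sustained contraction is a
        widely used indicator of localized muscle fatigue."""
        power = np.abs(np.fft.rfft(window)) ** 2
        freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
        cumulative = np.cumsum(power)
        return float(freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)])

    def fatigue_indicators(emg, fs=FS, window_s=WINDOW_S):
        """Yield (time, RMS, median frequency) for non-overlapping windows."""
        step = int(fs * window_s)
        for start in range(0, emg.size - step + 1, step):
            window = emg[start:start + step]
            yield start / fs, rms(window), median_frequency(window, fs)

    if __name__ == "__main__":
        # Synthetic stand-in for a 30 s recording: a downward chirp (120 -> 60 Hz)
        # plus noise, mimicking the spectral compression seen with fatigue.
        rng = np.random.default_rng(0)
        t = np.arange(0, 30, 1.0 / FS)
        f0, f1, T = 120.0, 60.0, 30.0
        phase = 2 * np.pi * (f0 * t + (f1 - f0) / (2 * T) * t ** 2)
        emg = np.sin(phase) + 0.3 * rng.standard_normal(t.size)
        for time, amp, mdf in fatigue_indicators(emg):
            print(f"t={time:5.1f} s  RMS={amp:.3f}  MDF={mdf:6.1f} Hz")

In an actual study, such indicators would be computed from recorded surface EMG rather than synthetic data; the sketch only shows the kind of computation this type of fatigue assessment typically involves.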

Keywords

HCI · Touch screens · In-air gestures · Ergonomics · EMG · Learnability · Social acceptability · Natural user interfaces (NUI)



Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Vidya Vaidyanathan (1)
  • Daniel Rosenberg (1)
  1. San Jose State University, USA
