
The Common Characteristics of User-Defined and Mid-Air Gestures for Rotating 3D Digital Contents

  • Li-Chieh Chen
  • Yun-Maw Cheng (corresponding author)
  • Po-Ying Chu
  • Frode Eika Sandnes
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9738)

Abstract

Recently, mid-air gesture technology for manipulating 3D digital content has become an important research topic. To meet the needs of users and contexts, eliciting user-defined gestures is essential. However, it has been reported that user-defined hand gestures tend to vary significantly in posture, motion, and speed, making it difficult to identify common characteristics. In this research, the authors conducted an experiment to study intuitive hand gestures for controlling the rotation of 3D digital furniture. Twenty graduate students majoring in Industrial Design were invited to participate in the task. Although there was great variety among participants, common characteristics were extracted through systematic behavior coding and analysis. The results indicated that the open palm and the D handshape (American Sign Language) were the most intuitive hand poses. In addition, moving the hands along the circumference of a horizontal circle was the most intuitive hand motion and trajectory.
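
The paper does not describe an implementation, but the reported finding (hands moving along a horizontal circle to rotate the object) suggests mapping the palm's angular travel around the circle's centre to a yaw rotation of the 3D model. The following Python sketch illustrates one way this mapping could work; the function name, coordinate convention, and circle centre are illustrative assumptions, not taken from the paper.

```python
import math

def yaw_from_circular_motion(prev_xy, curr_xy, center_xy):
    """Return the angular change (radians) of the palm position around the
    centre of a horizontal circle. Applying this delta as yaw rotates the
    3D model about its vertical axis. All inputs are hypothetical 2D palm
    positions in the horizontal plane, e.g. from a hand tracker."""
    prev_angle = math.atan2(prev_xy[1] - center_xy[1], prev_xy[0] - center_xy[0])
    curr_angle = math.atan2(curr_xy[1] - center_xy[1], curr_xy[0] - center_xy[0])
    delta = curr_angle - prev_angle
    # Wrap to (-pi, pi] so crossing the atan2 discontinuity does not jump.
    if delta > math.pi:
        delta -= 2 * math.pi
    elif delta <= -math.pi:
        delta += 2 * math.pi
    return delta

# Example: palm moves a quarter turn counter-clockwise around the origin.
print(math.degrees(yaw_from_circular_motion((1.0, 0.0), (0.0, 1.0), (0.0, 0.0))))  # ~90.0
```

In such a scheme the model's yaw would be incremented by the returned delta each frame, so the object appears to follow the circling hand; gating the update on a recognised hand pose (e.g. open palm) is one plausible way to separate intentional rotation from incidental motion.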

Keywords

Mid-air gesture · User-defined gesture · 3D digital content rotation

Acknowledgement

The authors would like to express their gratitude to the Ministry of Science and Technology of the Republic of China for financially supporting this research under Grant No. MOST 104-2221-E-036-020.


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Li-Chieh Chen (1)
  • Yun-Maw Cheng (2) (corresponding author)
  • Po-Ying Chu (1)
  • Frode Eika Sandnes (3)
  1. Department of Industrial Design, Tatung University, Taipei, Taiwan
  2. Graduate Institute of Design Science, Department of Computer Science and Engineering, Tatung University, Taipei, Taiwan
  3. Oslo and Akershus University College of Applied Sciences, Oslo, Norway
