Realizing Multi-Touch-Like Gestures in 3D Space

  • Chunmeng Lu
  • Li Zhou
  • Jiro Tanaka
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10904)

Abstract

In this paper, our purpose is to extend 2D multi-touch interaction into 3D space and to present a universal set of multi-touch gestures for 3D space. We describe a system that allows people to use their familiar multi-touch gestures in 3D space without touching a surface. We call these midair gestures 3D multi-touch-like gestures. Because there is no object or surface for the user to touch in 3D space, we use a depth camera to detect the state of the fingers and estimate whether a finger is in the "click down" or "click up" state, which indicates the user's intention to interact with the system. We use machine learning to recognize hand shapes. Recognition does not need to run all the time; we only recognize the hand shape between a "click down" and the following "click up".
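The paper itself gives no source code, but the gating idea above can be illustrated with a short sketch. The following Python example is a minimal, hypothetical reconstruction: `detect_click`, the frame format, and the feature vectors are all assumptions, with a generic SVM standing in for the machine-learning hand-shape recognizer; only the control flow, running the classifier solely between a "click down" and the following "click up", reflects the description above.

```python
# Hypothetical sketch of the event-gated recognition loop described in the
# abstract: hand-shape classification runs only between a "click down" and
# a "click up" event, not on every frame. Frame format, detect_click, and
# the toy training data are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

# Toy SVM trained on random "hand shape" feature vectors (a stand-in for
# real hand-shape data captured with a depth camera).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 10))      # 10-D feature vectors
y_train = rng.integers(0, 3, size=100)    # 3 hand-shape classes
classifier = SVC(kernel="rbf").fit(X_train, y_train)

def detect_click(frame):
    """Stub for the depth-camera finger-state estimator.

    Returns "down" when the finger enters the clicked state, "up" when it
    releases, and None otherwise.
    """
    return frame.get("click")

def recognition_loop(frames):
    """Classify the hand shape only while a click gesture is in progress."""
    clicking = False
    for frame in frames:
        event = detect_click(frame)
        if event == "down":
            clicking = True
        elif event == "up":
            clicking = False
        elif clicking:
            # Recognition is gated: skipped entirely outside a click.
            features = np.asarray(frame["features"]).reshape(1, -1)
            label = classifier.predict(features)[0]
            print(f"hand shape class: {label}")

# Simulated frame stream: click down, two mid-click frames, click up.
stream = [
    {"click": "down"},
    {"click": None, "features": rng.normal(size=10)},
    {"click": None, "features": rng.normal(size=10)},
    {"click": "up"},
]
recognition_loop(stream)
```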

Keywords

Gesture · Human-computer interaction · Machine learning


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Graduate School of Information, Production and Systems, Waseda University, Tokyo, Japan