Networked Virtual Marionette Theater

  • Daisuke Ninomiya
  • Kohji Miyazaki
  • Ryohei Nakatsu
Conference paper
Part of the IFIP International Federation for Information Processing book series (IFIPAICT, volume 279)

Abstract

This paper describes a system that allows users to control computer-graphics-based virtual marionette characters (CG marionette characters) with their hand and finger movements and thus perform a marionette theatrical play. The system consists of several subsystems, each comprising a web camera and a PC. Each subsystem recognizes its user's hand gesture and transforms it into a gesture of a CG marionette character. The subsystems are connected through the Internet, so they can exchange information about the CG marionette character movements generated at each subsystem and display the movements of all characters across the entire system. Accordingly, multiple users can join the networked virtual marionette theater and enjoy a marionette play together.
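As a rough illustration of the architecture described above, the sketch below shows how one subsystem might map recognized finger flexion to marionette joint angles and broadcast the resulting pose to its peer subsystems over the Internet. This is a minimal Python sketch, not the authors' implementation; the peer addresses, message fields, and the finger-to-joint mapping are illustrative assumptions.

    # Minimal sketch (not the paper's implementation): one subsystem maps a
    # recognized hand pose to marionette joint angles and broadcasts them to
    # the other subsystems over UDP. Field names and the mapping are assumed.
    import json
    import socket
    import time

    PEERS = [("192.168.0.11", 9000), ("192.168.0.12", 9000)]  # assumed peer addresses

    def hand_pose_to_marionette(finger_flexion):
        """Map five finger-flexion values in [0, 1] to joint angles (degrees).

        In a string-marionette analogy, each finger pulls one "string":
        thumb -> head, index/middle -> left/right arm, ring/little -> legs.
        """
        thumb, index, middle, ring, little = finger_flexion
        return {
            "head_tilt": 30.0 * thumb,
            "left_arm": 90.0 * index,
            "right_arm": 90.0 * middle,
            "left_leg": 45.0 * ring,
            "right_leg": 45.0 * little,
        }

    def broadcast_pose(sock, character_id, joints):
        """Send this character's current pose to every peer subsystem."""
        message = json.dumps({
            "character": character_id,
            "time": time.time(),
            "joints": joints,
        }).encode("utf-8")
        for peer in PEERS:
            sock.sendto(message, peer)

    if __name__ == "__main__":
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        # Placeholder for the camera-based gesture recognizer described in the paper.
        flexion = [0.2, 0.8, 0.8, 0.1, 0.1]
        broadcast_pose(sock, "marionette_1", hand_pose_to_marionette(flexion))

Each receiving subsystem would apply the incoming joint angles to its local copy of that character, so every display shows the movements of all characters in the shared theater.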

Keywords

Marionette, puppet, virtual theater, hand gesture, image recognition

Copyright information

© IFIP International Federation for Information Processing 2008

Authors and Affiliations

  • Daisuke Ninomiya (1)
  • Kohji Miyazaki (1)
  • Ryohei Nakatsu (1, 2)
  1. School of Science and Technology, Kwansei Gakuin University, 2-1 Gakuen, Japan
  2. Interactive & Digital Media Institute, National University of Singapore, Singapore