
Interacting with a Virtual Rap Dancer

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 3814))

Abstract

This paper presents a virtual dancer that can dance both to the beat of music coming in through a microphone and to motion beats detected in the video stream of a human dancer. In the current version, its moves are generated from a lexicon that was derived manually from an analysis of video clips of nine rap songs by different rappers. The system also allows the moves in the lexicon to be adapted on the basis of style parameters.
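The pipeline the abstract describes can be sketched roughly as: detect beats (from the audio or the human dancer's video), pick a move from the hand-built lexicon for each beat, and scale it by style parameters. The following is a minimal illustrative sketch under those assumptions; all names (`MOVE_LEXICON`, `select_moves`, `style_amplitude`) are hypothetical and do not reflect the authors' actual implementation.

```python
import random

# Hypothetical move lexicon, standing in for the moves derived from the
# analysis of the nine rap video clips mentioned in the abstract.
MOVE_LEXICON = ["arm_wave", "head_nod", "body_rock", "shoulder_pop"]

def select_moves(beat_times, style_amplitude=1.0, seed=0):
    """Pair each detected beat time (seconds) with a lexicon move.

    style_amplitude is an illustrative style parameter that scales how
    exaggerated each move is rendered; the paper's actual parameters
    may differ.
    """
    rng = random.Random(seed)  # deterministic choice for reproducibility
    schedule = []
    for t in beat_times:
        schedule.append({
            "time": t,
            "move": rng.choice(MOVE_LEXICON),
            "amplitude": style_amplitude,
        })
    return schedule

# Example: beats detected at 120 BPM (0.5 s apart), exaggerated style.
schedule = select_moves([0.0, 0.5, 1.0, 1.5], style_amplitude=1.3)
```

In a real system the `beat_times` list would come from an audio beat tracker or from motion-beat analysis of the camera stream; here it is hard-coded purely for illustration.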





Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Reidsma, D., Nijholt, A., Rienks, R., Hondorp, H. (2005). Interacting with a Virtual Rap Dancer. In: Maybury, M., Stock, O., Wahlster, W. (eds) Intelligent Technologies for Interactive Entertainment. INTETAIN 2005. Lecture Notes in Computer Science(), vol 3814. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11590323_14


  • DOI: https://doi.org/10.1007/11590323_14

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-30509-5

  • Online ISBN: 978-3-540-31651-0
