Abstract
This paper presents a virtual dancer that dances to the beat of music captured through a microphone and to motion beats detected in the video stream of a human dancer. In the current version, its moves are generated from a lexicon that was derived manually from an analysis of video clips of nine rap songs by different rappers. The system also allows the moves in the lexicon to be adapted on the basis of style parameters.
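The pipeline the abstract describes (beat detection driving move selection from a hand-built lexicon, modulated by style parameters) can be sketched roughly as follows. This is an illustrative sketch only: the move names, the `on_beat` callback, and the `amplitude` style parameter are assumptions, not the authors' actual implementation.

```python
import random

# Hypothetical move lexicon; the paper's lexicon was derived manually
# from video clips of nine rap songs.
MOVE_LEXICON = ["arm_wave", "head_nod", "body_rock", "side_step"]

def select_move(style):
    """Pick a move from the lexicon and apply style parameters.

    `style` is an assumed dict of style parameters, e.g. an amplitude
    scaling factor that exaggerates or dampens the chosen move.
    """
    move = random.choice(MOVE_LEXICON)
    amplitude = style.get("amplitude", 1.0)
    return {"move": move, "amplitude": amplitude}

def on_beat(style):
    # Assumed callback: invoked whenever a beat is detected, either in
    # the audio stream (microphone) or in the human dancer's motion
    # stream (video).
    return select_move(style)

if __name__ == "__main__":
    style = {"amplitude": 1.2}
    for _ in range(4):
        print(on_beat(style))
```

In a real system the beat detector would run continuously on the audio and video streams and trigger `on_beat` in real time; here the loop simply simulates four detected beats.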
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Reidsma, D., Nijholt, A., Rienks, R., Hondorp, H. (2005). Interacting with a Virtual Rap Dancer. In: Maybury, M., Stock, O., Wahlster, W. (eds) Intelligent Technologies for Interactive Entertainment. INTETAIN 2005. Lecture Notes in Computer Science(), vol 3814. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11590323_14
Print ISBN: 978-3-540-30509-5
Online ISBN: 978-3-540-31651-0