Abstract
This paper proposes a system that controls and generates a humanoid robot's dance motion in real time, driven by the timing of beats in musical audio signals. The system tracks changes in tempo and calculates the integrated decibel level by analyzing the audio signal in real time, and uses this information to vary the robot's dance motion: beat intervals control the tempo of the motion, while the integrated decibel level controls its range. We propose a method that synchronizes the robot's dance motion with the musical beat, changing the motion interactively according to the input values.
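The mapping described above — beat intervals setting the motion tempo and an integrated decibel level setting the motion range — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the averaging used as "integration", and the normalization constants (`db_floor`, `db_span`) are all assumptions for the example.

```python
import math

def frame_db(samples):
    """RMS level of one audio frame in decibels (relative to full scale)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-9))  # clamp to avoid log(0)

def motion_parameters(beat_times, frames, db_floor=-60.0, db_span=60.0):
    """Map beat intervals to a tempo (BPM) and the integrated dB level
    to a motion-range scale in [0, 1] (hypothetical normalization)."""
    # Tempo from the mean interval between successive detected beats.
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    tempo_bpm = 60.0 / (sum(intervals) / len(intervals))
    # "Integrate" the per-frame dB level (here: average) and normalize.
    db = sum(frame_db(f) for f in frames) / len(frames)
    amplitude = min(max((db - db_floor) / db_span, 0.0), 1.0)
    return tempo_bpm, amplitude
```

For example, beats detected at 0.5 s intervals yield a 120 BPM motion tempo, and a full-scale signal drives the motion-range scale to its maximum of 1.0.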
Copyright information
© 2009 IFIP International Federation for Information Processing
Nakahara, N., Miyazaki, K., Sakamoto, H., Fujisawa, T.X., Nagata, N., Nakatsu, R. (2009). Dance Motion Control of a Humanoid Robot Based on Real-Time Tempo Tracking from Musical Audio Signals. In: Natkin, S., Dupire, J. (eds) Entertainment Computing – ICEC 2009. ICEC 2009. Lecture Notes in Computer Science, vol 5709. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04052-8_4
Print ISBN: 978-3-642-04051-1
Online ISBN: 978-3-642-04052-8