Abstract
In the previous chapter we showed how human musicians can benefit from the visual and physical cues afforded by robotic musicians. Similarly, robotic musicians can augment their own abilities by analyzing visual cues from their human counterparts. Like human musicians, robotic musicians can use vision to anticipate, coordinate, and synchronize their playing with human collaborators.
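One way a vision system supports anticipation rather than mere reaction is by timing recurring gesture peaks (for example, the extremes of a player's arm motion as reported by a tracker) and extrapolating when the next beat will land. The sketch below is a minimal, illustrative example of that idea in plain Python; the function names and the median-interval heuristic are assumptions for illustration, not the chapter's actual implementation.

```python
# Hypothetical sketch: predict the next beat from the timestamps (seconds)
# of visually tracked gesture peaks, so a robot can prepare its motion in
# advance instead of reacting after the beat has passed.

def estimate_period(peak_times):
    """Median inter-peak interval; the median resists occasional
    mis-detected peaks better than the mean."""
    intervals = sorted(b - a for a, b in zip(peak_times, peak_times[1:]))
    mid = len(intervals) // 2
    if len(intervals) % 2:
        return intervals[mid]
    return 0.5 * (intervals[mid - 1] + intervals[mid])

def predict_next_beat(peak_times):
    """Extrapolate one period past the most recent observed peak."""
    return peak_times[-1] + estimate_period(peak_times)

# Example: peaks roughly every half second; the robot can schedule its
# strike for the predicted time rather than waiting to see the gesture.
peaks = [0.0, 0.5, 1.0, 1.52]
next_beat = predict_next_beat(peaks)
```

In practice the peak timestamps would come from a tracker such as pyramidal Lucas-Kanade optical flow over features on the performer's body, and the prediction would be refreshed as each new peak arrives.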
© 2020 Springer Nature Switzerland AG
Cite this chapter
Weinberg, G., Bretan, M., Hoffman, G., Driscoll, S. (2020). “Watch and Learn”—Computer Vision for Musical Gesture Analysis. In: Robotic Musicianship. Automation, Collaboration, & E-Services, vol 8. Springer, Cham. https://doi.org/10.1007/978-3-030-38930-7_6
Print ISBN: 978-3-030-38929-1
Online ISBN: 978-3-030-38930-7