
“Be Social”—Embodied Human-Robot Musical Interactions

Robotic Musicianship

Part of the book series: Automation, Collaboration, & E-Services (ACES, volume 8)

Abstract

Embodiment has a significant effect on social human-robot interaction, from enabling fluent turn-taking between humans and robots [1] to fostering humans’ positive perception of robotic conversants [2]. In Robotic Musicianship, embodiment and gestural musical interaction can provide social benefits that are not available with standard computer-based interactive music [3, 4].


Notes

  1. For reference, the motor noise peaked at 51.3 dBA, measured 1.5 m horizontally from and 1.5 m above the center of the robot’s base using a calibrated Apex 435 condenser microphone. Under the same conditions with the motors off, the ambient noise in the room was 42.5 dBA.
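
Because dBA values are logarithmic, the motors’ own contribution is not the arithmetic difference of the two readings; the ambient energy has to be removed in the linear power domain. A minimal sketch of this standard background correction (the function is illustrative, not from the chapter):

```python
import math

def background_corrected_level(total_dba: float, ambient_dba: float) -> float:
    """Estimate a source's own sound level from a measurement taken with
    the source active (total) and one taken without it (ambient)."""
    # Convert both readings from decibels to linear power, subtract the
    # ambient energy, and convert the remainder back to decibels.
    total_power = 10 ** (total_dba / 10)
    ambient_power = 10 ** (ambient_dba / 10)
    return 10 * math.log10(total_power - ambient_power)

# Values from the note: 51.3 dBA with motors running, 42.5 dBA ambient.
print(f"{background_corrected_level(51.3, 42.5):.1f} dBA")  # motors alone: ~50.7 dBA
```

With the ambient floor nearly 9 dB below the total, the correction is small: the motors alone account for roughly 50.7 dBA.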

References

  1. Kidd, Cory D., and Cynthia Breazeal. 2004. Effect of a robot on user perceptions. In Proceedings of the 2004 IEEE/RSJ international conference on intelligent robots and systems (IROS 2004), vol. 4, 3559–3564. IEEE.

  2. Bainbridge, Wilma A., Justin Hart, Elizabeth S. Kim, and Brian Scassellati. 2008. The effect of presence on human-robot interaction. In The 17th IEEE international symposium on robot and human interactive communication (RO-MAN 2008), 701–706. IEEE.

  3. Weinberg, Gil, and Scott Driscoll. 2006. Toward robotic musicianship. Computer Music Journal 30 (4): 28–45.

  4. Weinberg, Gil, Andrew Beck, and Mark Godfrey. 2009. Zoozbeat: A gesture-based mobile music studio.

  5. Weinberg, Gil. 2005. Interconnected musical networks: Toward a theoretical framework. Computer Music Journal 29 (2): 23–39.

  6. Weinberg, Gil. 1999. Expressive digital musical instruments for children. PhD thesis, Massachusetts Institute of Technology.

  7. Weinberg, Gil, Scott Driscoll, and Travis Thatcher. 2006. Jam’aa—a middle eastern percussion ensemble for human and robotic players. In International computer music conference, 464–467.

  8. Luck, Geoff, and John A. Sloboda. 2009. Spatio-temporal cues for visually mediated synchronization. Music Perception: An Interdisciplinary Journal 26 (5): 465–473.

  9. Repp, Bruno H., and Amandine Penel. 2004. Rhythmic movement is attracted more strongly to auditory than to visual rhythms. Psychological Research 68 (4): 252–270.

  10. Povel, Dirk-Jan, and Peter Essens. 1985. Perception of temporal patterns. Music Perception: An Interdisciplinary Journal 2 (4): 411–440.

  11. Müller, Meinard. 2007. Dynamic time warping. In Information retrieval for music and motion, 69–84. Springer.

  12. Komatsu, Tomoaki, and Yoshihiro Miyake. 2004. Temporal development of dual timing mechanism in synchronization tapping task. In RO-MAN 2004: 13th IEEE international workshop on robot and human interactive communication, 181–186. IEEE.

  13. Crick, Christopher, Matthew Munz, and Brian Scassellati. 2006. Synchronization in social tasks: Robotic drumming. In RO-MAN 2006: The 15th IEEE international symposium on robot and human interactive communication, 97–102. IEEE.

  14. Inderbitzin, Martin, Aleksander Väljamäe, José Maria Blanco Calvo, Paul F. M. J. Verschure, and Ulysses Bernardet. 2011. Expression of emotional states during locomotion based on canonical parameters. In Ninth IEEE international conference on automatic face and gesture recognition (FG 2011), Santa Barbara, CA, 809–814. IEEE.

  15. Aviezer, Hillel, Yaacov Trope, and Alexander Todorov. 2012. Body cues, not facial expressions, discriminate between intense positive and negative emotions. Science 338 (6111): 1225–1229.

  16. de Gelder, Beatrice. 2006. Towards the neurobiology of emotional body language. Nature Reviews Neuroscience 7: 242–249.

  17. Dael, Nele, Marcello Mortillaro, and Klaus R. Scherer. 2012. The body action and posture coding system (BAP): Development and reliability. Journal of Nonverbal Behavior 36: 97–121.

  18. Coulson, Mark. 2004. Attributing emotion to static body postures: Recognition accuracy, confusions, and viewpoint dependence. Journal of Nonverbal Behavior 28 (2): 117–139.

  19. Krauss, Robert M., Palmer Morrel-Samuels, and Christina Colasante. 1991. Do conversational hand gestures communicate? Journal of Personality and Social Psychology 61 (5): 743.

  20. Kipp, Michael, and J.-C. Martin. 2009. Gesture and emotion: Can basic gestural form features discriminate emotions? In 3rd international conference on affective computing and intelligent interaction and workshops (ACII 2009), 1–8. IEEE.

  21. Picard, Rosalind W. 1995. Affective computing.

  22. Frijda, N.H. 1987. The emotions. London: Cambridge University Press.

  23. Kozima, Hideki, and Hiroyuki Yano. 2001. In search of ontogenetic prerequisites for embodied social intelligence. In Proceedings of the workshop on emergence and development of embodied cognition; international conference on cognitive science, 30–34.

  24. Breazeal, Cynthia, and Lijin Aryananda. 2002. Recognition of affective communicative intent in robot-directed speech. Autonomous Robots 12 (1): 83–104.

  25. Castellano, Ginevra, Iolanda Leite, André Pereira, Carlos Martinho, Ana Paiva, and Peter W. McOwan. 2010. Affect recognition for interactive companions: Challenges and design in real world scenarios. Journal on Multimodal User Interfaces 3 (1): 89–98.

  26. Scheutz, Matthias, Paul Schermerhorn, and James Kramer. 2006. The utility of affect expression in natural language interactions in joint human-robot tasks. In Proceedings of the 1st ACM SIGCHI/SIGART conference on human-robot interaction, 226–233. ACM.

  27. Devillers, Laurence, Laurence Vidrascu, and Lori Lamel. 2005. Challenges in real-life emotion annotation and machine learning based detection. Neural Networks 18 (4): 407–422.

  28. Mehrabian, Albert. 1996. Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Current Psychology 14 (4): 261–292.

  29. Russell, James A. 2009. Emotion, core affect, and psychological construction. Cognition and Emotion 23 (7): 1259–1283.

  30. Lindquist, Kristen A., Tor D. Wager, Hedy Kober, Eliza Bliss-Moreau, and Lisa Feldman Barrett. 2012. The brain basis of emotion: A meta-analytic review. Behavioral and Brain Sciences 35 (3): 121–143.

  31. Vytal, Katherine, and Stephan Hamann. 2010. Neuroimaging support for discrete neural correlates of basic emotions: A voxel-based meta-analysis. Journal of Cognitive Neuroscience 22 (12): 2864–2885.

  32. Hamann, Stephan. 2012. Mapping discrete and dimensional emotions onto the brain: Controversies and consensus. Trends in Cognitive Sciences.

  33. Colombetti, Giovanna. 2009. From affect programs to dynamical discrete emotions. Philosophical Psychology 22 (4): 407–425.

  34. Barrett, Lisa Feldman, Maria Gendron, and Yang-Ming Huang. 2009. Do discrete emotions exist? Philosophical Psychology 22 (4): 427–437.

  35. Lasseter, John. 1987. Principles of traditional animation applied to 3D computer animation. Computer Graphics 21 (4): 35–44.

  36. Gielniak, Michael J., and Andrea L. Thomaz. 2011. Anticipation in robot motion.

  37. Cassell, Justine, Tim Bickmore, Lee Campbell, and Hannes Vilhjálmsson. 2000. Designing embodied conversational agents. Embodied Conversational Agents 29.

  38. Nayak, Vishal, and Matthew Turk. 2005. Emotional expression in virtual agents through body language. Advances in Visual Computing 313–320.

  39. Salem, Maha, Stefan Kopp, Ipke Wachsmuth, and Frank Joublin. 2010. Generating robot gesture using a virtual agent framework. In 2010 IEEE/RSJ international conference on intelligent robots and systems (IROS), 3592–3597. IEEE.

  40. Riek, Laurel D., T.-C. Rabinowitch, Paul Bremner, Anthony G. Pipe, Mike Fraser, and Peter Robinson. 2010. Cooperative gestures: Effective signaling for humanoid robots. In 2010 5th ACM/IEEE international conference on human-robot interaction (HRI), 61–68. IEEE.

  41. Moon, A.J., Chris A.C. Parker, Elizabeth A. Croft, and H.F. Van der Loos. 2013. Design and impact of hesitation gestures during human-robot resource conflicts. Journal of Human-Robot Interaction 2 (3): 18–40.

  42. Salem, Maha, Stefan Kopp, Ipke Wachsmuth, Katharina Rohlfing, and Frank Joublin. 2012. Generation and evaluation of communicative robot gesture. International Journal of Social Robotics 4 (2): 201–217.

  43. Breazeal, Cynthia, Andrew Wang, and Rosalind Picard. 2007. Experiments with a robotic computer: Body, affect and cognition interactions. In 2007 2nd ACM/IEEE international conference on human-robot interaction (HRI), 153–160. IEEE.

  44. Hoffman, Guy, and Cynthia Breazeal. 2008. Anticipatory perceptual simulation for human-robot joint practice: Theory and application study. In Proceedings of the 23rd national conference on artificial intelligence—Volume 3, AAAI’08, 1357–1362. AAAI Press.

  45. Michalowski, Marek P., Selma Sabanovic, and Hideki Kozima. 2007. A dancing robot for rhythmic social interaction. In 2007 2nd ACM/IEEE international conference on human-robot interaction (HRI), 89–96. IEEE.

  46. Monceaux, Jérôme, Joffrey Becker, Céline Boudier, and Alexandre Mazel. 2009. Demonstration: First steps in emotional expression of the humanoid robot Nao. In Proceedings of the 2009 international conference on multimodal interfaces, 235–236. ACM.

  47. Grunberg, David K., Alyssa M. Batula, Erik M. Schmidt, and Youngmoo E. Kim. 2012. Synthetic emotions for humanoids: Perceptual effects of size and number of robot platforms. International Journal of Synthetic Emotions (IJSE) 3 (2): 68–83.

  48. Kidd, Cory David. 2003. Sociable robots: The role of presence and task in human-robot interaction. PhD thesis, Massachusetts Institute of Technology.

  49. Walters, Michael L., Kerstin Dautenhahn, René Te Boekhorst, Kheng Lee Koay, Dag Sverre Syrdal, and Chrystopher L. Nehaniv. 2009. An empirical framework for human-robot proxemics. In Proceedings of New Frontiers in Human-Robot Interaction.

  50. Takayama, Leila, and Caroline Pantofaru. 2009. Influences on proxemic behaviors in human-robot interaction. In IEEE/RSJ international conference on intelligent robots and systems (IROS 2009), 5495–5502. IEEE.

  51. Mead, Ross, Amin Atrash, and Maja J. Mataric. 2011. Recognition of spatial dynamics for predicting social interaction. In Proceedings of the 6th international conference on human-robot interaction, 201–202. ACM.

  52. Breazeal, C. 2003. Emotion and sociable humanoid robots. International Journal of Human-Computer Studies 59: 119–155.

  53. Velásquez, Juan D. 1997. Modeling emotions and other motivations in synthetic agents. In Proceedings of the national conference on artificial intelligence, 10–15. Citeseer.

  54. Xia, Guangyu, Roger Dannenberg, Junyun Tay, and Manuela Veloso. 2012. Autonomous robot dancing driven by beats and emotions of music. In Proceedings of the 11th international conference on autonomous agents and multiagent systems—Volume 1, AAMAS ’12, 205–212. Richland, SC: International Foundation for Autonomous Agents and Multiagent Systems.

  55. Traue, Harald C., Frank Ohl, André Brechmann, Friedhelm Schwenker, Henrik Kessler, Kerstin Limbrecht, Holger Hoffmann, Stefan Scherer, Michael Kotzyba, Andreas Scheck, et al. 2013. A framework for emotions and dispositions in man-companion interaction. Coverbal Synchrony in Human-Machine Interaction, 99.

  56. Frijda, N.H. 1995. Emotions in robots. In Comparative approaches to cognitive science, ed. H.L. Roitblat and J.-A. Meyer, 501–516.

  57. Rolls, E.T. 2005. Emotion explained. Oxford: Oxford University Press.

  58. Wallbott, H.G. 1998. Bodily expression of emotion. European Journal of Social Psychology 28 (6): 879–896.

  59. Darwin, Charles. 1916. The expression of the emotions in man and animals. New York: D. Appleton and Co. http://www.biodiversitylibrary.org/bibliography/4820.

  60. Sullivan, Jean, Linda A. Camras, and George Michel. 1993. Do infants express discrete emotions? Adult judgments of facial, vocal, and body actions. Journal of Nonverbal Behavior 17: 171–186.

  61. The Echo Nest. http://echonest.com/, 2014.

  62. Ghias, Asif, Jonathan Logan, David Chamberlin, and Brian C. Smith. 1995. Query by humming: Musical information retrieval in an audio database. In Proceedings of the third ACM international conference on multimedia, 231–236. ACM.

  63. Shi, Jianbo, and Carlo Tomasi. 1994. Good features to track. In 1994 IEEE computer society conference on computer vision and pattern recognition (CVPR’94), 593–600. IEEE.

  64. Hoffman, Guy. 2012. Dumb robots, smart phones: A case study of music listening companionship. In 2012 IEEE RO-MAN, 358–363. IEEE.

  65. Puckette, Miller S., Theodore Apel, et al. 1998. Real-time audio analysis tools for Pd and MSP.

  66. Davies, Matthew E.P., and Mark D. Plumbley. 2004. Causal tempo tracking of audio. In ISMIR.

  67. Sun, Sisi, Trishul Mallikarjuna, and Gil Weinberg. Effect of visual cues in synchronization of rhythmic patterns.

  68. Albin, Aaron, S. Lee, and Parag Chordia. 2011. Visual anticipation aids in synchronization tasks. In Proceedings of the society for music perception and cognition (SMPC).

  69. Burkhardt, Felix. 2005. Emofilt: The simulation of emotional speech by prosody-transformation. In INTERSPEECH, 509–512.


Author information

Correspondence to Gil Weinberg.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Weinberg, G., Bretan, M., Hoffman, G., Driscoll, S. (2020). “Be Social”—Embodied Human-Robot Musical Interactions. In: Robotic Musicianship. Automation, Collaboration, & E-Services, vol 8. Springer, Cham. https://doi.org/10.1007/978-3-030-38930-7_5


  • DOI: https://doi.org/10.1007/978-3-030-38930-7_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-38929-1

  • Online ISBN: 978-3-030-38930-7

  • eBook Packages: Engineering, Engineering (R0)
