
Conveying Audience Emotions Through Humanoid Robot Gestures to an Orchestra During a Live Musical Exhibition

  • Conference paper
  • In: Complex, Intelligent, and Software Intensive Systems (CISIS 2017)

Abstract

In the last twenty years, robotics has been applied in many heterogeneous contexts. Among them, the use of humanoid robots during musical concerts has been proposed and investigated by many authors. In this paper, we propose a contribution in the area of robotics applications in music: a system for conveying audience emotions to an orchestra during a live musical exhibition by means of a humanoid robot. In particular, we provide all spectators with a mobile app through which they can select a specific color while listening to a piece of music (act). Each color is mapped to an emotion, and the audience preferences are then processed in order to select the next act to be played. This decision, based on the overall emotion felt by the audience, is communicated by the robot to the orchestra through body gestures. Our first results show that spectators enjoy this kind of interactive musical performance; these results are encouraging for further investigation.
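The abstract describes a simple pipeline: spectators vote with a color, each color maps to an emotion, the votes are aggregated, and the dominant emotion determines the next act. The sketch below illustrates that aggregation step only; the specific colors, emotions, act names, and the select_next_act helper are hypothetical assumptions for illustration, not the mapping or aggregation rule actually used by the authors.

```python
from collections import Counter

# Hypothetical color -> emotion mapping (assumed, not from the paper).
COLOR_TO_EMOTION = {
    "red": "anger",
    "yellow": "joy",
    "blue": "sadness",
    "green": "calm",
}

# Hypothetical emotion -> next-act mapping (assumed, not from the paper).
EMOTION_TO_ACT = {
    "anger": "act_energetic",
    "joy": "act_upbeat",
    "sadness": "act_melancholic",
    "calm": "act_slow",
}

def select_next_act(color_votes):
    """Map each spectator's color vote to an emotion, take the majority
    emotion, and return the act associated with that emotion."""
    emotions = [COLOR_TO_EMOTION[c] for c in color_votes if c in COLOR_TO_EMOTION]
    if not emotions:
        return None
    dominant_emotion, _ = Counter(emotions).most_common(1)[0]
    return EMOTION_TO_ACT[dominant_emotion]

# Example: votes collected from the mobile app during one act.
votes = ["red", "blue", "blue", "green", "blue", "yellow"]
print(select_next_act(votes))  # -> "act_melancholic"
```

In this sketch a simple majority decides the overall audience emotion; any other aggregation rule (e.g. weighting by valence/arousal) could be substituted without changing the overall pipeline.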



Acknowledgements

The authors wish to thank the musical ensemble members, the students Corsello and Caravello, and the professors Betta, Correnti, D’Aquila and Rapisarda from the Conservatorio di Musica “Vincenzo Bellini” di Palermo for their fundamental contribution to the realization of the musical performance and for their invaluable open-mindedness.

Author information


Corresponding author

Correspondence to Marcello Giardina.


Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Giardina, M. et al. (2018). Conveying Audience Emotions Through Humanoid Robot Gestures to an Orchestra During a Live Musical Exhibition. In: Barolli, L., Terzo, O. (eds) Complex, Intelligent, and Software Intensive Systems. CISIS 2017. Advances in Intelligent Systems and Computing, vol 611. Springer, Cham. https://doi.org/10.1007/978-3-319-61566-0_24


  • DOI: https://doi.org/10.1007/978-3-319-61566-0_24


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-61565-3

  • Online ISBN: 978-3-319-61566-0

  • eBook Packages: Engineering, Engineering (R0)
