Abstract
Over the last twenty years, robotics has been applied in many heterogeneous contexts. Among these, the use of humanoid robots during musical concerts has been proposed and investigated by several authors. In this paper, we contribute to the application of robotics in music with a system for conveying audience emotions to an orchestra during a live musical exhibition, by means of a humanoid robot. Specifically, we provide all spectators with a mobile app through which they can select a specific color while listening to a piece of music (an act). Each color is mapped to an emotion, and the audience's preferences are then processed to select the next act to be played. This decision, based on the overall emotion felt by the audience, is communicated by the robot to the orchestra through body gestures. Our first results show that spectators enjoy this kind of interactive musical performance, and they are encouraging for further investigation.
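The vote-aggregation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the specific color-to-emotion and emotion-to-act mappings (`COLOR_TO_EMOTION`, `EMOTION_TO_ACT`) and the act names are hypothetical placeholders, since the abstract does not specify them.

```python
from collections import Counter

# Hypothetical color-to-emotion mapping; the paper's actual mapping
# is not given in the abstract.
COLOR_TO_EMOTION = {
    "red": "excitement",
    "blue": "calm",
    "yellow": "joy",
    "black": "sadness",
}

# Hypothetical mapping from the dominant audience emotion to the
# next act the orchestra should play.
EMOTION_TO_ACT = {
    "excitement": "allegro_act",
    "calm": "adagio_act",
    "joy": "scherzo_act",
    "sadness": "largo_act",
}

def select_next_act(color_votes):
    """Aggregate spectators' color choices and pick the next act.

    Each vote is a color string submitted via the mobile app; the
    dominant mapped emotion determines the act returned.
    """
    emotions = Counter(
        COLOR_TO_EMOTION[c] for c in color_votes if c in COLOR_TO_EMOTION
    )
    if not emotions:
        return None  # no valid votes received
    dominant_emotion, _ = emotions.most_common(1)[0]
    return EMOTION_TO_ACT[dominant_emotion]

# Example: three "red" votes dominate, so the excitement act is chosen.
print(select_next_act(["red", "blue", "red", "yellow", "red"]))  # allegro_act
```

In the system described by the paper, the selected act would then be communicated to the orchestra through the humanoid robot's body gestures rather than printed.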
Acknowledgements
The authors wish to thank the members of the musical ensemble, the students Corsello and Caravello, and the professors Betta, Correnti, D’Aquila and Rapisarda from the Conservatorio di Musica “Vincenzo Bellini” di Palermo for their fundamental contribution to the realization of the musical performance and for their invaluable open-mindedness.
Copyright information
© 2018 Springer International Publishing AG
About this paper
Cite this paper
Giardina, M. et al. (2018). Conveying Audience Emotions Through Humanoid Robot Gestures to an Orchestra During a Live Musical Exhibition. In: Barolli, L., Terzo, O. (eds) Complex, Intelligent, and Software Intensive Systems. CISIS 2017. Advances in Intelligent Systems and Computing, vol 611. Springer, Cham. https://doi.org/10.1007/978-3-319-61566-0_24
DOI: https://doi.org/10.1007/978-3-319-61566-0_24
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-61565-3
Online ISBN: 978-3-319-61566-0
eBook Packages: Engineering (R0)