Journal on Multimodal User Interfaces, Volume 3, Issue 1–2, pp 109–118

Communication of musical expression by means of mobile robot gestures

  • Birgitta Burger
  • Roberto Bresin

Original Paper


Abstract

We developed a robotic system that can behave in an emotional way. We designed a simple three-wheeled robot with limited degrees of freedom. Our goal was to make the robot display emotions in music performance by performing expressive movements. These movements were compiled and programmed on the basis of literature on emotion in music, on musicians' movements in expressive performances, and on object shapes that convey different emotional intentions. The emotions happiness, anger, and sadness were implemented in this way. Overall, results from behavioral experiments show that emotional intentions can be synthesized, displayed, and communicated by an artificial creature, even under constrained circumstances.

Keywords: Musical robotics · Emotion · Movement
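The emotion-to-movement mapping described in the abstract can be illustrated with a small sketch. The parameter names, numeric values, and the `wheel_speeds` helper below are hypothetical, loosely following broad cues reported in the expressive-movement literature (fast and regular for happiness, fast and jerky for anger, slow and small for sadness); they are not the authors' actual implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GestureProfile:
    """Movement parameters for one emotional intention (hypothetical values)."""
    speed: float        # forward speed, as a fraction (0..1) of maximum
    regularity: float   # 1.0 = smooth, regular path; 0.0 = jerky, irregular
    amplitude: float    # overall size of the gesture, 0..1

# Illustrative mapping only: the cue directions follow the emotion-in-music
# and expressive-movement literature; the numbers themselves are invented.
PROFILES = {
    "happiness": GestureProfile(speed=0.8, regularity=0.9, amplitude=0.8),
    "anger":     GestureProfile(speed=0.9, regularity=0.2, amplitude=0.7),
    "sadness":   GestureProfile(speed=0.2, regularity=0.8, amplitude=0.3),
}

def wheel_speeds(emotion: str, base_rpm: float = 100.0) -> tuple[float, float]:
    """Translate an emotion label into left/right wheel speeds for a
    differential-drive robot (hypothetical helper, not the paper's code)."""
    p = PROFILES[emotion]
    turn = (1.0 - p.regularity) * 0.5   # irregular motion -> more turning
    left = base_rpm * p.speed * (1.0 + turn)
    right = base_rpm * p.speed * (1.0 - turn)
    return left, right
```

Under this sketch, "anger" yields fast, asymmetric (jerky) wheel speeds, while "sadness" yields slow, nearly symmetric ones, matching the qualitative cues the paper draws from the literature.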



Supplementary material

Electronic supplementary material 1 (MOV 12.2 MB)

Electronic supplementary material 2 (MOV 11.5 MB)

Electronic supplementary material 3 (MOV 21.6 MB)



Copyright information

© OpenInterface Association 2009

Authors and Affiliations

  1. Finnish Centre of Excellence in Interdisciplinary Music Research, Department of Music, University of Jyväskylä, Jyväskylä, Finland
  2. CSC School of Computer Science and Communication, Dept. of Speech, Music and Hearing (TMH), KTH Royal Institute of Technology, Stockholm, Sweden
