Mediated Musical Interactions in Virtual Environments

  • Rob Hamilton
Part of the Springer Series on Cultural Computing book series (SSCC)


Musical interactions performed within virtual spaces are mediated by the rules and realities inherent to each environment itself. Control interactions in physical space that drive actor motion and gesture in virtual spaces pass through a layer of influence dictated by the environment—as well as by any additional actors or processes existing within that environment—before being mapped to parameters of sonic and musical creation and control. Such mediation layers are themselves intrinsic attributes of musical works built within game and virtual environments and play fundamental roles in the sonic and musical experiences realized in such spaces. These mediated musical interactions and the interfaces that support them should be considered and approached as a distinct form of musical interaction with their own performance practices, approaches and taxonomies. This chapter explores the notion of musical mediation as a composed and designed attribute of real-time virtual musical performance environments and describes three such environments designed to act as mediation layers upon the musical performance gestures and intentions carried out within them.
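The pipeline described above—physical control gesture, passing through an environment-imposed mediation layer, then mapped to synthesis parameters—can be sketched in code. This is a minimal illustration only: the class and parameter names (`Gesture`, `MediationLayer`, the damping and pitch ranges) are hypothetical and not drawn from the chapter's systems, which use game engines and OSC messaging rather than this toy model.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    """A control gesture captured in physical space (values normalized 0..1)."""
    velocity: float   # speed of the performer's motion
    height: float     # vertical position of the avatar in the virtual world

class MediationLayer:
    """Applies environment rules to a gesture before it reaches synthesis.

    Here the 'environment' imposes a gravity-like damping on velocity and
    clamps height to the world's bounds, standing in for the in-world
    physics and rules that mediate performer intention.
    """
    def __init__(self, damping: float = 0.8):
        self.damping = damping

    def mediate(self, g: Gesture) -> Gesture:
        return Gesture(
            velocity=g.velocity * self.damping,
            height=min(max(g.height, 0.0), 1.0),
        )

def map_to_synthesis(g: Gesture) -> dict:
    """Map a mediated gesture onto sonic parameters."""
    return {
        "amplitude": g.velocity,                   # faster motion -> louder
        "frequency_hz": 220.0 + 660.0 * g.height,  # higher in space -> higher pitch
    }

layer = MediationLayer(damping=0.5)
params = map_to_synthesis(layer.mediate(Gesture(velocity=1.0, height=1.5)))
print(params)  # {'amplitude': 0.5, 'frequency_hz': 880.0}
```

The point of the sketch is that the performer's raw intention (full velocity, out-of-bounds height) never reaches the synthesizer directly: the environment's rules reshape it first, and that reshaping is itself a composable, designable attribute of the work.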



This work was supported in part by generous hardware grants from Nvidia and Leap Motion.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Rensselaer Polytechnic Institute, Troy, USA