Abstract
The systematic arrangement of sound in space is widely considered an important compositional design category of Western art music and acoustic media art in the 20th century. Much attention has been paid to artistic concepts of sound in space and its reproduction through loudspeaker systems; far less has been devoted to live-interactive practices and tools for spatialisation as performance practice. As a contribution to this topic, the present study conducts an inventory of controllers for the real-time spatialisation of sound as part of musical performances, and classifies them both along different interface paradigms and according to their scope of spatial control. By means of a literature study, we identified 31 different spatialisation interfaces presented to the public in the context of artistic performances or at relevant conferences on the subject. Considering that only a small proportion of these interfaces combine spatialisation and sound production, it seems that in most cases the projection of sound in space is not delegated to a musical performer but regarded as a compositional problem or as a separate performative dimension. With the exception of the mixing desk and its fader-board paradigm, as used for the performance of acousmatic music with loudspeaker orchestras, all devices are individual design solutions developed for a specific artistic context. We conclude that if controllers for sound spatialisation are to be perceived as musical instruments in a narrow sense, meeting certain aspects of instrumentality, immediacy, liveness, and learnability, new design strategies will be required.
Notes
1. Accordingly, the term spatial music was coined to highlight electroacoustic compositions in which the dynamic projection of sound sources is an integral part of the compositional process. While the practice of spatialisation can be applied to any kind of spatial sound projection, it mainly refers to the field of electroacoustic music.
2. Sound diffusion originally refers to the live presentation of acousmatic music, a form of electroacoustic music composed for (multiples of) loudspeakers, using recorded sound material out of its original context. Interestingly, sound diffusion as performance practice is conceptually related to one specific control interface: the fader board of the mixing desk (see our taxonomy).
3. Since we cannot address the technical principles of sound field synthesis here, the reader may refer to Geier et al. (2010) for further details on wave field synthesis, ambisonics techniques and recent stereophonic panning methods.
4. The transition from amplitude panning techniques to methods of sound field synthesis represents a paradigm shift in sound spatialisation (Geier et al. 2010): from a channel-based approach (controlling a single channel assigned to one loudspeaker) to an object-based approach (controlling a sound object in space).
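The channel-based paradigm mentioned above can be illustrated with a minimal sketch (hypothetical code, not drawn from the cited sources): constant-power amplitude panning between two loudspeakers, where a single control value directly sets the gain of each channel rather than describing a sound object in space.

```python
import math

def constant_power_pan(sample: float, pan: float) -> tuple[float, float]:
    """Distribute a mono sample across two loudspeakers using the
    standard constant-power (sine/cosine) pan law.

    pan: 0.0 = hard left, 1.0 = hard right.
    The squared gains always sum to 1, so the perceived loudness stays
    roughly constant while the phantom source moves between speakers.
    """
    theta = pan * math.pi / 2           # map [0, 1] onto [0, pi/2]
    left_gain = math.cos(theta)
    right_gain = math.sin(theta)
    return sample * left_gain, sample * right_gain

# Centre position: both channels receive ~0.707 (-3 dB).
left, right = constant_power_pan(1.0, 0.5)
```

An object-based system, by contrast, would take a position in space as input and derive all loudspeaker gains (or driving signals) from a rendering model such as VBAP, ambisonics or wave field synthesis.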
5. For a comprehensive review of spectral spatialisation techniques, see Jaroszewicz (2015).
6. It might seem paradoxical to include stochastic processes in a category mainly defined by determinate characteristics; however, they are grouped here due to their decreased real-time controllability in terms of exact spatial deployment.
7. This category may also include mapping strategies in which the synthesis process of the sound material directly affects its spatialisation, in contrast to the static spatialisation process of fixed audio material in the first category.
8. One can consider Stockhausen's Rotationstisch (a loudspeaker mounted on a rotating turntable system) a typical tool for spatial studio composition (Brech 2015). The spatialisation system used by Chowning to realise his simulation of moving sound sources (Chowning 1971) represents a typical studio approach; at the same time, it was clearly limited by the processing performance of the 1970s (Zvonar 2000).
9. There is consensus that Music for Solo Performer (1965) by Alvin Lucier, scored for "enormously amplified brainwaves and percussion", was the first composition to make use of a biofeedback interface, controlling percussion instruments through the resonance of the performer's brain activity (Miranda and Wanderley 2006). Several further artistic experiments using biofeedback interfaces have followed. Refer to Miranda and Castet (2014) for a comprehensive review of brain-related interfaces.
10. It remains a matter of ongoing discourse whether certain kinds of production or reproduction devices (the record player or the mixing desk, for instance) can be considered musical instruments. See Hardjowirogo (this volume) for a thorough discussion of musical instrument identity issues.
11. The exact figure varies between 31 and 38, depending on how different versions or parallel developments of essentially the same spatialisation instrument are counted. In the following, we will use the minimal sample size for the sake of simplicity.
12. Again, the question might arise whether this gestural interface can be considered an augmented instrument, linked to the discourse on whether a DJ turntable represents a musical instrument. At this point, we refrain from commenting on this topic by using the term augmented controller, in reference to a well-established control interface for musical performances.
13. The pupitre d'espace is a further development of a controller introduced in 1951 as the pupitre potentiométrique de relief. The device had the same functionality but worked by means of three wires linked to potentiometers that adjusted the signal level sent to each loudspeaker (Battier 2015, 127).
References
Baalman, Marije A. J. (2010). Spatial composition techniques and sound spatialisation technologies. Organised Sound, 15(3), 209–218. doi:10.1017/S1355771810000245
Battier, M. (2015). Recent discoveries in the spatial thought of early musique concrète. In M. Brech & R. Paland (Eds.), Compositions for audible space. The early electroacoustic music and its contexts. Music and sound culture (pp. 123–36). Columbia: Transcript Verlag.
Behrman, D. (2016). Personal interview with David Behrman in Berlin, May 14, 2016.
Bernardini, N. (1989). Trails: An interactive system for sound location. Ann Arbor, MI: Michigan Publishing, University of Michigan Library.
Birnbaum, D., Fiebrink, R., Malloch, J., & Wanderley, M. M. (2005). Towards a dimension space for musical devices. In Proceedings of the 2005 Conference on New Interfaces for Musical Expression (pp. 192–95). Singapore: National University of Singapore.
Blauert, J. (1997). Spatial hearing: The psychophysics of human sound localization. Cambridge: MIT Press.
Bokowiec, M. A. (2011). VOCT (Ritual): An interactive vocal work for bodycoder system and 8 channel spatialization. In Proceedings of the International Conference on New Interfaces for Musical Expression.
Brech, M. (2015). Der Hörbare Raum Entdeckung, Erforschung Und Musikalische Gestaltung Mit Analoger Technologie. Columbia: Transcript Verlag.
Brech, M., & Paland, R. (Eds.). (2015). Compositions for audible space. The early electroacoustic music and its contexts. Music and sound culture. Columbia: Transcript Verlag.
Brech, M., & von Coler, H. (2015). Aspects of space in luigi nono’s prometeo and the use of the Halaphon. In M. Brech & R. Paland (Eds.), Compositions for audible space. The early electroacoustic music and its contexts. Music and sound culture (pp. 193–204). Columbia: Transcript Verlag.
Bredies, K., Alexander Mann, N., Ahrens, J., Geier, M., Spors, S., & Nischt, M. (2008). The multi-touch sound scape renderer. In Proceedings of the working conference on advanced visual interfaces (pp. 466–469). AVI ’08. New York, NY, USA: ACM. doi:10.1145/1385569.1385660
Brown, K., Alcorn, M., & Rebelo, P. (2005). Sound diffusion using hand-held light-emitting pen controllers. In Proceedings of the International Computer Music Conference.
Caramiaux, B., Fdili Alaoui, S., Bouchara, T., Parseihian, G., & Rébillat, M. (2011). Gestural auditory and visual interactive platform. In 14th International conference on digital audio effects (DAFx-11), 69.
Carlson, C., Marschner, E., & Mccurry, H. (2011). The sound flinger: A haptic spatializer. In Proceedings ICMC 2011.
Chowning, J. M. (1971). The simulation of moving sound sources. Journal of the Audio Engineering Society, 19(1), 2–6.
Clozier, C. (2001). The gmebaphone concept and the cybernéphone instrument. Computer Music Journal, 25(4), 81–90.
Copeland, D. (2014). The NAISA spatialization system. April. http://www.darrencopeland.net/web2/?page_id=400
Diatkine, C., Bertet, S., & Ortiz, M. (2015). Towards the holistic spatialization of multiple sound sources in 3d, implementation using ambisonics to binaural technique. In Proceedings.
Fedorkow, G., Buxton, W., & Smith, K. C. (1978). A computer-controlled sound distribution system for the performance of electroacoustic music. Computer Music Journal, 33–42.
Ferguson, P. (2010). Development of a 3D audio panning and realtime visualisation toolset using emerging technologies. Ph.d. thesis, Edinburgh Napier.
Fohl, W., & Nogalski, M. (2013). A gesture control interface for a wave field synthesis system. In Proceedings of NIME 2013, 341–346.
Franco, S. (1974). Hardware design of a real-time musical system. Champaign: University of Illinois at Urbana-Champaign.
Geier, M., Spors, S., & Weinzierl, S. (2010). The future of audio reproduction: Technology—formats—applications. In M. Detyniecki, U. Leiner, & A. Nürnberger (Eds.), Adaptive multimedia retrieval. Identifying, summarizing, and recommending image and music (pp. 1–17). Springer.
Gertich, F., Gerlach, J., & Föllmer, G. (1996). Musik, Verwandelt: Das Elektronische Studio Der TU Berlin 1953–1995. Wolke.
Harada, T., Sato, A., Hashimoto, S., & Ohteru, S. (1992). Real time control of 3D sound space by gesture. In International Computer Music Conference Proceedings.
Harrison, J. (1999). Diffusion: Theories and practices, with particular reference to the BEAST system. eContact, 2.
Holmes, T. (2012). Electronic and experimental music: Technology, music, and culture. Routledge.
Jaroszewicz, M. (2015). Compositional strategies in spectral spatialization. California: University of California Riverside.
Johnson, B., & Kapur, A. (2013). Multi-touch interfaces for phantom source positioning in live sound diffusion. In W. S. Yeo, K. Lee, A. Sigman, H. Ji, & G. Wakefield (Eds.), 13th International Conference on New Interfaces for Musical Expression, NIME 2013, Daejeon, Republic of Korea, May 27–30, 2013 (pp. 213–216). nime.org.
Johnson, B., Murphy, J. W., & Kapur A. (2013). Designing gestural interfaces for live sound diffusion. In Proceedings of the 39th international computer music conference, ICMC 2013, Perth, Australia, August 12–16, 2013. Michigan Publishing.
Johnson, B., Norris, M., & Kapur, A. (2014a). Diffusing diffusion: A history of the technological advances in spatial performance. In International computer music conference proceedings 2014.
Johnson, B., Norris, M., & Kapur, A. (2014b). Tactile.motion: An iPad based performance interface for increased expressivity in diffusion performance. In Music technology meets philosophy—from digital echos to virtual ethos: Joint proceedings of the 40th international computer music conference, ICMC 2014, and the 11th sound and music computing conference, SMC 2014, Athens, Greece, September 14–20, 2014. Michigan Publishing.
Leitner, B. (1971). Sound architecture—space created through traveling sound. New York: ARTFORUM.
Leitner, B. (2016). Atelier Bernhard Leitner. In Amt der Niederösterreichischen Regierung. Bielefeld: Kerber Verlag.
Leslie, G., Zamborlin, B., Jodlowski, P., & Schnell, N. (2010). Grainstick: A collaborative, interactive sound installation.
Livingstone, D., & Miranda, E. (2005). Orb3: Adaptive interface design for real time sound synthesis and diffusion within socially mediated spaces (pp. 65–69). Singapore: National University of Singapore.
Lynch, H., & Sazdov, R. (2011). An ecologically valid experiment for the comparison of established spatial techniques. In International computer music conference proceedings 2011.
Malham, D. G. (1998). Approaches to spatialisation. Organised Sound, 3(2), 167–177.
Manning, P. (2013). Electronic and computer music. Oxford: Oxford University Press.
Marentakis, G., Peters, N., & McAdams, S. (2007). DJ spat: Spatialized interactions for DJs. In International computer music conference proceedings 2007.
Marshall, M. T., Malloch, J., & Wanderley, M. M. (2007). Gesture control of sound spatialization for live musical performance. In M. S. Dias, S. Gibet, M. M. Wanderley, & R. Bastos (Eds.), Gesture-based human-computer interaction and simulation. Lecture notes in computer science 5085 (pp. 227–38). Berlin: Springer.
Miranda, E., & Castet, J. (Eds.). (2014). Guide to brain-computer music interfacing. Berlin: Springer.
Miranda, E. R., & Wanderley, M. M. (2006). New digital musical instruments: Control and interaction beyond the keyboard (Vol. 21). AR Editions, Inc.
Mooney, J. (2005). Sound diffusion systems for the live performance of electroacoustic music. Sheffield: University of Sheffield.
Mooney, J. R., Moore, A., & Moore, D. (2004). M2 diffusion: The live diffusion of sound in space. In International computer music association.
Mulder, A. (2000). Towards a choice of gestural constraints for instrumental performers. In M. Wanderley & M. Battier (Eds.), Trends in gestural control of music (pp. 315–35).
Ness, S., Odowichuk, G., Driessen, P., Tzanetakis, G. (2011). Controlling Real time sound spatialization using the radiodrum. In International computer music conference proceedings 2011.
Oliveros, P. (1991). The expanded instrument system (EIS). In International Computer Music Conference Proceedings 1991.
Oliveros, P. (2008). The expanded instrument system (EIS): An introduction and brief history. Future of Creative Technologies, Journal of the Institute of Creative Technologies, no., 1, 21–24.
Overholt, D. (2011). Violin-related HCI: A taxonomy elicited by the musical interface technology design space. In A. L Brooks (Ed.), Arts and technology—second international conference, ArtsIT 2011 (pp. 80–89). Berlin: Springer.
Pachet, F., & Delerue, O. (1999). Music space: A constraint-based control system for music spatialization. In International Computer Music Conference Proceedings 1999.
Paradiso, J. A. (1997). Electronic music: New ways to play. IEEE Spectrum, 34(12), 18–30. doi:10.1109/6.642965.
Park, S., Ban, S., Hong, D. R., & Yeo, W. S. (2013). Sound surfing network (SSN): Mobile phone-based sound spatialization with audience collaboration. In Proceedings (pp. 111–14). Korea.
Perez-Lopez, A. (2015). 3DJ: A supercollider framework for real-time sound spatialization. In Proceedings. Graz, Austria.
Peters, N. (2011). Sweet [Re] production: Developing sound spatialization tools for musical applications with emphasis on sweet spot and off-center perception. McGill University.
Peters, N., Lossius, T., Schacher, J., Baltazar, P., Bascou, C., & Place, T. (2009). A stratified approach for sound spatialization (pp. 219–24). Citeseer.
Pressing, J. (1990). Cybernetic issues in interactive performance systems. Computer Music Journal, 14(1), 12–25.
Roads, C. (1996). The computer music tutorial. MIT Press.
Rzewski, F. (1968). A photoresistor mixer for live performance. Electronic Music Review, 4(4).
Sannicandro, V. (2014). Space and spatialization as discrete parameter in music composition: A space-oriented approach to écriture; from acoustic through informatics to musical notation. Berlin: epubli GmbH, Berlin.
Schacher, J. C. (2007). Gesture control of sounds in 3D space. In Proceedings of the 7th International Conference on New Interfaces for Musical Expression (pp. 358–362). NIME ’07. New York, NY, USA: ACM. doi:10.1145/1279740.1279819
Stockhausen, K. (1959). Musik Im Raum. Die Reihe, Berichte—Analyse, 5, 67–72.
Torre, G., Sazdov, R., & Konczewska, D. (2009). MOLITVA–Composition for Voice, Live Electronics, Pointing-At Glove Device and 3-D Setup of Speakers. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 330.
Torre, G. (2013). The design of a new musical glove: A live performance approach. Limerick: University of Limerick.
Valiquet, P. (2011). The spatialisation of stereophony: Taking positions in post-war electroacoustic music. In International computer music conference proceedings 2011.
Wanderley, M. M., & Orio, N. (2002). Evaluation of input devices for musical expression: Borrowing tools from Hci. Computer Music Journal, 26(3), 62–76.
Xenakis, I. (1992). Formalized music: Thought and mathematics in music, revised edition. Stuyvesant.
Zvonar, R. (2000). An extremely brief history of spatial music in the 20th century. Surround Professional Magazine.
© 2017 Springer Nature Singapore Pte Ltd.
Cite this chapter
Pysiewicz, A., Weinzierl, S. (2017). Instruments for Spatial Sound Control in Real Time Music Performances. A Review. In: Bovermann, T., de Campo, A., Egermann, H., Hardjowirogo, SI., Weinzierl, S. (eds) Musical Instruments in the 21st Century. Springer, Singapore. https://doi.org/10.1007/978-981-10-2951-6_18
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-2950-9
Online ISBN: 978-981-10-2951-6