
2002: The Importance of Parameter Mapping in Electronic Instrument Design

A NIME Reader

Part of the book series: Current Research in Systematic Musicology ((CRSM,volume 3))

Abstract

In this paper we challenge the assumption that an electronic instrument consists solely of an interface and a sound generator. We emphasise the importance of the mapping between input parameters and system parameters, and claim that this can define the very essence of an instrument.


References

  • Arfib, D. (2002). Personal communication.

  • Bongers, B. (2000). Physical interfaces in the electronic arts. Interaction theory and interfacing techniques for real-time performance. In M. Wanderley & M. Battier (Eds.), Trends in gestural control of music. Paris: IRCAM—Centre Pompidou.

  • Buxton, W. (1986). There's more interaction than meets the eye: Some issues in manual input. In D. Norman & S. W. Draper (Eds.), User centered system design: New perspectives on human-computer interaction (pp. 319–337). Hillsdale, NJ: Lawrence Erlbaum Associates.

  • Chadabe, J. (2002). The limitations of mapping as a structural descriptive in electronic instruments. In Proceedings of the International Conference on New Interfaces for Musical Expression, Dublin, Ireland.

  • Garnett, G., & Goudeseune, C. (1999). Performance factors in control of high-dimensional spaces. In Proceedings of the 1999 International Computer Music Conference (pp. 268–271), San Francisco, CA.

  • Hunt, A. (1999). Radical user interfaces for real-time musical control. PhD thesis, University of York, UK.

  • Hunt, A., & Kirk, R. (2000). Mapping strategies for musical performance. In M. Wanderley & M. Battier (Eds.), Trends in gestural control of music. Paris: IRCAM—Centre Pompidou.

  • Hunt, A., Wanderley, M. M., & Kirk, R. (2000). Towards a model for instrumental mapping in expert musical interaction. In Proceedings of the International Computer Music Conference (pp. 209–212), San Francisco, CA.

  • Métois, E. (1996). Musical sound information: Musical gestures and embedding systems. PhD thesis, MIT Media Lab.

  • Miranda, E. R., & Wanderley, M. M. (2006). New digital musical instruments: Control and interaction beyond the keyboard. Middleton, WI: A-R Editions.

  • Mulder, A., & Fels, S. (1998). Sound sculpting: Manipulating sound through virtual sculpting. In Proceedings of the 1998 Western Computer Graphics Symposium (pp. 15–23).

  • Mulder, A., Fels, S., & Mase, K. (1997). Empty-handed gesture analysis in Max/FTS. In A. Camurri (Ed.), Kansei, The Technology of Emotion: Proceedings of the AIMI International Workshop (pp. 87–91). Genoa: Associazione di Informatica Musicale Italiana.

  • Rovan, J. B., Wanderley, M. M., Dubnov, S., & Depalle, P. (1997). Instrumental gestural mapping strategies as expressivity determinants in computer music performance. In A. Camurri (Ed.), Kansei, The Technology of Emotion: Proceedings of the AIMI International Workshop (pp. 68–73). Genoa: Associazione di Informatica Musicale Italiana.

  • Van Nort, D. (2010). Modular and adaptive control of sound processing. PhD thesis, McGill University, Montreal, QC, Canada.

  • Wanderley, M. M. (2001). Performer-instrument interaction: Application to gestural control of sound synthesis. PhD thesis, University Paris VI, France.

  • Wanderley, M. M. (Ed.). (2002). Mapping strategies for real-time computer music [Special issue]. Organised Sound, 7.

  • Wanderley, M. M., & Battier, M. (Eds.). (2000). Trends in gestural control of music. Paris: IRCAM—Centre Pompidou.

  • Wanderley, M. M., & Depalle, P. (1999). Contrôle gestuel de la synthèse sonore. In H. Vinet & F. Delalande (Eds.), Interfaces homme–machine et création musicale (pp. 145–163). Hermes Science Publishing.

  • Wanderley, M. M., & Malloch, J. (2014). Advances in the design of mapping for computer music. Computer Music Journal, 38(3), 4–5.

  • Wanderley, M. M., Schnell, N., & Rovan, J. B. (1998). Modeling and performing 'composed instruments' in real-time. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (pp. 1080–1084), San Diego, CA.

  • Wessel, D. (1979). Timbre space as a musical control structure. Computer Music Journal, 3(2), 45–52.

Correspondence to Andy Hunt.

Appendices

Author Commentary: Reflections on Parameter Mapping in Electronic Instrument Design

Andy Hunt and Marcelo M. Wanderley

In the 80s and early 90s, many papers were published on the development of digital musical instruments (DMIs) and interfaces for musical expression, mostly at conferences such as the International Computer Music Conference (ICMC) and in the Computer Music Journal, both starting in the mid-to-late 70s. A number of amazing instruments and methods were proposed, many of them subsequently summarized in the book "New Digital Musical Instruments: Control and Interaction Beyond the Keyboard" (Miranda and Wanderley 2006). These papers mostly presented novel interfaces and/or sound synthesis methods for musical control, but few specifically addressed how to map interface outputs to synthesis inputs. One-to-one arbitrary mappings were the norm, but notable exceptions included the works of Ian Bowler et al. (ICMC 1990), Michael Lee and David Wessel (ICMC 1992), Insook Choi et al. (ICMC 1995), Stuart Favilla (ICMC 1996) and Axel Mulder et al. (Kansei Workshop 1997).

This NIME 2002 paper arose from a collaboration between Andy Hunt and Marcelo Wanderley (with Matthew Paradis, a D.Phil. student of Hunt's) in the late 90s and early 2000s. Both researchers were among the first to study and formalize the role of mapping in digital musical instruments. In this paper, the questions we tackle are: What happens if more complex strategies are used instead of one-to-one mappings, as in many acoustic musical instruments? How can one make sense of the various mapping possibilities in DMIs (e.g. implicit, explicit)? What is the influence of the choice of mapping on DMI design and performance?
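The first of these questions can be made concrete with a small sketch. The example below is purely illustrative (the two controls, the parameter names and the coefficients are hypothetical, not taken from the paper): it contrasts a one-to-one assignment with a cross-coupled, many-to-many mapping of the kind found in acoustic instruments, where one physical action affects several sonic dimensions at once.

```python
# Hypothetical two-control, two-parameter instrument.
# Controller outputs (e.g. slider positions) are in [0, 1].

def one_to_one(slider_a, slider_b):
    """Each control drives exactly one synthesis parameter."""
    amplitude = slider_a
    brightness = slider_b
    return amplitude, brightness

def cross_coupled(slider_a, slider_b):
    """Many-to-many: both controls influence both parameters,
    loosely analogous to how breath pressure on a clarinet
    affects loudness and timbre together."""
    amplitude = 0.7 * slider_a + 0.3 * slider_b
    brightness = slider_a * slider_b  # brightness needs both raised
    return amplitude, brightness

print(one_to_one(1.0, 0.0))     # slider A alone gives full amplitude
print(cross_coupled(1.0, 0.0))  # slider A alone gives neither fully
```

The interface and the synthesis engine are identical in both cases; only the mapping differs, which is exactly why the paper argues that mapping can define "the very essence of an instrument."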

Wanderley started to look at the importance of mapping in DMI design in 1997, when, in collaboration with Butch Rovan, Shlomo Dubnov and Philippe Depalle, he presented a paper at the International Workshop on "Kansei, The Technology of Emotion," organized by Antonio Camurri in Genoa, Italy. In 1998, in collaboration with Norbert Schnell and Butch Rovan, he published a follow-up paper on mapping strategies in the software application "Escher," developed at IRCAM. In this paper, the notion of mapping layers was proposed to formalize and articulate the definition of mapping strategies.
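The idea of mapping layers can be sketched as two composed functions: device signals first map to abstract, performer-meaningful parameters, and a second layer maps those onto synthesis inputs. The sketch below is a hypothetical illustration of the concept only (the signal names, coefficients and parameter ranges are invented; the actual Escher implementation differed):

```python
def device_to_abstract(breath, lip_pressure):
    """Layer 1: device signals (in [0, 1]) -> abstract parameters."""
    energy = breath                               # overall driving effort
    excitation = 0.5 * breath + 0.5 * lip_pressure
    return {"energy": energy, "excitation": excitation}

def abstract_to_synth(abstract):
    """Layer 2: abstract parameters -> synthesis engine inputs."""
    return {
        "gain": abstract["energy"],
        "filter_cutoff_hz": 200.0 + 4000.0 * abstract["excitation"],
    }

def mapping(breath, lip_pressure):
    """The full mapping is the composition of the two layers."""
    return abstract_to_synth(device_to_abstract(breath, lip_pressure))

print(mapping(0.5, 0.5))
```

One design payoff of the layered formulation: swapping the input device only requires rewriting layer 1, while layer 2, which carries much of the instrument's identity, stays unchanged.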

At the same time, Hunt was working on his D.Phil. thesis, also on mapping, under the supervision of Ross Kirk at the University of York, UK. His work was the first (and possibly still the only one) to examine long-term performance with different mapping strategies. Apart from this major contribution, the main difference between the two lines of work was that, while Wanderley et al. were inspired by a known mapping model (that of a clarinet), Hunt and Kirk dealt with arbitrary mapping choices (i.e. with no known models on which to base the mapping).

In 1999, through the CUIDAD project, organized by IRCAM with the participation of the University of York, the two had the opportunity to join their respective work on mapping. A first result was a paper published at ICMC 2000 in Berlin (Hunt et al. 2000). They continued to work on this topic through several more papers, including the NIME 2002 paper and two articles directly derived from it, one for Organised Sound and one for the Journal of New Music Research. Altogether, these papers have been cited more than 500 times (Google Scholar, September 28, 2015).

One interesting issue arose at the conference, held at the former Media Lab Europe in Dublin, Ireland. One of the keynote speakers was interactive music pioneer Joel Chadabe, who gave a talk titled "The Limitations of Mapping as a Structural Descriptive in Electronic Instruments," an interesting contrast to the title we proposed ("The Importance of Parameter Mapping in Electronic Instrument Design"). The contrast is mostly due to differing notions of an "electronic instrument": a reactive device in Hunt and Wanderley's view, but an "interactive system" in Chadabe's (see Bert Bongers' definitions of interactive and reactive systems (Bongers 2000)).

Since then, Wanderley has edited or co-edited two special journal issues on mapping in DMIs (Organised Sound, vol. 7, 2002, and more recently the Computer Music Journal, vol. 38, 2014, with Joseph Malloch) and published several more works focusing on mapping. Hunt has helped create, and made several contributions to, the field of "interactive sonification," mostly in collaboration with Thomas Hermann (University of Bielefeld).

While much has been done since the paper was first written, mapping is still an open field of research, mostly in terms of tools to help define mapping strategies in DMIs (see Wanderley and Malloch (2014) for updated information). Work is still needed to understand the implications of mapping choices in short and long-term performance with DMIs.

Expert Commentary: Listen to the Inner Complexities

Doug Van Nort

Fairly early in the development of interactive computer music, people had already been thinking about user manipulation of all facets of digital sound, with a major early voice in this area being David Wessel through his idea of timbre space as a musical control structure (Wessel 1979). By the late 90s and early 2000s, ever-increasing processing speeds had allowed for high-quality sound synthesis to run in real-time. This led to an explosion in research on input devices and sensor technologies for expressive control of these various sound creation algorithms, moving beyond the limiting standard of the MIDI keyboard.

As research raced forward in the areas of real-time sound synthesis and sensor-based controllers, a group of researchers rightly recognized that the essential algorithmic decisions that associate control input with sound output (the area of mapping) were being overlooked. They looked for inspiration to the more mature field of Human-Computer Interaction (HCI), and to the work of researchers such as William Buxton, in order to understand more fully the ergonomic and psychological influence that simply changing a mapping can have on a user, or in this case a performer.

The article by Hunt, Wanderley and Paradis is particularly important as it represents the coming together of two early research streams. Hunt’s and Wanderley’s respective dissertation projects and subsequent research are two of the earliest to tackle this area of inquiry, as is well represented in the “Trends in Gestural Control of Music” collection (Wanderley and Battier 2000) as well as a special issue of Organised Sound (Wanderley 2002).

This NIME 2002 article serves two key purposes: the first is that it gives an important brief introduction to these preceding projects; the second is that it attempts to establish and articulate the importance of mapping by exploring it formally through quantitative and qualitative user studies. While one could level a fair criticism by noting that the studies presented lack the depth and methodological rigor of many found within the world of HCI, the fact remains that this work opened up the question of how the ostensibly simple act of control-to-sound parameter association could profoundly impact the "feel" of a digital musical instrument. Various works have followed up on this early research, including my own dissertation (Van Nort 2010), which was largely devoted to expanding this conversation on the topic of mapping.

Certainly not all members of the NIME community have agreed with the approach of the 2002 Hunt et al. work: some maintain that mapping is purely an artistic decision of the composer/designer, while others maintain the spirit of this article, that it is an essential determinant of expressivity for any performer who comes into contact with a given digital musical instrument. My personal view is that, as with most things, the truth lies somewhere in between: both are correct, depending on the musical intent. If we are talking about a complex interactive system that a performer is simply guiding, then perhaps mapping is even a limiting concept, as proposed by Chadabe (2002). However, if we are talking about the continuous shaping of sonic activity, with a goal of achieving nuanced control and expression that rivals acoustic instrumental practice, then this question of "what to map where," and how various control signals are extracted, gated, associated, adapted, conditioned, etc., can have a profound impact on a digital performer's ability to keep up with their analog musical vision.

This article has played no small role in opening up this conversation in the context of the NIME community. As we move forward with ever more sophisticated techniques of machine learning at our disposal, I hope that future NIME creators will remember the profound influence that can be wrought by a subtle change in transfer function, interpolation method, or cross-coupling of parameters, and that one should take time to listen to the inner complexities of their system’s behaviour with their musical ears, mind and body. Further to this point, I strongly suggest that freezing the instrumental system’s state and deeply engaging with performance in a variety of improvised contexts will present the true challenges and roadblocks to one’s performative voice. It may be that a new method of gestural sensing or sonic output is required, but do not overlook the gesturality and dynamism inherent in the small choices that lie in between.
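The kind of "subtle change in transfer function" mentioned above can be as small as swapping a linear curve for an exponential one between a fader and a gain parameter. The sketch below is a hypothetical illustration (the curve shape and `curve` constant are invented for the example): the interface and the synthesis parameter are identical, yet the feel of the control changes completely.

```python
import math

def linear(x):
    """Identity transfer function: fader position maps directly to gain."""
    return x

def exponential(x, curve=4.0):
    """Exponential transfer function, normalised to map [0, 1] -> [0, 1].
    It compresses the lower range, giving fine control at low gains."""
    return (math.exp(curve * x) - 1.0) / (math.exp(curve) - 1.0)

# The same fader positions yield very different gains under each curve.
for x in (0.25, 0.5, 0.75):
    print(f"fader {x:.2f}: linear {linear(x):.3f}, exp {exponential(x):.3f}")
```

Playing both versions side by side, rather than inspecting the numbers, is precisely the kind of "listening to the inner complexities" the commentary advocates.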


Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Hunt, A., Wanderley, M.M., Paradis, M. (2017). 2002: The Importance of Parameter Mapping in Electronic Instrument Design. In: Jensenius, A., Lyons, M. (eds) A NIME Reader. Current Research in Systematic Musicology, vol 3. Springer, Cham. https://doi.org/10.1007/978-3-319-47214-0_3

  • DOI: https://doi.org/10.1007/978-3-319-47214-0_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-47213-3

  • Online ISBN: 978-3-319-47214-0

