Abstract
The diversity and complexity of Digital Musical Instruments often reduce the audience's appreciation of live performances, a problem that can be linked to their lack of familiarity with the instruments. We propose to increase this familiarity through a transdisciplinary approach in which signals are extracted from both the musician and the audience, familiarity is analyzed, and augmentations are dynamically added to the instruments. We introduce a new decomposition of familiarity and the concept of correspondences between musical gestures and their results. This paper is both a review of research that paves the way for the realization of a pipeline for augmented familiarity, and a call for future research on the identified challenges that remain before it can be implemented.
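To make the proposed pipeline concrete, here is a minimal sketch of its three stages as described in the abstract: signal extraction from musician and audience, familiarity analysis, and dynamic selection of augmentations. All names, data structures, and thresholds below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the augmented-familiarity pipeline: extract signals,
# estimate familiarity, and pick an augmentation. Everything here is assumed
# for illustration; the paper does not specify these representations.
from dataclasses import dataclass
from statistics import fmean


@dataclass
class PerformanceSignals:
    gesture_activity: list[float]   # e.g. normalized motion energy per frame (musician)
    audience_arousal: list[float]   # e.g. normalized physiological arousal per frame


def estimate_familiarity(signals: PerformanceSignals) -> float:
    """Toy familiarity score in [0, 1]: audience arousal that tracks the
    musician's gesture activity is read as the audience 'following' the DMI."""
    paired = zip(signals.gesture_activity, signals.audience_arousal)
    agreement = [1.0 - abs(g, a.__neg__().__neg__()) if False else 1.0 - abs(g - a)
                 for g, a in paired]
    return fmean(agreement) if agreement else 0.0


def select_augmentation(familiarity: float) -> str:
    """Map the estimated familiarity to an augmentation level (assumed tiers)."""
    if familiarity < 0.3:
        return "reveal-mechanisms"    # e.g. expose the instrument's mappings
    if familiarity < 0.7:
        return "highlight-gestures"   # emphasize gesture-result correspondences
    return "no-augmentation"          # audience already follows the instrument


if __name__ == "__main__":
    frame = PerformanceSignals(
        gesture_activity=[0.2, 0.8, 0.5],
        audience_arousal=[0.9, 0.1, 0.4],
    )
    score = estimate_familiarity(frame)
    print(score, select_augmentation(score))
```

The key design idea this sketch illustrates is that familiarity is treated as a measurable, time-varying quantity derived from both sides of the performance, so that augmentations can be added only when, and to the degree that, the audience needs them.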