Piano Technique as a Case Study in Expressive Gestural Interaction

  • Andrew P. McPherson
  • Youngmoo E. Kim
Chapter
Part of the Springer Series on Cultural Computing book series (SSCC)

Abstract

There is a longstanding disconnect between mechanical models of the piano, in which key velocity is the sole determinant of each note’s sound, and the subjective experience of trained pianists, who take a nuanced, multidimensional approach to physical gestures at the keyboard (commonly known as “touch”). We seek to peel back the abstraction of the key press as a discrete event, developing models of key touch that link qualitative musical intention to quantitative key motion. The interaction between performer and instrument (whether acoustic or electronic) can be considered a special case of human-machine interaction, and one that takes place on very different terms from ordinary human-computer interaction: a player’s physical gestures are often the result of intuitive, subconscious processes. Our proposed models will therefore aid the development of computer interfaces that connect with human users on an intuitive, expressive level, with applications within and beyond the musical domain.
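To make the contrast between velocity-only models and multidimensional touch concrete, the sketch below extracts a few candidate descriptors from a continuous key-position trace. This is an illustrative assumption on our part, not the authors' implementation: the feature names (peak_velocity, percussiveness, max_depth), the normalization of key displacement to the range 0–1, and the sensor sample rate are hypothetical choices made for the example.

import numpy as np

def key_press_features(position, fs=1000.0):
    """Extract simple touch features from a continuous key-position trace.

    position: 1-D array of key displacement samples, normalized so that
              0.0 = key at rest and 1.0 = key fully depressed (assumed scale).
    fs:       sample rate of the position sensor in Hz (assumed value).
    """
    position = np.asarray(position, dtype=float)
    velocity = np.gradient(position) * fs  # key velocity in key-depths per second

    # Classic MIDI-style descriptor: peak downward velocity during the press.
    peak_velocity = velocity.max()

    # Percussive ("struck") touches show a velocity spike early in the stroke,
    # whereas "pressed" touches accelerate more gradually toward the key bed.
    early = velocity[: max(1, len(velocity) // 4)]
    percussiveness = early.max() / (peak_velocity + 1e-9)

    # How far the key actually travels (partial presses never reach 1.0).
    max_depth = position.max()

    return {
        "peak_velocity": peak_velocity,
        "percussiveness": percussiveness,
        "max_depth": max_depth,
    }

Under these assumptions, a struck touch would yield a percussiveness ratio near 1, while a gently pressed touch, whose peak velocity occurs later in the stroke, would yield a lower value; a velocity-only model would treat both presses identically whenever their peak velocities match.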

Keywords

Physical gesture; physical execution; piano performance; expressive intent; piano keyboard

Acknowledgments

This material is based upon work supported by the US National Science Foundation under Grant #0937060 to the Computing Research Association for the CIFellows Project.


Copyright information

© Springer-Verlag London 2013

Authors and Affiliations

  1. Centre for Digital Music, School of Electronic Engineering and Computer Science, Queen Mary, University of London, London, UK
  2. Music Entertainment Technology Laboratory, Department of Electrical and Computer Engineering, Drexel University, Philadelphia, USA