
Intensity Shaping in Sustained Notes Encodes Metrical Cues for Synchronization in Ensemble Performance

  • Conference paper
Sound, Music, and Motion (CMMR 2013)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 8905)


Abstract

We investigate the use of musical prosody as a coordination strategy in ensemble performance, focussing on the metrically ambiguous case of long sustained notes, where little rhythmic information can be obtained from the sparse note onsets. Using cluster analysis of the amplitude power curves of long notes in recordings of a violin-cello duo, we examine the use of varying intensity to communicate timing information over the duration of these sustained notes. The elbow method provides the optimal number of clusters, and we present the common intensity shapes employed by the violinist and by the cellist. Analysis of peaks in the intensity curves uncovers correspondences between peak positions and natural subdivisions of musical time: performers tend to use consistent curve shapes that peak at metrical subdivisions within notes, namely, at around the 0.25 and 0.75 points of the note event. We hypothesize that the 0.75 point intensity peak functions as an upbeat indicator and the 0.25 point peak serves to propel the note forward. We surmise that knowledge of the placements of these intensity peaks may be useful as auditory cues for marking musical time in held notes, and discuss how this knowledge might be exploited in anticipatory accompaniment systems and audio-score alignment.
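The analysis pipeline summarised above (resampling each sustained note's intensity curve to a common length, clustering the curves with k-means while choosing the number of clusters via the elbow method, and reading peak locations off as fractions of note duration) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the choice of Python with NumPy, SciPy, and scikit-learn, the function names, and the synthetic per-note curves are assumptions made here for concreteness.

```python
# Minimal sketch (not the authors' implementation): cluster fixed-length,
# per-note intensity curves with k-means, inspect the elbow of the
# within-cluster sum of squares to pick k, and report peak locations as
# fractions of note duration. The synthetic curves below are placeholders.
import numpy as np
from scipy.signal import find_peaks
from sklearn.cluster import KMeans


def resample_curve(curve, n_points=100):
    """Resample one note's intensity curve onto n_points samples spanning
    the unit interval, so notes of different durations become comparable."""
    x_old = np.linspace(0.0, 1.0, len(curve))
    x_new = np.linspace(0.0, 1.0, n_points)
    return np.interp(x_new, x_old, curve)


def elbow_inertias(curves, k_range=range(1, 9)):
    """Within-cluster sum of squares for each candidate k; the 'elbow'
    (point of diminishing returns) suggests the number of clusters."""
    X = np.vstack(curves)
    return [KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
            for k in k_range]


def peak_positions(curve, prominence=0.05):
    """Peak locations expressed as fractions of the note duration
    (0.0 = note onset, 1.0 = note offset)."""
    peaks, _ = find_peaks(curve, prominence=prominence)
    return peaks / (len(curve) - 1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-ins for per-note intensity curves of varying length.
    raw_notes = [np.hanning(int(n)) + 0.02 * rng.standard_normal(int(n))
                 for n in rng.integers(80, 300, size=40)]
    curves = [resample_curve(c) for c in raw_notes]
    print("elbow inertias:", np.round(elbow_inertias(curves), 2))
    print("peak fractions, first note:", np.round(peak_positions(curves[0]), 2))
```

Normalising every note onto the unit interval is what makes peak positions directly comparable with metrical fractions such as 0.25 and 0.75, regardless of a note's absolute duration.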



Acknowledgments

The authors thank Kathleen Agres and Laurel Pardue for their participation in the line of sight ensemble interaction experiments. This research was funded in part by the Engineering and Physical Sciences Research Council (EPSRC).

Author information

Corresponding author

Correspondence to Bogdan Vera.


Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Vera, B., Chew, E. (2014). Intensity Shaping in Sustained Notes Encodes Metrical Cues for Synchronization in Ensemble Performance. In: Aramaki, M., Derrien, O., Kronland-Martinet, R., Ystad, S. (eds.) Sound, Music, and Motion. CMMR 2013. Lecture Notes in Computer Science, vol. 8905. Springer, Cham. https://doi.org/10.1007/978-3-319-12976-1_12

  • DOI: https://doi.org/10.1007/978-3-319-12976-1_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12975-4

  • Online ISBN: 978-3-319-12976-1

  • eBook Packages: Computer Science, Computer Science (R0)
