Autocorrelation of Pitch-Event Vectors in Meter Finding

  • Christopher Wm. White
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11502)

Abstract

Computational researchers often use autocorrelation techniques to identify the meter of a musical passage, tracking the ebbs and flows of loudness or, if using symbolic data, the peaks and valleys of note attacks. This paper investigates how successfully various harmonic and pitch events identify musical meter under autocorrelation, compared to a note-attack model. The study implements such a process using several different parameters: note attacks, pitch-class change, set-class probabilities, and scale-degree-set probabilities. The outputs are measured against a ground truth derived from each piece's notated time signature, and the relative success of each parameter is tracked using F scores. While the loudness-oriented parameters prove more successful overall, the paper discusses how its findings add to our understanding of musical meter and of the role pitch parameters play in metric accents.
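The autocorrelation approach the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function names, the sampling grid, and the toy 3/4 pattern (accented downbeats carrying more simultaneous note attacks) are all assumptions made for the example.

```python
import numpy as np

def autocorr(v):
    """Normalized autocorrelation of an event vector at all non-negative lags."""
    v = np.asarray(v, dtype=float)
    v = v - v.mean()                      # center so flat vectors give no peaks
    full = np.correlate(v, v, mode="full")
    ac = full[len(v) - 1:]                # keep lags 0, 1, 2, ...
    return ac / ac[0]                     # normalize so lag 0 equals 1.0

def best_period(v, min_lag, max_lag):
    """Return the lag (in grid units) with the strongest autocorrelation,
    searched over a plausible range of measure lengths."""
    ac = autocorr(v)
    lags = np.arange(min_lag, max_lag + 1)
    return int(lags[np.argmax(ac[min_lag:max_lag + 1])])

# Toy example: eight measures of a 3/4 pattern, one grid slot per beat,
# where downbeats carry three note attacks and other beats carry one.
attacks = np.tile([3.0, 1.0, 1.0], 8)
print(best_period(attacks, min_lag=2, max_lag=4))   # strongest period: 3 beats
```

A real system would use a finer grid than one slot per beat and would substitute the abstract's other parameters (pitch-class change, set-class probability, scale-degree-set probability) for the attack counts, but the peak-picking step is the same.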

Keywords

Computation · Corpus analysis · Modeling · Meter

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. The University of Massachusetts Amherst, Amherst, USA