Key Estimation in Electronic Dance Music

  • Conference paper
Advances in Information Retrieval (ECIR 2016)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9626)

Abstract

In this paper we study key estimation in electronic dance music, an umbrella term for a variety of electronic music subgenres intended for dancing at nightclubs and raves. We start by defining notions of tonality and key before outlining the basic architecture of a template-based key estimation method. We then report on the tonal characteristics of electronic dance music in order to infer possible modifications to the method described. We create new key profiles combining these observations with corpus analysis, and add two pre-processing stages to the basic algorithm. We conclude by comparing our profiles to existing ones and testing our modifications on independent datasets of pop and electronic dance music, observing interesting improvements in the performance of our algorithms and suggesting paths for future research.
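To make the template-based architecture concrete, the sketch below shows the core matching step in Python: a 12-bin pitch class profile (PCP) computed from the audio is compared, via cosine similarity, against the 24 rotations of a major and a minor key template, and the best-scoring rotation names the key. This is a minimal illustration, not the paper's implementation: the template values are the classic Krumhansl-Kessler profiles [13], standing in for the corpus-derived profiles the paper proposes, and the input chroma vector is assumed given.

```python
import numpy as np

# Krumhansl-Kessler key profiles [13]: perceived stability of each pitch
# class relative to the tonic, for major and minor contexts.
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                  2.54, 4.75, 3.98, 2.69, 3.34, 3.17])

PITCH_CLASSES = ['C', 'C#', 'D', 'Eb', 'E', 'F',
                 'F#', 'G', 'Ab', 'A', 'Bb', 'B']

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def estimate_key(chroma):
    """Return the best-matching key label for a 12-bin pitch class profile."""
    best_score, best_key = -1.0, None
    for tonic in range(12):
        for mode, profile in (('major', MAJOR), ('minor', MINOR)):
            # Rotate the template so its tonic lands on pitch class `tonic`.
            score = cosine(chroma, np.roll(profile, tonic))
            if score > best_score:
                best_score, best_key = score, f'{PITCH_CLASSES[tonic]} {mode}'
    return best_key

# Toy input: energy only on A, C and E (an A minor triad).
chroma = np.zeros(12)
chroma[[9, 0, 4]] = 1.0
print(estimate_key(chroma))  # -> A minor
```

In practice the PCP is accumulated over all analysis frames of a track before matching; note 7 below lists the analysis settings used in the reported experiments.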


Notes

  1. We take this term from Tagg [26] to refer to European Classical Music of the so-called common practice repertoire, on which most treatises on harmony are based.

  2. The Music Information Retrieval Evaluation eXchange (MIREX) is an international initiative to evaluate advances in Music Information Retrieval across different research centres by quantitatively comparing algorithm performance on test sets that are not available to participants beforehand.

  3. http://www.ibrahimshaath.co.uk/keyfinder/KeyFinderV2Dataset.pdf.

  4. http://blog.dubspot.com/dubspot-lab-report-mixed-in-key-vs-beatport
     http://www.djtechtools.com/2014/01/14/key-detection-software-comparison-2014-edition.

  5. https://pro.beatport.com/.

  6. http://essentia.upf.edu/.

  7. After informal testing, we decided to use the following settings in all the experiments reported: mix-down to mono; sampling rate: 44,100 Hz; window size: 4,096 samples (Hann); hop size: 16,384; frequency range: 25–3,500 Hz; PCP size: 36 bins; weighting window: 1 semitone; similarity: cosine distance. (A sketch applying these settings appears after these notes.)

  8. http://www.ibrahimshaath.co.uk/keyfinder/.

  9. http://isophonics.net/QMVampPlugins.
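For illustration, here is a hypothetical sketch of how the settings in note 7 could map onto Essentia's (note 6) Python API. The pipeline below is an assumed reconstruction, not the authors' actual code, and 'track.mp3' is a placeholder filename.

```python
import numpy as np
import essentia.standard as es

# Settings from note 7, mapped (by assumption) onto Essentia's algorithms.
audio = es.MonoLoader(filename='track.mp3', sampleRate=44100)()  # mono mix-down
window = es.Windowing(type='hann', size=4096)        # 4,096-sample Hann window
spectrum = es.Spectrum()
peaks = es.SpectralPeaks(minFrequency=25, maxFrequency=3500)
pcp = es.HPCP(size=36,                               # PCP size: 36 bins
              minFrequency=25, maxFrequency=3500,    # 25-3,500 Hz range
              windowSize=1.0)                        # 1-semitone weighting window

frames = [pcp(*peaks(spectrum(window(frame))))
          for frame in es.FrameGenerator(audio, frameSize=4096, hopSize=16384)]
global_pcp = np.mean(frames, axis=0)  # track-level profile for key matching
```

The resulting 36-bin profile would then presumably be folded to 12 bins (or the key templates interpolated to 36) before the cosine comparison sketched after the abstract.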

References

  1. Bogdanov, D., Wack, N., Gómez, E., Gulati, S., Herrera, P., Mayor, O.: ESSENTIA: an open-source library for sound and music analysis. In: Proceedings of the 21st ACM International Conference on Multimedia, pp. 855–858 (2013)

  2. Cannam, C., Mauch, M., Davies, M.: MIREX 2013 Entry: Vamp plugins from the Centre For Digital Music (2013). www.music-ir.org

  3. Everett, W.: Making sense of rock’s tonal systems. Music Theory Online 10(4) (2004)

  4. Dayal, G., Ferrigno, E.: Electronic Dance Music. Grove Music Online. Oxford University Press, Oxford (2012)

  5. Dressler, K., Streich, S.: Tuning frequency estimation using circular statistics. In: Proceedings of the 8th ISMIR, pp. 2–5 (2007)

  6. Gómez, E.: Tonal description of polyphonic audio for music content processing. INFORMS J. Comput. 18(3), 294–304 (2006)

  7. Gómez, E.: Tonal description of music audio signals. Ph.D. thesis, Universitat Pompeu Fabra, Barcelona (2006)

  8. Harte, C.: Towards automatic extraction of harmony information from music signals. Ph.D. thesis, Queen Mary University of London (2010)

  9. Hyer, B.: Tonality. Grove Music Online. Oxford University Press, Oxford (2012)

  10. James, R.: My life would suck without you / Where have you been all my life: Tension-and-release structures in tonal rock and non-tonal EDM pop. www.its-her-factory.com/2012/07/my-life-would-suck-without-youwhere-have-you-been-all-my-life-tension-and-release-structures-in-tonal-rock-and-non-tonal-edm-pop. Accessed 16 December 2014

  11. Klapuri, A.: Multipitch analysis of polyphonic music and speech signals using an auditory model. IEEE Trans. Audio Speech Lang. Process. 16(2), 255–266 (2008)

  12. Knees, P., Faraldo, Á., Herrera, P., Vogl, R., Böck, S., Hörschläger, F., Le Goff, M.: Two data sets for tempo estimation and key detection in electronic dance music annotated from user corrections. In: Proceedings of the 16th ISMIR (2015)

  13. Krumhansl, C.L.: Cognitive Foundations of Musical Pitch. Oxford University Press, New York (1990)

  14. Mauch, M., Dixon, S.: Approximate note transcription for the improved identification of difficult chords. In: Proceedings of the 11th ISMIR, pp. 135–140 (2010)

  15. Mauch, M., Cannam, C., Davies, M., Dixon, S., Harte, C., Kolozali, S., Tidhar, D.: OMRAS2 metadata project 2009. In: Proceedings of the 10th ISMIR, Late-Breaking Session (2009)

  16. Moore, A.: The so-called “flattened seventh” in rock. Pop. Music 14(2), 185–201 (1995)

  17. Müller, M., Ewert, S.: Towards timbre-invariant audio features for harmony-based music. IEEE Trans. Audio Speech Lang. Process. 18(3), 649–662 (2010)

  18. Noland, K.: Computational tonality estimation: signal processing and hidden Markov models. Ph.D. thesis, Queen Mary University of London (2009)

  19. Noland, K., Sandler, M.: Signal processing parameters for tonality estimation. In: Proceedings of the 122nd Convention of the Audio Engineering Society (2007)

  20. Pollack, A.W.: Notes on... series. www.icce.rug.nl/soundscapes/DATABASES/AWP/awp-notes_on.shtml. Accessed 1 February 2015

  21. Saslaw, J.: Modulation (i). Grove Music Online. Oxford University Press, Oxford (2012)

  22. Schellenberg, E.G., von Scheve, C.: Emotional cues in American popular music: five decades of the Top 40. Psychol. Aesthetics Creativity Arts 6(3), 196–203 (2012)

  23. Sha’ath, I.: Estimation of key in digital music recordings. Department of Computer Science and Information Systems, Birkbeck College, University of London (2011)

  24. Spicer, M.: (Ac)cumulative form in pop-rock music. Twentieth Century Music 1(1), 29–64 (2004)

  25. Tagg, P.: From refrain to rave: the decline of figure and the rise of ground. Pop. Music 13(2), 209–222 (1994)

  26. Tagg, P.: Everyday tonality II (Towards a tonal theory of what most people hear). The Mass Media Music Scholars’ Press, New York and Huddersfield (2014)

  27. Temperley, D.: What’s key for key? The Krumhansl-Schmuckler key-finding algorithm reconsidered. Music Percept. Interdiscip. J. 17(1), 65–100 (1999)

  28. Röbel, A., Rodet, X.: Efficient spectral envelope estimation and its application to pitch shifting and envelope preservation. In: Proceedings of the 8th DAFX (2005)

  29. Zhu, Y., Kankanhalli, M.S., Gao, S.: Music key detection for musical audio. In: Proceedings of the 11th IMMC, pp. 30–37 (2005)

Author information

Corresponding author

Correspondence to Ángel Faraldo.


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Faraldo, Á., Gómez, E., Jordà, S., Herrera, P. (2016). Key Estimation in Electronic Dance Music. In: Ferro, N., et al. (eds.) Advances in Information Retrieval. ECIR 2016. Lecture Notes in Computer Science, vol. 9626. Springer, Cham. https://doi.org/10.1007/978-3-319-30671-1_25

  • DOI: https://doi.org/10.1007/978-3-319-30671-1_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-30670-4

  • Online ISBN: 978-3-319-30671-1

