Filter-Based Approach for Ornamentation Detection and Recognition in Singing Folk Music
Ornamentations in music play a significant role in the emotion that a performer or composer aims to convey. The automated identification of ornamentations enhances the understanding of music and can serve as a feature for tasks such as performer identification or mood classification. Existing methods rely on a pre-processing step that performs note segmentation. We propose an alternative method that adapts the existing two-dimensional COSFIRE filter approach to one dimension (1D) for the automatic identification of ornamentations in monophonic folk songs. We construct a set of 1D COSFIRE filters that are selective for the 12 notes of Western music theory. The response of a 1D COSFIRE filter is computed as the geometric mean of the differences between the fundamental frequency values in a local neighbourhood and the preferred values at the corresponding positions. We apply the proposed 1D COSFIRE filters to the pitch track of a song at every position along the entire signal, which yields response values in the range [0, 1]. The proposed 1D COSFIRE filters are effective in recognizing meaningful musical information that can be transformed into symbolic representations and used for further analysis. We demonstrate the effectiveness of the proposed methodology on a new data set that we introduce, which comprises five monophonic Cypriot folk tunes containing 428 ornamentations. The proposed method is effective for the detection and recognition of ornamentations in singing folk music.
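The abstract does not specify the exact scoring function, so the following is only a minimal sketch of how such a 1D COSFIRE response could be computed. It assumes a Gaussian similarity between the observed fundamental frequency and the filter's preferred value at each position of the local neighbourhood, combined by a geometric mean so that the response stays in [0, 1]; the function name, the `sigma` parameter, and the Gaussian choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cosfire_1d_response(f0_track, preferred, sigma=1.0):
    """Sketch (assumed implementation): slide a 1D COSFIRE filter
    over a pitch track and return one response per position.

    f0_track  : 1D array of fundamental frequency values (e.g. MIDI numbers)
    preferred : 1D array of the filter's preferred f0 values, one per
                offset of the local neighbourhood
    sigma     : tolerance of the (assumed) Gaussian similarity
    """
    f0_track = np.asarray(f0_track, dtype=float)
    preferred = np.asarray(preferred, dtype=float)
    n = len(preferred)
    responses = np.zeros(len(f0_track) - n + 1)
    for t in range(len(responses)):
        window = f0_track[t:t + n]
        # Gaussian similarity: 1 at an exact match with the preferred
        # value, decaying towards 0 as the observed f0 deviates.
        sims = np.exp(-((window - preferred) ** 2) / (2.0 * sigma ** 2))
        # Geometric mean of the per-position similarities keeps the
        # combined response in [0, 1] and requires all positions to match.
        responses[t] = sims.prod() ** (1.0 / n)
    return responses
```

With this formulation a filter responds with 1.0 only where the pitch track exactly matches its preferred note pattern, and the geometric mean suppresses positions where even a single neighbourhood value deviates strongly, which matches the selectivity described in the abstract.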
Keywords: Signal processing · Folk music analysis · Computational ethnomusicology · Performer classification · Mood classification · Ornamentation detection · Ornamentation recognition · COSFIRE