
Learning to Make Feelings: Expressive Performance as a Part of a Machine Learning Tool for Sound-Based Emotion Control

  • Conference paper
From Sounds to Music and Emotions (CMMR 2012)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7900)


Abstract

We propose to significantly extend our work on EEG-based emotion detection for automated expressive performance of algorithmically composed music, aimed at affective communication and induction. The new system composes and expressively performs music in real time to induce specific affective states, guided by the detection of the affective state of a human listener. Machine learning algorithms will learn: (1) how to use biosensors such as EEG to detect the user’s current emotional state; and (2) how to use algorithmic performance and composition to induce particular trajectories through affective states. In other words, the system will attempt to adapt so that it can, in real time, move a user from depressed to happy, from stressed to relaxed, or (if they like horror movies!) from relaxed to fearful. Expressive performance is key to this process, as it has been shown to increase the emotional impact of affectively driven algorithmic composition: if a piece is composed by computer rules to communicate happiness, applying expressive performance rules to humanize the piece increases the likelihood that it is perceived as happy. In addition to a project overview, we present a first step of this research: a machine learning system using case-based reasoning that attempts to learn from a user how themes of different affective types combine sequentially to communicate emotions.
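The abstract only outlines the case-based reasoning component, so the sketch below is a hypothetical illustration of the general idea rather than the authors’ implementation: cases pair a sequence of affectively labelled themes with the emotion the listener reported, and retrieval picks the stored sequence whose reported effect is closest to a target emotion. The valence/arousal encoding, class names, and example data are all assumptions introduced for illustration.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Assumption: a theme's affective character is a (valence, arousal) pair
# in [-1, 1] x [-1, 1], a common representation in the EEG-emotion literature.
Affect = Tuple[float, float]

@dataclass
class Case:
    theme_sequence: List[Affect]   # affective labels of the themes played, in order
    reported_emotion: Affect       # emotion the listener reported afterwards

def distance(a: Affect, b: Affect) -> float:
    """Euclidean distance in the valence/arousal plane."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

class ThemeSequenceCBR:
    """Toy case-based reasoner: retrieve the stored theme sequence whose
    reported emotional effect is closest to the target emotion."""

    def __init__(self) -> None:
        self.case_base: List[Case] = []

    def retain(self, case: Case) -> None:
        """Store a new (sequence, reported emotion) pair learned from the user."""
        self.case_base.append(case)

    def retrieve(self, target: Affect) -> Case:
        """Return the case whose outcome best matches the target emotion."""
        return min(self.case_base, key=lambda c: distance(c.reported_emotion, target))

if __name__ == "__main__":
    cbr = ThemeSequenceCBR()
    # Hypothetical training data: each case pairs a sequence of theme
    # affects with the emotion the listener said it communicated.
    cbr.retain(Case([(0.8, 0.6), (0.7, 0.4)], reported_emotion=(0.9, 0.5)))       # happy
    cbr.retain(Case([(-0.6, -0.3), (-0.4, -0.5)], reported_emotion=(-0.7, -0.4)))  # sad
    cbr.retain(Case([(-0.5, 0.7), (0.2, 0.9)], reported_emotion=(-0.3, 0.8)))      # tense

    best = cbr.retrieve(target=(1.0, 0.4))  # aim for high valence, moderate arousal
    print("Reuse theme sequence:", best.theme_sequence)
```

In a full CBR cycle the retrieved sequence would also be adapted (e.g. by swapping in themes closer to the target affect) and the listener’s response retained as a new case; the sketch shows only the retrieve/retain steps.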





Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kirke, A., Miranda, E.R., Nasuto, S.J. (2013). Learning to Make Feelings: Expressive Performance as a Part of a Machine Learning Tool for Sound-Based Emotion Control. In: Aramaki, M., Barthet, M., Kronland-Martinet, R., Ystad, S. (eds) From Sounds to Music and Emotions. CMMR 2012. Lecture Notes in Computer Science, vol 7900. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41248-6_29


  • DOI: https://doi.org/10.1007/978-3-642-41248-6_29

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-41247-9

  • Online ISBN: 978-3-642-41248-6

