
Automated Music Composition Using Heart Rate Emotion Data

  • Chih-Fang Huang
  • Yajun Cai
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 81)

Abstract

This paper proposes an innovative way to compose music automatically from the input of a heartbeat sensor, generating music that matches the corresponding emotional states. The typical two-dimensional emotion plane with arousal and valence (A-V) states is adopted in our system to determine the features of the generated music. Algorithmic composition techniques, including Markov chains, are used together with an emotion-to-music-feature mapping method to compose music that corresponds to the desired emotion. The results show good success across a variety of generated music, including sad, happy, joyful, and angry pieces, and the heartbeat values show good consistency with the corresponding emotion states.
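The abstract describes a pipeline that maps heartbeat readings onto the arousal-valence plane and then drives a Markov-chain composer through an emotion-to-music-feature mapping. The sketch below illustrates that idea in outline only; the heart-rate thresholds, the source of the valence value, the feature mapping, and the transition probabilities are all hypothetical placeholders and are not taken from the paper.

```python
import random

def heart_rate_to_arousal(bpm, low=60.0, high=140.0):
    """Map heart rate (BPM) to arousal in [-1, 1]; the linear scale is an assumption."""
    a = 2.0 * (bpm - low) / (high - low) - 1.0
    return max(-1.0, min(1.0, a))

def emotion_to_music_features(arousal, valence):
    """Map an (arousal, valence) point to coarse music features (placeholder values)."""
    return {
        "tempo_bpm": int(70 + 30 * (arousal + 1)),     # higher arousal -> faster tempo
        "mode": "major" if valence >= 0 else "minor",  # positive valence -> major mode
        "octave": 5 if arousal >= 0 else 4,            # higher arousal -> higher register
    }

def generate_melody(features, length=16, seed=None):
    """Generate MIDI pitches with a toy first-order Markov chain over scale degrees."""
    rng = random.Random(seed)
    major = [0, 2, 4, 5, 7, 9, 11]
    minor = [0, 2, 3, 5, 7, 8, 10]
    scale = major if features["mode"] == "major" else minor

    def transition_weights(i):
        # Favor stepwise motion, allow repeats, rarely leap (made-up probabilities).
        return [0.35 if abs(i - j) == 1 else (0.20 if i == j else 0.05) for j in range(7)]

    degree, melody = 0, []
    for _ in range(length):
        degree = rng.choices(range(7), weights=transition_weights(degree))[0]
        melody.append(12 * features["octave"] + scale[degree])
    return melody

if __name__ == "__main__":
    bpm = 95.0        # example heartbeat-sensor reading
    valence = 0.4     # valence source is not specified here; treated as a given input
    features = emotion_to_music_features(heart_rate_to_arousal(bpm), valence)
    print(features)
    print(generate_melody(features, seed=42))
```

Running the script with the example reading of 95 BPM prints the chosen feature set and a 16-note melody as MIDI pitch numbers.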

Keywords

2D emotion plane · Arousal · Valence · Algorithmic composition · Emotion-music feature mapping

Notes

Acknowledgement

The authors gratefully acknowledge the support of the Ministry of Science and Technology of Taiwan under projects MOST 105-2410-H-424-008 and MOST 105-2218-E-007-031.


Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. Department of Information Communications, Kainan University, Taoyuan, Taiwan
  2. Master Program of Sound and Music Innovated Technologies, National Chiao Tung University, Hsinchu, Taiwan, ROC
