
Visualization of Musical Emotions by Colors of Images

  • Dao Nam Anh
Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 899)

Abstract

Visualizing musical sound for content expression is an efficient way to present music in all its various facets. This article explores the significance of musical emotion in anticipating image emotion features. It introduces a novel representation of music data that shows how emotion features can add value to a set of existing sound aspects. The musical emotions are represented by a filter, built with support of the Gaussian distribution, which serves as a color balance filter diversified in terms of musical features. With this filter, a color adjustment model working in the RGB color system modifies color channels to produce a color transform for image regions associated with the original musical emotions. Because the transform filter is based on music emotions, the image colors change adaptively with those emotions. The visualization solution is evaluated in experiments with a music database and an image dataset. The experiments show the productive visual effect of emotion across a music database covering a wide range of instruments and styles, and should be of interest for applications that map music to visual data.
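As a rough illustration of the kind of pipeline the abstract describes, the following minimal Python sketch applies a Gaussian-weighted, emotion-driven shift to the RGB channels of an image region. It assumes NumPy, a valence-arousal emotion representation, and illustrative parameter choices (the function name emotion_color_filter, the sigma value, and the per-channel shift coefficients are assumptions for demonstration, not the chapter's calibrated model).

import numpy as np

def emotion_color_filter(image, region_mask, valence, arousal, sigma=0.35):
    """Hypothetical sketch: shift the RGB channels of a masked image region
    according to a (valence, arousal) emotion estimate, with the strength
    of the shift weighted by a Gaussian centered on the neutral emotion.

    image       : H x W x 3 float array with values in [0, 1]
    region_mask : H x W boolean array selecting the target region
    valence, arousal : scalars in [-1, 1] (assumed emotion coordinates)
    sigma       : width of the Gaussian weighting (assumed parameter)
    """
    # Gaussian density of the emotion point around the neutral origin;
    # stronger emotions lie farther from the origin, so 1 - density is
    # used as the strength of the color adjustment.
    dist2 = valence**2 + arousal**2
    strength = 1.0 - np.exp(-dist2 / (2.0 * sigma**2))

    # Illustrative channel mapping (an assumption): positive valence warms
    # the region (more red, less blue), higher arousal brightens green.
    shift = np.array([0.2 * valence, 0.1 * arousal, -0.1 * valence]) * strength

    out = image.copy()
    out[region_mask] = np.clip(out[region_mask] + shift, 0.0, 1.0)
    return out

For example, emotion_color_filter(img, mask, 0.7, 0.4) would warm and slightly brighten the masked region for a passage estimated as happy and energetic, while a near-neutral emotion would leave the region almost unchanged.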


Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2021

Authors and Affiliations

  1. Faculty of Information Technology, Electric Power University, Hanoi, Vietnam
