Abstract
Advances in multimedia and information systems have shifted the focus from general content repositories towards personalized systems. Much effort has gone into modeling and integrating users' affective states with the aim of improving the overall user experience and functionality of such systems. In this chapter, we present a multi-modal dataset of users' emotional and visual (color) responses to music, together with personal and demographic profiles, which may serve as a knowledge base for such improvements. Results show that emotional mediation of users' perceptive states can significantly improve the user experience through context-dependent personalization in multimedia and information systems.
Copyright information
© 2016 Springer International Publishing AG
Cite this chapter
Pesek, M., Strle, G., Guna, J., Stojmenova, E., Pogačnik, M., Marolt, M. (2016). Towards a Personalised and Context-Dependent User Experience in Multimedia and Information Systems. In: Lugmayr, A., Stojmenova, E., Stanoevska, K., Wellington, R. (eds) Information Systems and Management in Media and Entertainment Industries. International Series on Computer Entertainment and Media Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-49407-4_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-49405-0
Online ISBN: 978-3-319-49407-4