Towards a Personalised and Context-Dependent User Experience in Multimedia and Information Systems

  • Chapter
  • In: Information Systems and Management in Media and Entertainment Industries

Abstract

Advances in multimedia and information systems have shifted the focus from general content repositories towards personalized systems. Much effort has been put into the modeling and integration of affective states, with the aim of improving the overall user experience and functionality of such systems. In this chapter, we present a multi-modal dataset of users' emotional and visual (color) responses to music, together with accompanying personal and demographic profiles, which may serve as a knowledge base for such improvements. Results show that emotional mediation of users' perceptive states can significantly improve the user experience through context-dependent personalization in multimedia and information systems.
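
To make the structure of such a multi-modal dataset concrete, the sketch below shows one possible way to represent its records in Python. This is a minimal illustration only: the type names (UserProfile, MusicResponse), field names (valence, arousal, color_rgb), and value scales are hypothetical assumptions for this sketch, not the schema of the published dataset.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class UserProfile:
        """Personal and demographic profile of one participant (hypothetical fields)."""
        user_id: str
        age: Optional[int] = None
        gender: Optional[str] = None
        music_education_years: int = 0

    @dataclass
    class MusicResponse:
        """One user's emotional and color response to a music excerpt."""
        user_id: str
        excerpt_id: str
        valence: float                    # perceived pleasantness, e.g. scaled to [-1, 1]
        arousal: float                    # perceived intensity, e.g. scaled to [-1, 1]
        color_rgb: Tuple[int, int, int]   # color the user associated with the excerpt

    def user_responses(responses: List[MusicResponse], user_id: str) -> List[MusicResponse]:
        """Collect all responses of a single user, e.g. as input to a
        context-dependent personalization model."""
        return [r for r in responses if r.user_id == user_id]

Keeping emotional responses, color responses, and demographic profiles as separate but joinable records, as above, is one natural way to support the per-user, context-dependent analyses the chapter describes.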


Notes

  1. http://www.music-ir.org/mirex/wiki/2014:GC14UX.


Author information

Correspondence to Matevž Pesek.

Copyright information

© 2016 Springer International Publishing AG

About this chapter

Cite this chapter

Pesek, M., Strle, G., Guna, J., Stojmenova, E., Pogačnik, M., Marolt, M. (2016). Towards a Personalised and Context-Dependent User Experience in Multimedia and Information Systems. In: Lugmayr, A., Stojmenova, E., Stanoevska, K., Wellington, R. (eds) Information Systems and Management in Media and Entertainment Industries. International Series on Computer Entertainment and Media Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-49407-4_8

  • DOI: https://doi.org/10.1007/978-3-319-49407-4_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-49405-0

  • Online ISBN: 978-3-319-49407-4

  • eBook Packages: Computer Science; Computer Science (R0)
