Auditory-Induced Emotion: A Neglected Channel for Communication in Human-Computer Interaction

  • Ana Tajadura-Jiménez
  • Daniel Västfjäll
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4868)

Abstract

Interpreting and responding to the affective states of a user is crucial for future intelligent systems. Until recently, however, the role of sound in evoking affective responses was frequently ignored. This article provides a brief overview of research on affective reactions to everyday, ecological sounds. This research shows that the subjective interpretation and meaning listeners attribute to a sound, its spatial dimension, and its interactions with other sensory modalities are as important as its physical properties in evoking an affective response. Situation appraisal and individual differences are also discussed as factors influencing emotional reactions to auditory stimuli. A study with heartbeat sounds exemplifies some of the introduced ideas and research methodologies, and demonstrates the potential of sound for inducing emotional states.
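The heartbeat study mentioned in the abstract follows a self-report paradigm common in this literature: participants listen to a sound and rate the affect it evokes, typically on valence (unpleasant to pleasant) and arousal (calm to excited) scales. The sketch below is a minimal, hypothetical Python illustration of that paradigm, not the authors' actual stimuli or procedure; the tempo values, the 55 Hz "thump" synthesis, and the 1-9 rating scale are assumptions made for the example.

```python
# Hypothetical sketch (not the authors' actual procedure or stimuli):
# synthesize a heartbeat-like sound at a given tempo and record
# self-reported valence/arousal ratings for it.
import wave

import numpy as np

SAMPLE_RATE = 44100  # samples per second


def heartbeat_stimulus(bpm: float, duration_s: float = 10.0) -> np.ndarray:
    """Return a mono "lub-dub" pulse train at the given tempo (beats per minute)."""
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    signal = np.zeros_like(t)
    onset = 0.0
    while onset < duration_s:
        # "lub" followed 180 ms later by a softer "dub", both low-frequency thumps
        for delay, gain in ((0.0, 1.0), (0.18, 0.6)):
            start = onset + delay
            env = np.exp(-40.0 * np.maximum(t - start, 0.0)) * (t >= start)
            signal += gain * env * np.sin(2.0 * np.pi * 55.0 * (t - start))
        onset += 60.0 / bpm
    return signal / np.max(np.abs(signal))  # normalize to [-1, 1]


def write_wav(path: str, signal: np.ndarray) -> None:
    """Write a mono float signal to a 16-bit PCM WAV file."""
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)  # 16-bit samples
        f.setframerate(SAMPLE_RATE)
        f.writeframes((signal * 32767).astype(np.int16).tobytes())


def collect_rating(prompt: str) -> int:
    """Ask for a 1-9 self-report rating (cf. valence/arousal scales used in affect research)."""
    while True:
        answer = input(f"{prompt} (1-9): ")
        if answer.isdigit() and 1 <= int(answer) <= 9:
            return int(answer)
        print("Please enter an integer between 1 and 9.")


if __name__ == "__main__":
    for bpm in (60, 90, 120):  # hypothetical slow / medium / fast tempi
        write_wav(f"heartbeat_{bpm}bpm.wav", heartbeat_stimulus(bpm))
        print(f"Play heartbeat_{bpm}bpm.wav to the participant, then record ratings.")
        valence = collect_rating("Valence (unpleasant to pleasant)")
        arousal = collect_rating("Arousal (calm to excited)")
        print(f"{bpm} bpm -> valence={valence}, arousal={arousal}")
```

In an actual experiment the ratings would be complemented by physiological measures and the stimuli would be carefully validated; this sketch only illustrates the stimulus-presentation and self-report structure of such a study.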

Keywords

Auditory-induced emotion, sound quality, self-representation sounds, embodiment, emotional intelligence

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Ana Tajadura-Jiménez (1)
  • Daniel Västfjäll (1, 2)
  1. Division of Applied Acoustics, Chalmers University of Technology, Göteborg, Sweden
  2. Department of Psychology, Göteborg University, Göteborg, Sweden
