
The HUMAINE Database: Addressing the Collection and Annotation of Naturalistic and Induced Emotional Data

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4738)

Abstract

The HUMAINE project is concerned with developing interfaces that register and respond to emotion, particularly pervasive emotion (the forms of feeling, expression, and action that colour most of human life). The HUMAINE Database provides naturalistic clips that record that kind of material in multiple modalities, together with labelling techniques suited to describing it.




Editor information

Editors: Ana C. R. Paiva, Rui Prada, Rosalind W. Picard


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Douglas-Cowie, E. et al. (2007). The HUMAINE Database: Addressing the Collection and Annotation of Naturalistic and Induced Emotional Data. In: Paiva, A.C.R., Prada, R., Picard, R.W. (eds) Affective Computing and Intelligent Interaction. ACII 2007. Lecture Notes in Computer Science, vol 4738. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74889-2_43


  • DOI: https://doi.org/10.1007/978-3-540-74889-2_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74888-5

  • Online ISBN: 978-3-540-74889-2

