Abstract
Cognitive goals – i.e., the intention to utter a sentence and to produce co-speech facial and hand-arm gestures – as well as the sensorimotor realization of the intended speech and co-speech actions are modulated by the emotional state of the speaker. This review illustrates how emotional states modulate cognitive goals and sensorimotor speech, co-speech facial, and co-speech hand-arm actions; how interlocutors perceive and recognize emotional states in face-to-face communication; and which brain regions underlie the production and perception of emotion in this context.
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Kröger, B.J. (2014). Modulation of Cognitive Goals and Sensorimotor Actions in Face-to-Face Communication by Emotional States: The Action-Based Approach. In: Bassis, S., Esposito, A., Morabito, F. (eds) Recent Advances of Neural Network Models and Applications. Smart Innovation, Systems and Technologies, vol 26. Springer, Cham. https://doi.org/10.1007/978-3-319-04129-2_38
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-04128-5
Online ISBN: 978-3-319-04129-2
eBook Packages: Engineering (R0)