
Beyond words: evidence for automatic language–gesture integration of symbolic gestures but not dynamic landscapes


Abstract

Understanding actions based on either language or action observation is presumed to involve the motor system, reflecting the engagement of an embodied conceptual network. We examined how linguistic and gestural information were integrated in a series of cross-domain priming studies. We varied the task demands across three experiments in which symbolic gestures served as primes for verbal targets. Primes were clips of symbolic gestures taken from a rich set of emblems. Participants responded by making a lexical decision to the target (Experiment 1), naming the target (Experiment 2), or performing a semantic relatedness judgment (Experiment 3). The magnitude of semantic priming was larger in the relatedness judgment and lexical decision tasks compared to the naming task. Priming was also observed in a control task in which the primes were pictures of landscapes with conceptually related verbal targets. However, for these stimuli, the amount of priming was similar across the three tasks. We propose that action observation triggers an automatic, pre-lexical spread of activation, consistent with the idea that language–gesture integration occurs in an obligatory and automatic fashion.


Figs. 1–4 (images not included in this preview)

References

  1. Andrews, S., & Heathcote, A. (2001). Distinguishing common and task-specific processes in word identification: A matter of some moment? Journal of Experimental Psychology: Learning, Memory, and Cognition, 27(2), 514–544.

  2. Bates, E., & Dick, F. (2002). Language, gesture, and the developing brain. Developmental Psychobiology, 40(3), 293–310.

  3. Bernardis, P., & Caramelli, N. (2009). Meaning in words, gestures, and mental images. In Proceedings of the 31st annual conference of the Cognitive Science Society (pp. 1693–1697).

  4. Bernardis, P., & Gentilucci, M. (2006). Speech and gesture share the same communication system. Neuropsychologia, 44(2), 178–190.

  5. Bernardis, P., Salillas, E., & Caramelli, N. (2008). Behavioural and neurophysiological evidence of semantic interaction between iconic gestures and words. Cognitive Neuropsychology, 25(7–8), 1114–1128.

  6. Blair, I. V., Urland, G. R., & Ma, J. E. (2002). Using Internet search engines to estimate word frequency. Behavior Research Methods, Instruments, & Computers, 34(2), 286–290.

  7. Brookes, H. (2005). What gestures do: Some communicative functions of quotable gestures in conversations among Black urban South Africans. Journal of Pragmatics, 37(12), 2044–2085.

  8. Collins, A. M., & Loftus, E. F. (1975). A spreading-activation theory of semantic processing. Psychological Review, 82(6), 407–428.

  9. Corballis, M. C. (1998). Cerebral asymmetry: Motoring on. Trends in Cognitive Sciences, 2(4), 152–158.

  10. Duscherer, K., & Holender, D. (2005). The role of decision biases in semantic priming effects. Swiss Journal of Psychology, 64(4), 249–258.

  11. Frick-Horbury, D., & Guttentag, R. E. (1998). The effects of restricting hand gesture production on lexical retrieval and free recall. The American Journal of Psychology, 111(1), 43–62.

  12. Gallese, V., Fadiga, L., Fogassi, L., & Rizzolatti, G. (1996). Action recognition in the premotor cortex. Brain, 119(2), 593–609.

  13. Gentilucci, M., Bernardis, P., Crisi, G., & Volta, R. D. (2006). Repetitive transcranial magnetic stimulation of Broca’s area affects verbal responses to gesture observation. Journal of Cognitive Neuroscience, 18(7), 1059–1074.

  14. Goodwyn, S. W., Acredolo, L. P., & Brown, C. A. (2000). Impact of symbolic gesturing on early language development. Journal of Nonverbal Behavior, 24(2), 81–103.

  15. Gunter, T. C., & Bach, P. (2004). Communicating hands: ERPs elicited by meaningful symbolic hand postures. Neuroscience Letters, 372, 52–56.

  16. Hauk, O., Johnsrude, I., & Pulvermüller, F. (2004). Somatotopic representation of action words in human motor and premotor cortex. Neuron, 41(2), 301–307.

  17. Holle, H., & Gunter, T. C. (2007). The role of iconic gestures in speech disambiguation: ERP evidence. Journal of Cognitive Neuroscience, 19(7), 1175–1192.

  18. Kang, M. S., Blake, R., & Woodman, G. F. (2011). Semantic analysis does not occur in the absence of awareness induced by interocular suppression. Journal of Neuroscience, 31(38), 13535–13545.

  19. Kelly, S. D., Creigh, P., & Bartolotti, J. (2010a). Integrating speech and iconic gestures in a Stroop-like task: Evidence for automatic processing. Journal of Cognitive Neuroscience, 22(4), 683–694.

  20. Kelly, S. D., Özyürek, A., & Maris, E. (2010b). Two sides of the same coin: Speech and gesture mutually interact to enhance comprehension. Psychological Science, 21(2), 260–267.

  21. Kendon, A. (1994). Do gestures communicate? A review. Research on Language and Social Interaction, 27(3), 175–200.

  22. Kendon, A. (2004). Gesture: Visible action as utterance. Cambridge: Cambridge University Press.

  23. Kita, S. (2000). How representational gestures help speaking. In D. McNeill (Ed.), Language and gesture (pp. 162–185). Cambridge: Cambridge University Press.

  24. Kita, S., & Özyürek, A. (2003). What does cross-linguistic variation in semantic coordination of speech and gesture reveal? Evidence for an interface representation of spatial thinking and speaking. Journal of Memory and Language, 48(1), 16–32.

  25. Krauss, R. M., Chen, Y., & Gottesman, R. F. (2000). Lexical gestures and lexical access: A process model. In D. McNeill (Ed.), Language and gesture (pp. 261–283). New York: Cambridge University Press.

  26. Krauss, R., & Hadar, U. (1999). The role of speech-related arm/hand gesture in word retrieval. In L. S. Messing & R. Campbell (Eds.), Gesture, speech, and sign (pp. 93–116). Oxford: Oxford University Press.

  27. Lakoff, G. (1987). Women, fire, and dangerous things. Chicago: University of Chicago Press.

  28. Lakoff, G., & Johnson, M. (1999). Philosophy in the flesh: The embodied mind and its challenge to Western thought. New York: Basic Books.

  29. Lovseth, K., & Atchley, R. A. (2010). Examining lateralized semantic access using pictures. Brain and Cognition, 72(2), 202–209.

  30. McNamara, T. P. (2005). Semantic priming: Perspectives from memory and word recognition. New York: Psychology Press.

  31. McNeill, D. (1992). Hand and mind. What the hands reveal about thought. Chicago: University of Chicago Press.

  32. Morrel-Samuels, P., & Krauss, R. M. (1992). Word familiarity predicts temporal asynchrony of hand gestures and speech. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18(3), 615–622.

  33. Mukamel, R., Ekstrom, A. D., Kaplan, J., Iacoboni, M., & Fried, I. (2010). Single-neuron responses in humans during execution and observation of actions. Current Biology, 20, 750–756.

  34. Neely, J. H. (1991). Semantic priming effects in visual word recognition: A selective review of current findings and theories. In D. Besner & G. W. Humphreys (Eds.), Basic processes in reading: Visual word recognition (pp. 264–336). Hillsdale, NJ: Erlbaum.

  35. Neely, J. H., & Keefe, D. E. (1989). Semantic context effects on visual word processing: A hybrid prospective/retrospective processing theory. In G. H. Bower (Ed.), The psychology of learning and motivation: Advances in research and theory (Vol. 24, pp. 207–248). New York: Academic Press.

  36. Obermeier, C., Holle, H., & Gunter, T. C. (2011). What iconic gesture fragments reveal about gesture-speech integration: when synchrony is lost, memory can help. Journal of Cognitive Neuroscience, 23(7), 1648–1663.

  37. Oldfield, R. C. (1971). The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia, 9(1), 97–113.

  38. Oliveri, M., Finocchiaro, C., Shapiro, K., Gangitano, M., Caramazza, A., & Pascual-Leone, A. (2004). All talk and no action: a transcranial magnetic stimulation study of motor cortex activation during action word production. Journal of Cognitive Neuroscience, 16(3), 374–381.

  39. Perea, M., & Rosa, E. (2002). The effects of associative and semantic priming in the lexical decision task. Psychological Research, 66(3), 180–194.

  40. Pulvermüller, F., Shtyrov, Y., & Ilmoniemi, R. (2005). Brain signatures of meaning access in action word recognition. Journal of Cognitive Neuroscience, 17(6), 884–892.

  41. Rauscher, F. H., Krauss, R. M., & Chen, Y. (1996). Gesture, speech, and lexical access: The role of lexical movements in speech production. Psychological Science, 7(4), 226–231.

  42. Rimé, B., & Schiaratura, L. (1991). Gesture and speech. In R. Feldman & B. Rimé (Eds.), Fundamentals of nonverbal behavior (pp. 239–281). Cambridge: Cambridge University Press.

  43. Rizzolatti, G., & Arbib, M. A. (1998). Language within our grasp. Trends in Neurosciences, 21(5), 188–194.

  44. Rizzolatti, G., Fogassi, L., & Gallese, V. (2001). Neurophysiological mechanisms underlying the understanding and imitation of action. Nature Reviews Neuroscience, 2(9), 661–670.

  45. Schegloff, E. A. (1984). On some gestures’ relation to talk. In J. M. Atkinson & J. Heritage (Eds.), Structures of social action: Studies in conversation analysis (pp. 266–296). Cambridge: Cambridge University Press.

  46. Tettamanti, M., Buccino, G., Saccuman, M. C., Gallese, V., Danna, M., Scifo, P., et al. (2005). Listening to action-related sentences activates fronto-parietal motor circuits. Journal of Cognitive Neuroscience, 17(2), 273–281.

  47. Wu, Y. C., & Coulson, S. (2007). Iconic gestures prime related concepts: An ERP study. Psychonomic Bulletin & Review, 14(1), 57–63.

  48. Wu, Y. J., & Thierry, G. (2010). Chinese-English bilinguals reading English hear Chinese. Journal of Neuroscience, 30, 7646–7651.

  49. Xu, J., Gannon, P. J., Emmorey, K., Smith, J. F., & Braun, A. R. (2009). Symbolic gestures and spoken language are processed by a common neural system. Proceedings of the National Academy of Sciences, 106(49), 20664–20669.

  50. Yap, D. F., So, W. C., Melvin Yap, J. M., Tan, Y. Q., & Teoh, R. L. S. (2011). Iconic gestures prime words. Cognitive Science, 35(1), 171–183.

  51. Zwaan, R. A. (2004). The immersed experiencer: Towards an embodied theory of language comprehension. In B. Ross (Ed.), The psychology of learning and motivation (Vol. 44, pp. 35–62). San Diego: Academic Press.


Acknowledgments

This study was supported by the BSF Grant 2007184 awarded to R. Ivry and M. Lavidor.

Author information

Correspondence to Michal Lavidor.

Appendices

Appendix 1

See Table 3.

Table 3 Examples of prime–target pairs. The original stimuli were presented in Hebrew; English translations are provided for clarification

Appendix 2

Stimuli characteristics

See Tables 4, 5, 6, 7 and 8.

Table 4 Prime agreement (conventionality or meaningless agreement score), semantic agreement, lexical agreement, target length and frequency, according to the experimental condition, averaged across the different experimental lists
Table 5 Distribution of gesture targets across linguistic categories in each word condition
Table 6 Categorization of gesture targets as concrete or abstract
Table 7 Prime agreement (conventionality or meaningless agreement score), target length and frequency, according to the experimental condition, averaged across the different experimental lists
Table 8 Distribution of targets across linguistic categories in each word condition


About this article

Cite this article

Vainiger, D., Labruna, L., Ivry, R.B. et al. Beyond words: evidence for automatic language–gesture integration of symbolic gestures but not dynamic landscapes. Psychological Research 78, 55–69 (2014). https://doi.org/10.1007/s00426-012-0475-3


Keywords

  • Lexical Decision
  • Semantic Relatedness
  • Negative Priming
  • Lexical Decision Task
  • Naming Task