
Brain Electric Microstate and Perception of Simultaneously Audiovisual Presentation

  • Wichian Sittiprapaporn
  • Jun Soo Kwon
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5768)

Abstract

Associations between pictures and sounds form the basis of reading, and learning the correspondences between them is a crucial step in reading acquisition. This study was designed to investigate whether task-related processing of auditory and visual features is independent, or whether processing in one modality influences processing in the other. The present study employed simultaneous audio-visual stimuli in an oddball paradigm to re-examine the effects of attention on auditory, visual and audio-visual perception in the non-musician brain. Electroencephalographic (EEG) activity was recorded from 28 normal participants; none had more than three years of formal musical training, and none had any musical training within the past five years. Chinese and Korean subjects were presented with tones (auditory: A), pictures (visual: V), and simultaneous tones and pictures (audio-visual: AV). The neural basis of this interaction was investigated by subtracting the event-related potentials (ERPs) to the A and V stimuli alone from the ERP to the combined AV stimuli (i.e. interaction = AV - (A + V)). The Korean group showed a larger mean interaction amplitude and a longer-lasting interaction than the Chinese group, indicating that experience influences the early, automatic cortical processing of linguistically relevant suprasegmental pitch contours. These results suggest that efficient processing of associations between pictures and sounds relies on neural mechanisms similar to those that naturally evolved for integrating audiovisual perception.
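As a minimal sketch of the interaction measure described above (not the authors' analysis code; the function name, array shapes, and use of NumPy are assumptions), the following Python snippet computes the additive-model interaction term AV - (A + V) from condition-averaged ERPs:

    # Sketch only: compute the audiovisual interaction AV - (A + V) from
    # trial-averaged ERPs. Each ERP is assumed to be a NumPy array of shape
    # (n_channels, n_timepoints) holding average voltages per condition.
    import numpy as np

    def av_interaction(erp_av: np.ndarray, erp_a: np.ndarray, erp_v: np.ndarray) -> np.ndarray:
        """Return the interaction waveform AV - (A + V) for every channel and sample."""
        return erp_av - (erp_a + erp_v)

    # Illustrative synthetic data: 32 channels, 600 samples.
    rng = np.random.default_rng(0)
    erp_a = rng.normal(0.0, 1e-6, (32, 600))                      # auditory-alone ERP
    erp_v = rng.normal(0.0, 1e-6, (32, 600))                      # visual-alone ERP
    erp_av = erp_a + erp_v + rng.normal(0.0, 1e-6, (32, 600))     # bimodal (AV) ERP

    interaction = av_interaction(erp_av, erp_a, erp_v)
    print(interaction.shape)  # (32, 600); deviations from zero index super-/sub-additive effects

In this framework, a purely additive bimodal response yields an interaction near zero, so group differences in the amplitude or duration of the residual waveform are taken as evidence of genuine audiovisual integration.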

Keywords

Brain · Audiovisual perception · Cognition · Event-related potential (ERP) · Microstate



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Wichian Sittiprapaporn (1)
  • Jun Soo Kwon (2, 3, 4)
  1. College of Music, Mahidol University, Salaya, Nakhonpathom, Thailand
  2. Department of Psychiatry, Seoul National University College of Medicine, Seoul, Korea
  3. Clinical Research Center, Seoul National University Hospital, Seoul, Korea
  4. Brain Korea 21 Human Life Science, Seoul National University, Seoul, Korea
