Cross-Modal Learning in the Auditory System
Unisensory auditory representations are strongly shaped by multisensory experience, and, likewise, audition contributes to cross-modal learning in other sensory systems. This applies to lower-level sensory features like spatial and temporal processing as well as to higher-level features like speech identification. Cross-modal learning has particularly profound influences during development, but its effects on unisensory processing are ubiquitous throughout life. Moreover, influences of cross-modal learning on unisensory processing have been observed at various timescales, ranging from long-term structural changes over months to short-term plasticity of auditory representations after minutes or only seconds of cross-modal exposure. This chapter focuses particularly on cross-modal learning and its underlying neural mechanisms in the healthy adult auditory system. Recent findings suggest that cross-modal learning operates in parallel on different neural representations and at different timescales. With an increasing amount of exposure to new cross-modal associations, cross-modal learning seems to progress from higher level multisensory representations to lower level modality-specific representations, possibly even in primary auditory cortex. In addition to cortically mediated learning mechanisms, auditory representations are shaped via subcortical multisensory pathways including the superior colliculi in the midbrain. The emerging view from these findings is that auditory-guided behavior is jointly shaped by cross-modal learning in distinct neural systems. To fully understand the dynamic nature of the auditory system, it will be important to identify how short-term and long-term learning processes interact in the mature brain.
Keywords: Attention · Audiovisual · Multisensory · Plasticity · Recalibration · Sensory representations · Space · Time · Ventriloquism aftereffect · Visual system
The authors’ work was supported by German Research Foundation (DFG) Grants BR 4913/2-1 and TRR 169 (Subproject A1), and by the City of Hamburg Grant “Crossmodal Learning.”
Compliance with Ethics Requirements
Patrick Bruns declares that he has no conflict of interest.
Brigitte Röder declares that she has no conflict of interest.