Lightness/pitch and elevation/pitch crossmodal correspondences are low-level sensory effects

  • Mick Zeljko
  • Ada Kritikos
  • Philip M Grove
Abstract

We tested the sensory versus decisional origins of two established audiovisual crossmodal correspondences (CMCs; lightness/pitch and elevation/pitch), applying a signal discrimination paradigm to low-level stimulus features and controlling for attentional cueing. An audiovisual stimulus randomly varied along two visual dimensions (lightness: black/white; elevation: high/low) and one auditory dimension (pitch: high/low), and participants discriminated either only lightness, only elevation, or both lightness and elevation. The discrimination task and the stimulus duration varied between subjects. To investigate the influence of crossmodal congruency, we considered the effect of each CMC (lightness/pitch and elevation/pitch) on the sensitivity and criterion of each discrimination as a function of stimulus duration. There were three main findings. First, discrimination sensitivity was significantly higher for visual targets paired congruently (compared with incongruently) with tones while criterion was unaffected. Second, the sensitivity increase occurred for all stimulus durations, ruling out attentional cueing effects. Third, the sensitivity increase was feature specific such that only the CMC that related to the feature being discriminated influenced sensitivity (i.e. lightness congruency only influenced lightness discrimination and elevation congruency only influenced elevation discrimination in the single and dual task conditions). We suggest that these congruency effects reflect low-level sensory processes.
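The sensitivity (d′) and criterion (c) measures referred to in the abstract are the standard signal detection theory statistics computed from hit and false-alarm rates. The sketch below is an illustrative implementation of those standard formulas, not the authors' analysis code; the example rates are hypothetical.

```python
from statistics import NormalDist

def sdt_measures(hit_rate: float, fa_rate: float):
    """Compute sensitivity (d') and criterion (c) from hit and
    false-alarm rates using the standard SDT formulas:
    d' = z(H) - z(F),  c = -(z(H) + z(F)) / 2."""
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)             # discrimination sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias
    return d_prime, criterion

# Hypothetical condition: 80% hits, 30% false alarms
d_prime, criterion = sdt_measures(0.80, 0.30)
```

A congruency effect on sensitivity but not criterion, as reported here, would appear as a higher d′ for congruent than incongruent pairings while c stays constant across conditions.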

Keywords

Crossmodal correspondence · Multisensory integration · Signal detection theory

Copyright information

© The Psychonomic Society, Inc. 2019

Authors and Affiliations

  1. School of Psychology, The University of Queensland, St. Lucia, Australia