Eye Position Effect on Audio-Visual Fusion Involves New Proposals for Multimodal Interface Design

  • David Hartnagel
  • Alain Bichot
  • Patrick Sandor
  • Corinne Roumes
Part of the Communications in Computer and Information Science book series (CCIS, volume 174)


Combining audio and visual information is expected to make interface design for spatial information more efficient. We therefore focus on audio-visual (AV) fusion, which refers to the perception of unity of audio and visual information despite their spatial disparity [1]. A previous experiment showed that AV fusion varies over space, mainly with horizontal eccentricity [2]. Since auditory spatial information is coded relative to head position while visual information is coded relative to eye position, the question arises whether eye position affects fusion. The current psychophysical experiment investigates the effect of a horizontal shift in eye position on the variation of AV fusion over the 2D frontal space. Results showed that eye position affects AV fusion. These data support the need to include eye position inputs when displaying redundant visual and auditory information in integrated multimodal interfaces. Results are discussed in light of the probable effect of visual display structure.
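The design implication sketched in the abstract can be illustrated in code: if fusion limits vary with the eye-centered eccentricity of the target, an interface presenting redundant visual and auditory cues should re-evaluate the tolerable AV disparity whenever gaze shifts. The Python sketch below is illustrative only; the elliptical window shape and all numeric limits are assumptions for the example, not the values measured in the experiment.

```python
def fusion_limits(ecc_az):
    """Illustrative fusion-window half-widths (degrees) that grow with
    the horizontal eccentricity of the visual target, following the
    qualitative trend reported in [2]; the numbers are placeholders,
    not measured values."""
    return 6.0 + 0.1 * abs(ecc_az), 3.0 + 0.05 * abs(ecc_az)

def within_fusion_window(vis_az, vis_el, aud_az, aud_el, gaze_az=0.0):
    """Decide whether a visual and an auditory cue (head-centered
    azimuth/elevation, in degrees) would plausibly be fused.
    Eccentricity is evaluated in eye-centered coordinates, so a
    horizontal gaze shift moves the applicable fusion window."""
    # visual eccentricity relative to the eyes, not the head
    eye_az = vis_az - gaze_az
    h_lim, v_lim = fusion_limits(eye_az)
    # head-centered AV disparity
    d_az, d_el = vis_az - aud_az, vis_el - aud_el
    # elliptical fusion window test
    return (d_az / h_lim) ** 2 + (d_el / v_lim) ** 2 <= 1.0
```

With these placeholder limits, a small disparity near fixation (e.g. 2° azimuth, 1° elevation) falls inside the window, while a 20° azimuthal disparity does not; passing a nonzero `gaze_az` changes which window applies.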


Keywords: Multimodal Interface, Bimodal Stimulus, Egocentric Reference Frame, Fusion Limit, Allocentric Reference Frame



  1. Bertelson, P., Radeau, M.: Cross-modal bias and perceptual fusion with auditory-visual spatial discordance. Perception & Psychophysics 29(6), 578–584 (1981)
  2. Godfroy, M., Roumes, C., Dauchy, P.: Spatial variations of visual-auditory fusion areas. Perception 32(10), 1233–1245 (2003)
  3. Diederich, A.: Intersensory facilitation, vol. 369. Peter Lang, Frankfurt (1992)
  4. Gibson, J.J.: The senses considered as perceptual systems. Greenwood Press, Westport (1966)
  5. Stein, B.E., Meredith, M.A.: The merging of the senses. The MIT Press, Cambridge (1993)
  6. Giard, M.-H., Peronnet, F.: Visual-auditory integration during multimodal object recognition in humans: a behavioral and electrophysiological study. Journal of Cognitive Neuroscience 11(5), 473–490 (1999)
  7. Jack, C.E., Thurlow, W.R.: Effect of degree of visual association and angle of displacement on the ventriloquism effect. Perceptual and Motor Skills 37, 967–979 (1973)
  8. Paillard, J.: Motor and representational framing of space. In: Paillard, J. (ed.) Brain and Space, pp. 163–182. Oxford University Press, Oxford (1991)
  9. Blauert, J.: Spatial Hearing: The Psychophysics of Human Sound Localization. The MIT Press, London (1983)
  10. Pedersen, J.A., Jorgensen, T.: Localization performance of real and virtual sound sources. In: New Directions for Improving Audio Effectiveness, pp. 29.1–29.30. RTO, France (2005)
  11. Perrott, D.R., Saberi, K.: Minimum audible angle thresholds for sources varying in both elevation and azimuth. Journal of the Acoustical Society of America 87, 1728–1731 (1990)
  12. Witkin, H.A., Asch, S.E.: Studies in space orientation. IV. Further experiments on perception of the upright with displaced visual fields. Journal of Experimental Psychology 38, 762–782 (1948)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • David Hartnagel¹
  • Alain Bichot¹
  • Patrick Sandor¹
  • Corinne Roumes¹

  1. Département Action & Cognition en Situation Opérationnelle, Institut de Recherche Biomédicale des Armées, Brétigny-sur-Orge, France
