A virtual reality approach identifies flexible inhibition of motion aftereffects induced by head rotation
As we move in space, our retinae receive motion signals from two causes: motion in the world and self-motion. Mounting evidence has shown that vestibular self-motion signals interact profoundly with visual motion processing. However, most contemporary methods arguably lack portability and generality and cannot provide measurements during locomotion. Here we developed a virtual reality approach, combining a three-space sensor with a head-mounted display, to quantitatively manipulate the causality between retinal motion and head rotations in the yaw plane. Using this system, we explored how self-motion affects visual motion perception, particularly the motion aftereffect (MAE). Subjects watched gratings presented on a head-mounted display. The gratings drifted at the same velocity as head rotations, with the drifting direction identical, opposite, or perpendicular to the direction of head rotation. We found that the MAE lasted a significantly shorter time when subjects’ heads rotated than when their heads were kept still. This effect was present regardless of the drifting direction of the gratings and was also observed during passive head rotations. These findings suggest that adaptation to retinal motion is suppressed by head rotations. Because the suppression was also found during passive head movements, it likely results from visual–vestibular interaction rather than from efference copy signals. Such visual–vestibular interaction is more flexible than previously thought, since the suppression was observed even when the retinal motion direction was perpendicular to head rotation. Our work suggests that a virtual reality approach can be applied to various studies of multisensory integration and interaction.
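The manipulation described above amounts to yoking the grating's drift velocity to the head's yaw angular velocity under three direction conditions. A minimal sketch of that mapping is given below; the function name, units, and condition labels are illustrative assumptions, not the authors' actual code.

```python
def grating_drift_velocity(head_yaw_velocity_dps, condition):
    """Map head yaw angular velocity (deg/s, hypothetical sign convention:
    positive = rightward rotation) to a grating drift vector (vx, vy) in deg/s.

    condition: 'identical', 'opposite', or 'perpendicular' to the head rotation,
    matching the three conditions described in the abstract.
    """
    if condition == 'identical':
        # Grating drifts in the same direction and at the same speed as the head.
        return (head_yaw_velocity_dps, 0.0)
    elif condition == 'opposite':
        # Grating drifts against the head rotation at matched speed.
        return (-head_yaw_velocity_dps, 0.0)
    elif condition == 'perpendicular':
        # Grating drifts vertically, orthogonal to the yaw-plane rotation.
        return (0.0, head_yaw_velocity_dps)
    raise ValueError(f"unknown condition: {condition!r}")
```

On each frame, the head tracker's yaw velocity would be fed to this mapping and the returned vector applied to the grating's phase, so retinal motion stays causally locked to self-motion.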
Keywords: Head movement · Adaptation · Motion aftereffect · Multisensory · Virtual reality