Depth benefits now loading: Visual working memory capacity and benefits in 3-D

  • Dawn M. Sarno
  • Joanna E. Lewis
  • Mark B. Neider


The present studies explored how performance in multidimensional displays varies as a function of visual working memory load, item distribution across depths, and individual capacity differences. In Experiment 1, the benefit of depth information (one depth vs. two depths) was examined across seven set sizes within a change-detection paradigm. Multiple depth planes engendered performance benefits with five items, but elicited performance decrements with three items. These effects were associated with working memory capacity, such that benefits were only observed when the working memory load exceeded an individual’s maximum capacity. Experiment 2 evaluated how the distribution of items in depth aids working memory performance. Equal distribution of items across depths produced higher accuracy than when the target was isolated in depth. Lastly, Experiment 3 explored how differences in working memory capacity affect an individual’s ability to use depth information to improve performance. The results indicate that both low-capacity and high-capacity individuals can benefit from depth information, but that this benefit may vary as a function of working memory load. Overall, the results indicate that multidimensional displays can improve performance under sufficient working memory load, possibly through some form of depth tag.


Keywords: Visual working memory · Visual short-term memory · 3-D · Capacity



Copyright information

© The Psychonomic Society, Inc. 2019

Authors and Affiliations

  • Dawn M. Sarno
  • Joanna E. Lewis
  • Mark B. Neider

  1. Department of Psychology, University of Central Florida, Orlando, USA
