Visual search asymmetry depends on target–distractor feature similarity: Is the asymmetry simply a result of distractor rejection speed?
Previous studies have shown that varying target and distractor familiarity in visual search produces a search asymmetry: Detecting a novel target among familiar distractors is more efficient than detecting a familiar target among novel distractors. One explanation is that novel targets have enhanced salience and are detected preattentively; conversely, familiar distractors may simply be easier to reject. The current study postulates that target–distractor feature similarity, in addition to target or distractor familiarity, is a key determinant of visual search efficiency. Two experiments reveal that visual search is more efficient when distractors are familiar, regardless of target familiarity, but only when target–distractor similarity is high. When similarity is low, the search asymmetry disappears and search becomes highly efficient, with search slopes that do not differ from zero regardless of target or distractor familiarity. However, although distractor familiarity plays an important role in inducing the search asymmetry, comparisons of search efficiency on target-present and target-absent trials reveal that the asymmetry cannot be explained solely by faster rejection of familiar distractors, as proposed by previous studies. Rather, distractor familiarity influences processes outside of stimulus selection, such as search monitoring and termination decisions. Competition between bottom-up item-salience effects and top-down shape-recognition processes is proposed to account for these findings.
Keywords: Visual search · Asymmetry · Familiarity · Similarity · Target · Distractor · Feature