Visual similarity effects in categorical search

J Vis. 2011 Jul 14;11(8):9. doi: 10.1167/11.8.9.

Abstract

We asked how visual similarity relationships affect search guidance to categorically defined targets (no visual preview). Experiment 1 used a web-based task to collect visual similarity rankings between two target categories, teddy bears and butterflies, and random-category objects. From these rankings we created search displays in Experiment 2 with either high-similarity distractors, low-similarity distractors, or "mixed" displays containing high-, medium-, and low-similarity distractors. Analysis of target-absent trials revealed faster manual responses and fewer fixated distractors on low-similarity displays than on high-similarity displays. On mixed displays, first fixations were more frequent on high-similarity distractors (bear = 49%; butterfly = 58%) than on low-similarity distractors (bear = 9%; butterfly = 12%). Experiment 3 used the same high/low/mixed conditions, but these conditions were now created using similarity estimates from a computer vision model that ranked objects in terms of their color, texture, and shape similarity to the target category. The same patterns were found, suggesting that categorical search can indeed be guided by purely visual similarity. Experiment 4 compared cases where the model and human rankings agreed with cases where they differed. Similarity effects were best predicted when the two sets of rankings agreed, suggesting that both the human visual similarity rankings and the computer vision model captured features important for guiding search to categorical targets.
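The Experiment 3 manipulation depends on a model that scores visual similarity between a target category and candidate distractors using color, texture, and shape features. Below is a minimal, illustrative sketch of that general idea in Python/NumPy, not the authors' model: it compares simple color and gradient-magnitude histograms by histogram intersection and omits shape features for brevity. The function names (visual_similarity, rank_distractors) and the equal color/texture weights are assumptions made for illustration only.

```python
import numpy as np

def color_histogram(img, bins=8):
    """Joint RGB histogram of an H x W x 3 uint8 image, normalized to sum to 1."""
    hist, _ = np.histogramdd(
        img.reshape(-1, 3), bins=(bins, bins, bins),
        range=((0, 256), (0, 256), (0, 256)))
    return hist.ravel() / hist.sum()

def texture_histogram(img, bins=8):
    """Histogram of gradient magnitudes on the grayscale image:
    a crude stand-in for a texture descriptor."""
    gray = img.mean(axis=2)
    gy, gx = np.gradient(gray)
    mag = np.hypot(gx, gy)
    hist, _ = np.histogram(mag, bins=bins, range=(0.0, mag.max() + 1e-9))
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical normalized histograms."""
    return float(np.minimum(h1, h2).sum())

def visual_similarity(target_img, distractor_img, w_color=0.5, w_texture=0.5):
    """Weighted combination of color and texture similarity (weights are assumed)."""
    s_color = histogram_intersection(color_histogram(target_img),
                                     color_histogram(distractor_img))
    s_texture = histogram_intersection(texture_histogram(target_img),
                                       texture_histogram(distractor_img))
    return w_color * s_color + w_texture * s_texture

def rank_distractors(target_img, distractor_imgs):
    """Return distractor indices ordered from most to least target-similar,
    e.g. to assign objects to high-, medium-, and low-similarity display bins."""
    scores = [visual_similarity(target_img, d) for d in distractor_imgs]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
```

Given such a ranking, the top-ranked objects would populate high-similarity displays and the bottom-ranked objects low-similarity displays, with mixed displays drawing from the full range.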

Publication types

  • Randomized Controlled Trial
  • Research Support, N.I.H., Extramural

MeSH terms

  • Artificial Intelligence*
  • Attention / physiology*
  • Eye Movements / physiology*
  • Fixation, Ocular / physiology
  • Form Perception / physiology*
  • Humans
  • Pattern Recognition, Visual / physiology*
  • Photic Stimulation / methods
  • Psychophysics