Predicting visual search performance by quantifying stimuli similarities

Tamar Avraham, Yaffa Yeshurun, Michael Lindenbaum

Research output: Contribution to journal › Article › peer-review

Abstract

The effect of distractor homogeneity and target-distractor similarity on visual search was previously explored under two models designed for computer vision. We extend these models here to account for internal noise and to evaluate their ability to predict human performance. In four experiments, observers searched for a horizontal target among distractors of different orientation (orientation search; Experiments 1 and 2) or a gray target among distractors of different color (color search; Experiments 3 and 4). Distractor homogeneity and target-distractor similarity were systematically manipulated. We then tested our models' ability to predict the search performance of human observers. Our models' predictions were closer to human performance than those of other prominent quantitative models.
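The abstract's core idea — that search gets harder as distractors become more similar to the target and more heterogeneous among themselves — can be illustrated with a toy difficulty score. This is a hypothetical sketch over one-dimensional feature values (e.g., orientation in degrees), not the paper's actual model; the function name and weighting are assumptions for illustration only.

```python
def search_difficulty(target, distractors):
    """Toy difficulty score (illustrative only, not the paper's model):
    search is predicted to be harder when distractors are close to the
    target in feature space and are heterogeneous among themselves."""
    # Target-distractor similarity: inverse of mean feature distance.
    dists = [abs(d - target) for d in distractors]
    mean_dist = sum(dists) / len(dists)
    # Distractor heterogeneity: variance of the distractor features.
    mean_d = sum(distractors) / len(distractors)
    var = sum((d - mean_d) ** 2 for d in distractors) / len(distractors)
    # Smaller target-distractor distance and larger heterogeneity -> harder.
    return (1.0 / (1.0 + mean_dist)) * (1.0 + var)
```

For example, a horizontal target (0°) among identical 45° distractors should score as easier than the same target among heterogeneous distractors near its own orientation.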

Original language: English
Article number: 9
Journal: Journal of Vision
Volume: 8
Issue number: 4
State: Published - 17 Apr 2008

Keywords

  • Heterogeneous distractors
  • Target-distractor similarity
  • Visual attention modeling
  • Visual search

ASJC Scopus subject areas

  • Ophthalmology
  • Sensory Systems
