
Explorations of a Bayesian theory of human visual search

Posted on: 2011-08-07
Degree: M.S.
Type: Thesis
University: University of Colorado at Boulder
Candidate: Venkatesh, Karthik
Full Text: PDF
GTID: 2448390002962441
Subject: Engineering
Abstract/Summary:
People perform a remarkable range of tasks that require searching the visual environment for a target item among distractors, for example, finding one's keys amidst the clutter of a desk. The Guided Search (GS) model is perhaps the best-developed psychological account of human visual search. To prioritize search, GS assigns saliency to locations in the visual field. Saliency is a linear combination of activations from retinotopic maps representing primitive visual features. Mozer and Baldwin [10] propose a principled probabilistic formulation of GS, called Experience-Guided Search (EGS), based on a generative model of the environment. Through experience, EGS infers latent environment variables that determine the gains for guiding search. Control is thus cast as probabilistic inference, not optimization. We also propose alternate inference algorithms that are more mathematically accurate than the initial model. The primary focus of our work was to test the model further by comparing its predictions against human behavioral data. We modeled three experiments available in the literature: an unpublished experiment of Charles Wright and April Main from UC Irvine (Wright and Main [21]) and studies of the priming of targets and distractors (Kristjansson et al. [7] and Kristjansson and Driver [6]). We conducted simulations of EGS with the appropriate trial sequences and found that EGS yields results consistent with the human data. What is remarkable in all of our simulations is that the model has essentially no free parameters, so its predictions are completely constrained by the sequence of trials in an experiment, making it a strongly predictive theory.
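To make the description above concrete, the following Python sketch illustrates the core idea: saliency computed as a gain-weighted linear combination of retinotopic feature maps, with gains adjusted by experience. The map sizes, the observed activations, and the conjugate-style gain update are all illustrative assumptions introduced here; they are not the inference procedure of EGS or Guided Search.

import numpy as np

# Minimal sketch (not the actual EGS implementation): saliency as a
# gain-weighted linear combination of retinotopic feature maps, with
# gains set by a toy conjugate-style update from earlier trials.

rng = np.random.default_rng(0)

# Three primitive feature maps (e.g., color, orientation, size) on a
# 10x10 retinotopic grid; random values stand in for activations.
feature_maps = rng.random((3, 10, 10))

# Feature activations observed at the target location on past trials.
# These drive the gain update; the values here are made up.
observed = np.array([[0.9, 0.2, 0.1],   # trial 1
                     [0.8, 0.3, 0.2]])  # trial 2

# Toy Bayesian update: blend a uniform prior over feature relevance
# with the summed observed target activations, then normalize. EGS
# itself infers gains as latent variables in a generative model.
prior = np.full(3, 1.0 / 3.0)
posterior_mean = (prior + observed.sum(axis=0)) / (1 + len(observed))
gains = posterior_mean / posterior_mean.sum()

# Saliency: linear combination of the feature maps, weighted by the
# inferred gains; search visits locations in decreasing saliency.
saliency_map = np.tensordot(gains, feature_maps, axes=1)
best = np.unravel_index(np.argmax(saliency_map), saliency_map.shape)
print("Most salient location:", best)

Because the gains come from the observed trial history rather than fitted parameters, a sketch like this has the same character as the full model: its predictions are fixed once the trial sequence is fixed.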
Keywords/Search Tags: Search, Visual, Human, EGS