Abstract
Our brain often needs to estimate unknown variables from imperfect information. Our knowledge about the statistical distributions of quantities in our environment (called priors) and currently available information from sensory inputs (called the likelihood) are the basis of all Bayesian models of perception and action. While we know that priors are learned, most studies of prior-likelihood integration simply assume that subjects know the likelihood. However, as the quality of sensory inputs changes over time, we also need to learn about new likelihoods. Here, we show that human subjects readily learn the distribution of visual cues (the likelihood function) in a way that can be predicted by models of statistically optimal learning. Using a likelihood that depended on color context, we found that a learned likelihood generalized to new priors. Thus, we conclude that subjects learn about the likelihood.
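The prior-likelihood integration the abstract refers to is, in the standard Gaussian setting, a precision-weighted average of the prior mean and the sensory cue. A minimal sketch of that computation (the function name and the example values are illustrative, not from the paper):

```python
def combine_gaussian(mu_prior, var_prior, mu_like, var_like):
    """Bayes-optimal fusion of a Gaussian prior and a Gaussian likelihood.

    Each source is weighted by its precision (inverse variance), so the
    more reliable source pulls the posterior estimate toward itself.
    """
    w_prior = 1.0 / var_prior  # precision of the prior
    w_like = 1.0 / var_like    # precision of the sensory likelihood
    mu_post = (w_prior * mu_prior + w_like * mu_like) / (w_prior + w_like)
    var_post = 1.0 / (w_prior + w_like)  # posterior is narrower than either source
    return mu_post, var_post

# Equally reliable sources: the posterior mean falls halfway between them,
# and the posterior variance is half that of either source.
print(combine_gaussian(0.0, 1.0, 2.0, 1.0))  # → (1.0, 0.5)
```

Learning the likelihood, as studied in the paper, amounts to estimating `mu_like` and `var_like` from experience rather than assuming they are known.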
| Original language | English |
|---|---|
| Article number | 13 |
| Journal | Journal of Vision |
| Volume | 14 |
| Issue number | 13 |
| DOIs | |
| Publication status | Published - 2014 |
| Externally published | Yes |
Keywords
- Bayesian models
- Context-dependent learning
- Likelihood learning
- Sensorimotor integration
ASJC Scopus subject areas
- Ophthalmology
- Sensory Systems