Abstract
On the basis of the work by Rose, de Vries, Barlow, and others, it is generally believed that visual detection at low background illuminances is limited by both photon and neural noise. At high background illuminances, observers are often said to depart from "ideal" performance by following Weber's law rather than the de Vries-Rose law. This departure is sometimes ascribed to photochemical adaptation. A model is presented here in which performance is assumed to be noise-limited at all background illuminances. No photochemical adaptation is assumed. According to this model, the transition from the de Vries-Rose law to Weber's law, as background illuminance is increased, arises from stimulus-scale and intensity-dependent adjustments in visual integration time. These adjustments occur naturally as a result of the modeling of subthreshold photon summation as a random walk with drift to a fixed weighted energy threshold. The model qualitatively predicts the observed dependence of the threshold-versus-intensity curves on stimulus area and duration. The model is more consistent with the known dependence of visual-system integration time on intensity and scale than is the classical de Vries-Rose assumption of a fixed, stimulus-independent visual integration time.
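The key mechanism described above — a drifting random walk absorbed at a fixed energy threshold, whose first-passage time shrinks as the drift (background illuminance) grows — can be illustrated with a minimal simulation. This is not the paper's model; it is a hedged sketch under simplified assumptions: photon absorptions are approximated as Bernoulli trials per small time step, the threshold and rates are arbitrary illustrative values, and neural noise is omitted.

```python
import random

def first_passage_time(rate, threshold, dt=0.01, max_steps=1_000_000, seed=0):
    """Time for an accumulating photon count to reach a fixed threshold.

    Photon arrivals at `rate` (events per unit time) are approximated by
    Bernoulli trials with probability rate * dt per step, so the running
    sum is a random walk with drift toward the absorbing threshold.
    Returns the first-passage time, i.e. the effective integration time.
    """
    rng = random.Random(seed)
    total = 0.0
    steps = 0
    while total < threshold and steps < max_steps:
        if rng.random() < rate * dt:  # one photon absorbed this step
            total += 1.0
        steps += 1
    return steps * dt

# Illustrative values only: higher background intensity (larger drift)
# drives the walk to threshold sooner, shortening integration time.
t_low = first_passage_time(rate=10.0, threshold=50.0)
t_high = first_passage_time(rate=100.0, threshold=50.0)
print(f"integration time at low intensity:  {t_low:.2f}")
print(f"integration time at high intensity: {t_high:.2f}")
```

The qualitative point is that integration time is not fixed: it adjusts automatically with intensity because the walk reaches the fixed threshold faster when the drift is stronger, which is the behavior the abstract invokes to explain the transition between the de Vries-Rose and Weber regimes.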
© 1990 Optical Society of America