Abstract
Human observers localize events in the world by using sensory signals from multiple modalities. We evaluated two theories of spatial localization that predict how visual and auditory information are weighted when these signals specify different locations in space. According to one theory (visual capture), the signal that is typically most reliable dominates in a winner-take-all competition, whereas the other theory (maximum-likelihood estimation) proposes that perceptual judgments are based on a weighted average of the sensory signals in proportion to each signal’s relative reliability. Our results indicate that both theories are partially correct, in that relative signal reliability significantly altered judgments of spatial location, but these judgments were also characterized by an overall bias to rely on visual over auditory information. These results have important implications for the development of cue integration and for neural plasticity in the adult brain that enables humans to optimally integrate multimodal information.
© 2003 Optical Society of America
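As a point of reference, the maximum-likelihood account sketched in the abstract is conventionally formalized as a reliability-weighted average; the notation below is the standard one from the cue-combination literature, not taken from this article:

\hat{S} = w_V \hat{S}_V + w_A \hat{S}_A, \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_A^2}, \qquad w_A = 1 - w_V,

where \hat{S}_V and \hat{S}_A are the location estimates from vision and audition and \sigma_V^2, \sigma_A^2 are their variances. Each cue is weighted by its relative reliability (inverse variance), so the less noisy signal dominates the combined estimate, which under these assumptions has variance \sigma_{VA}^2 = \sigma_V^2 \sigma_A^2 / (\sigma_V^2 + \sigma_A^2), lower than that of either cue alone. Visual capture, by contrast, corresponds to the limiting case in which w_V = 1 irrespective of the actual reliabilities.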