Abstract
Signal detection theory is an approach to predicting imaging system performance through the calculation of the performance of an ideal or Bayesian observer for some model task [1]. By calculating the ideal-observer performance over a range of system design parameters, the effect of those parameters on image quality can be quantified. Implicit in this approach to image assessment is the assumption that ideal performance is relevant to the performance of the human observer, so that a system optimized for the ideal observer will likewise be optimized for the human observer, given an appropriate display strategy. Also implicit is the assumption that the model task is representative of the tasks that are to be performed in the "real world".
© 1989 Optical Society of America
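The abstract's core idea, computing ideal-observer performance as a function of system design parameters, can be illustrated with a minimal sketch. The example below assumes the standard signal-known-exactly (SKE) detection task in white Gaussian noise, for which the ideal observer's detectability is d'² = sᵀK⁻¹s and its area under the ROC curve is AUC = Φ(d'/√2); the specific signal shape and the noise-variance sweep are illustrative choices, not taken from the paper.

```python
import math
import numpy as np

# Toy sketch (signal shape and parameter values are illustrative):
# ideal-observer detectability for a signal-known-exactly (SKE) task
# in white Gaussian noise, swept over a system design parameter.

x = np.arange(16)
X, Y = np.meshgrid(x, x)
# Known Gaussian-blob signal on a 16x16 detector, flattened to a vector
signal = np.exp(-((X - 8) ** 2 + (Y - 8) ** 2) / 8.0).ravel()

def ideal_observer_dprime(s, noise_var):
    """d'^2 = s^T K^{-1} s, with white-noise covariance K = noise_var * I."""
    return math.sqrt(float(s @ s) / noise_var)

# Sweep a "design parameter": here, the detector noise variance
noise_vars = [0.25, 0.5, 1.0, 2.0]
dprimes = [ideal_observer_dprime(signal, v) for v in noise_vars]

# Ideal-observer AUC under Gaussian statistics: AUC = Phi(d'/sqrt(2))
aucs = [0.5 * (1.0 + math.erf(d / 2.0)) for d in dprimes]
```

Plotting `aucs` against `noise_vars` gives the kind of quality-versus-parameter curve the abstract describes; whether the ranking also holds for a human observer is exactly the assumption the paper flags.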