Abstract
We are investigating a new approach to the problem of estimating an unknown signal—use of the criterion of maximum information (MI). Given a set of data describing the signal (say, its noisy detected values or its autocorrelation), that signal is sought that would have relayed a maximum of Shannon information about itself into the data. This constitutes a maximally optimistic, unexpected, or detailed estimate, as is discussed. The MI principle is shown to be applicable to optical problems when enough photons are present to characterize the optical signals as probability laws. The image-restoration, image- or signal-smoothing, and missing-phase problems are so addressed. Sufficiently high signal levels are assumed present that noise of the additive type dominates in the image data. We analytically show that (a) under certain conditions the MI estimate is asymptotic to the maximum entropy estimate, and (b) with small noise in the data, the maximized information can be estimated immediately from the data without having to solve explicitly for the unknown signal. Numerical examples are given.
© 1981 Optical Society of America
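The abstract states that, under certain conditions, the maximum-information estimate is asymptotic to the maximum entropy estimate. As a minimal illustrative sketch (not the paper's method), the code below computes a discrete maximum-entropy probability law subject to a known mean: the maximizer of -Σ p log p under the constraints Σ p = 1 and Σ p·x = mean has the exponential (Gibbs) form p_i ∝ exp(-λ x_i), with λ found here by bisection. The function name, support, and mean value are all illustrative assumptions.

```python
import numpy as np

def max_entropy_pmf(x, mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy pmf on support x with a prescribed mean.

    The entropy maximizer under sum(p) = 1 and sum(p * x) = mean
    is p_i proportional to exp(-lam * x_i); the multiplier lam is
    located by bisection on the (monotone decreasing) mean map.
    Illustrative sketch only; not the paper's MI algorithm.
    """
    def moment(lam):
        w = np.exp(-lam * (x - x.min()))   # shift exponent for stability
        p = w / w.sum()
        return (p * x).sum()

    # moment(lam) decreases as lam grows, so bisect until it hits `mean`
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if moment(mid) > mean:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = 0.5 * (lo + hi)
    w = np.exp(-lam * (x - x.min()))
    return w / w.sum()

# Example: support 0..7 with mean 2 yields a decreasing, geometric-like pmf
p = max_entropy_pmf(np.arange(8.0), 2.0)
```

With the mean below the midpoint of the support, the resulting law decays monotonically, which is the familiar shape of maximum-entropy image estimates at low flux.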