Abstract
A principle I − J = min. of minimum Fisher information I = ∫ dx [p′(x)]²/p(x) is often used to estimate an unknown probability density function p(x) in the presence of insufficient information J. The latter is conventionally a Lagrange constraint term. The resulting estimate p(x) may or may not be correct (where there is a "correct" answer), although it is smooth. But suppose the correct answer is required and the user has the resources to choose the particular information J to input. What form of J should be chosen, and what form should K ≡ I − J take, where K is the "physical" information? The choice of J (and K) follows from three reasonable axioms: (i) K is linear in I, i.e., K measures disorder; (ii) K is a universal value, irrespective of physical phenomenon, i.e., all phenomena p(x) are equivalent in their information content; (iii) there is no preferred space, direct or Fourier, for representing K. The solution is that K = 0 or, numerically, J = I. However, functionally J depends upon p(x) differently than does I; this dependence, which often arises out of Parseval's theorem, is the required input of information. We show that many of the known laws p(x) of physics may be "estimated" in this way.
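The Fisher information functional I quoted in the abstract can be checked numerically. The sketch below evaluates I = ∫ dx [p′(x)]²/p(x) on a grid for a Gaussian density, a case with the known analytic value 1/σ²; the choice of Gaussian test density, grid limits, and step size are our illustrative assumptions, not part of the paper.

```python
import numpy as np

# Numerical evaluation of the Fisher information functional
#   I = integral of [p'(x)]^2 / p(x) dx
# for a Gaussian p(x) with standard deviation sigma.
# Analytic value for this case: I = 1 / sigma^2.

sigma = 2.0
x = np.linspace(-12.0, 12.0, 20001)          # grid spans +/- 6 sigma
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

dp = np.gradient(p, x)                        # numerical derivative p'(x)
integrand = dp**2 / p                         # Fisher information density

# Trapezoidal rule, written out for portability across NumPy versions
I = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x))

print(I)  # close to 1/sigma^2 = 0.25
```

Narrower densities (smaller σ) give larger I, consistent with the abstract's reading of I as a measure of disorder: a sharply peaked p(x) carries more Fisher information.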
© 1992 Optical Society of America