Abstract
The signal-to-noise ratio (S/N) of the Brown–Twiss stellar intensity interferometer, in which the resolving time (τr) of the photodetectors is much longer than the coherence time (τc) of the incident radiation, is derived, with particular attention to incident radiation of very high effective temperature. The statistical properties of the wave amplitude of the incident radiation are assumed to be described by a Gaussian distribution function. To evaluate the mean square error in the intensity-correlation measurement, the eighth-order moment of the complex wave amplitudes is obtained by means of the moment theorem for a Gaussian random process. The mean square error of a high-intensity correlation experiment over a finite observation time (τ0) is determined chiefly by the random correlation of noise currents arising from intensity fluctuations of the incident radiation. The S/N for extremely high-intensity radiation is thus given by an expression that involves only the complex degree of coherence γ12 between the two observation points and the times τr and τ0, and is independent of the intensity of the incident radiation. For low-intensity radiation, the mean square error of an intensity-correlation measurement is determined instead by the random correlation of shot noise in the photoelectric currents; the S/N, as originally discussed by Hanbury Brown and Twiss, is then proportional to ηn, where n is the number of incident photons per second and η is the quantum efficiency of the photodetector. The S/N for the intermediate case is also discussed.
© 1966 Optical Society of America
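The Gaussian statistics assumed in the abstract can be checked numerically. For circular Gaussian light the moment theorem factors the fourth-order moment as ⟨V1\*V2\*V2V1⟩ = ⟨I1⟩⟨I2⟩(1 + |γ12|²), so the normalized intensity correlation between the two observation points should equal 1 + |γ12|². The following Monte Carlo sketch is an illustration of that factorization only (it is not from the paper; the sample size and the value of γ12 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000          # number of independent field samples (arbitrary)
gamma = 0.6          # assumed degree of coherence |γ12| (arbitrary)

# Two unit-mean-intensity circular-Gaussian amplitudes, built so that
# <V1* V2> = gamma * <|V|^2>, i.e. the complex degree of coherence is gamma.
a = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
b = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
V1 = a
V2 = gamma * a + np.sqrt(1 - gamma**2) * b

# Normalized intensity correlation <I1 I2> / (<I1><I2>)
I1, I2 = np.abs(V1)**2, np.abs(V2)**2
g2 = np.mean(I1 * I2) / (np.mean(I1) * np.mean(I2))
print(g2)  # should be close to 1 + gamma**2
```

The same Gaussian moment theorem, applied at eighth order as in the paper, factors the moment into 4! = 24 products of second-order moments, which is what determines the intensity-fluctuation noise in the high-intensity limit.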