Abstract
In this paper we present computer simulation results on the band-limited signal extrapolation problem. First, the performance of several existing algorithms is compared for the noise-free case. We then describe modifications of these algorithms for computing the extrapolation when the given signal is contaminated with noise. Computer simulation results for both the noiseless and noisy cases are included. From these results, the following preliminary conclusion can be drawn: two-step algorithms appear to give better reconstructions and require less computing time than the iterative algorithms considered in this paper.
© 1984 Optical Society of America
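The abstract does not describe the algorithms themselves, but the canonical iterative scheme for band-limited extrapolation is the Papoulis-Gerchberg iteration: alternately project the current estimate onto the set of band-limited signals (by zeroing out-of-band spectral components) and onto the data constraint (by restoring the observed samples). The sketch below is a minimal NumPy illustration of that class of iterative methods, not the specific implementations compared in the paper; all parameter names and the test signal are illustrative.

```python
import numpy as np

def papoulis_gerchberg(known, known_idx, n, band, iters=500):
    """Extrapolate a band-limited signal from samples `known` observed at
    indices `known_idx` of a length-`n` grid. `band` is the number of
    retained low-frequency FFT bins on each side of DC."""
    x = np.zeros(n)
    x[known_idx] = known
    for _ in range(iters):
        # Project onto the band-limited subspace: zero out-of-band bins.
        X = np.fft.fft(x)
        X[band + 1 : n - band] = 0.0
        x = np.fft.ifft(X).real
        # Project onto the data constraint: re-impose observed samples.
        x[known_idx] = known
    return x

# Toy noise-free example: a low-frequency cosine observed only on a
# central segment of the grid.
n = 128
t = np.arange(n)
truth = np.cos(2 * np.pi * 3 * t / n)     # band-limited test signal
known_idx = np.arange(40, 88)             # central observation window

x0 = np.zeros(n)
x0[known_idx] = truth[known_idx]          # zero-filled initial estimate
est = papoulis_gerchberg(truth[known_idx], known_idx, n, band=4)

err_init = np.linalg.norm(x0 - truth)
err_final = np.linalg.norm(est - truth)
```

Because both constraint sets are convex and the true signal lies in their intersection, each iteration is non-expansive toward the solution, so `err_final` is guaranteed not to exceed `err_init`; in the noisy case this convergence behavior degrades, which motivates the modified algorithms the paper considers.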