Abstract
Single-layer diffractive optical elements (SLDOEs) have advantages in terms of configuration, fabrication, range of angles, and cost; however, their diffraction efficiency decreases sharply as the wavelength deviates from the design wavelength, especially in dual-waveband imaging, causing apparent image blur. We propose a point spread function (PSF) model that accounts for the diffraction efficiency, together with a method of restoring the blurred image, to improve imaging performance in dual-waveband infrared systems with an SLDOE. A design example of a cooled MWIR and LWIR system is then presented, and imaging simulations with different levels of noise, followed by restoration, are conducted. The results reveal that the model can significantly reduce the image blur caused by the decreased diffraction efficiency.
© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
1. Introduction
Diffractive optical elements (DOEs) play a significant role in some imaging optical systems because of their unique dispersion property [1]. In dual-waveband infrared systems, the optical components must be made into complex structures to correct chromatic aberrations because of the lack of available materials. The introduction of DOEs can correct chromatic and some other aberrations, simplify the system structure, and reduce the system weight [2–4]. Single-layer diffractive optical elements (SLDOEs) have advantages in terms of configuration, fabrication, range of angles, and cost. For conventional SLDOEs, however, the diffraction efficiency decreases sharply when the wavelength deviates from the design wavelength, and images formed with decreased diffraction efficiency are blurred. Thus, conventional SLDOEs cannot generally be applied to dual-waveband imaging.
The harmonic diffractive element (HDE) proposed in 1995 can obtain the same optical focus at each wavelength in a series of separated wavelengths [5]. Although it can be used in multi-spectral infrared optical systems, its diffraction efficiency is low. In 2005, double-layer diffractive optical elements (DLDOEs) were proposed to improve the efficiency over a wide waveband [6]. DLDOEs were designed and fabricated for dual-waveband systems, achieving high diffraction efficiency. In 2010, an optimal design of a multilayer diffractive optical element (MLDOE) for dual wavebands was proposed [7]. The MLDOEs designed using this method achieve high diffraction efficiency in both wavebands. However, the structures and fabrication processes of both DLDOEs and MLDOEs are more complex than those of SLDOEs. Additionally, in comparison with SLDOEs, DLDOEs and MLDOEs have a smaller range of angles. In 2015, the use of lightweight diffractive–refractive optics for computational imaging was proposed to obtain better imaging results [8]. However, that work studied only aberration correction using a thin diffractive element and did not examine diffraction efficiency. In 2016, full-spectrum computational imaging with diffractive optics was proposed [9]. The height profile of a DOE was optimized so that different wavelengths yield nearly identical point spread functions (PSFs), and image restoration algorithms were then used to restore the blurred images. However, that method applies to specially designed DOEs and to the visible spectrum.
In this paper, we propose an image restoration method to improve imaging performance in dual-waveband infrared systems with an SLDOE. A model of the PSF affected by the diffraction efficiency is constructed, hereinafter referred to as the PSF model. For a wavelength deviating from the design wavelength, this model is established by analyzing the distribution of diffraction energy over all orders. A method of restoring images at wavelengths deviating from the design wavelength is proposed to reduce the influence of low diffraction efficiency. Following this method, a cooled infrared dual-waveband imaging system with an SLDOE is designed that images one waveband directly with the detector and the other waveband via image restoration, reducing the impact of low diffraction efficiency while avoiding the disadvantages of HDEs and MLDOEs.
2. Method
2.1 Design approach
Figure 1 describes the basic principle of this approach. Light from the scene enters the specially designed dual-waveband infrared optical system with an SLDOE and is imaged on a dual-waveband detector. The SLDOE design wavelength of this system lies in the medium waveband. Because of the high diffraction efficiency in the medium waveband, the image formed on the detector meets the imaging requirements [10]. In the long waveband, however, the images obtained with the detector are blurred because of the low diffraction efficiency: when the light passes through the SLDOE, multiple diffraction orders are generated. A DOE diffracts light deviating from the design wavelength into multiple orders, as shown in Fig. 2(a). The DOE is used to correct chromatic aberrations and thus usually carries only a small focal power in the system, and in general only the first-order diffraction is considered during design. For light deviating from the design wavelength, only the first-order diffracted light is focused on the image surface; the light of the other orders forms blurred spots on the image surface, which reduces the contrast and resolution of the image, as shown in Fig. 2(b). Therefore, the long-wave blurred image needs to be restored, which can be done through the application of the model presented herein. Using this model, relatively clear images are obtained, so that the system can meet the imaging requirements in both wavebands.
2.2 Image restoration approach
It is considered that the output image is the convolution of the input image with an operator, in the presence of noise; the PSF is one such operator. This is presented as follows [11]:

g(x, y) = h(x, y) ∗ f(x, y) + n(x, y),    (1)

where f is the input image, h is the PSF, n is additive noise, and ∗ denotes convolution.
When the image contrast is higher than the noise, the PSF can be used to restore the image. The PSF of an optical system can be obtained from optical design software such as Zemax or Code V. However, for DOEs, optical design software ignores the influence of the diffraction efficiency and assumes that the diffraction efficiency at the given diffraction order is 100% for every wavelength. It is therefore necessary to calculate and construct a PSF model that includes the diffraction efficiency for the refractive-diffractive imaging system.
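The degradation model of Eq. (1) can be sketched numerically; the toy scene, the uniform PSF, and the noise level below are invented for illustration (the paper's own simulations use Matlab, whereas this sketch uses NumPy):

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)

# Toy scene: a bright bar on a dark background (hypothetical target).
scene = np.zeros((64, 64))
scene[28:36, 16:48] = 1.0

# Toy PSF: normalized 5x5 uniform blur, a stand-in for the system PSF.
psf = np.ones((5, 5))
psf /= psf.sum()

# g = h * f + n : convolve the scene with the PSF and add Gaussian noise.
blurred = fftconvolve(scene, psf, mode="same")
noisy = blurred + rng.normal(0.0, 0.001, scene.shape)
```

Because the PSF is normalized to unit energy, the convolution redistributes the scene energy without changing its total, which is the property the later restoration step relies on.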
Figure 3 describes the three steps involved in the construction of the model. The first step is to construct the energy distribution of a certain order m of a single feature wavelength λ, given the different diffraction efficiencies at the different diffraction orders. Only the first-order diffraction is focused on the image surface. The diffraction efficiency is in fact the normalized diffraction energy. Assuming that the energy of the incident light wave is 100% at a single feature wavelength, the energy of the imaging point on the image surface equals the first-order diffraction efficiency η₁(λ), whereas the remaining orders at this wavelength have different focal lengths. Thus, a stray order forms a blurred spot of radius r on the image surface, which is converted into a number of pixels N on the detector. The energy in the blurred spot is the m-th-order diffraction efficiency η_m(λ), and this energy is distributed evenly over the spot. Thus, the energy distribution of a certain order m of the wavelength λ can be obtained.
The second step is to obtain the PSF of a single feature wavelength, PSF_λ. According to the first step of the method, the energy distribution of every analyzed order of the feature wavelength is obtained. These energies are accumulated to obtain PSF_λ as follows:

PSF_λ = Σ_m E_m(λ),    (2)

where E_m(λ) is the energy-distribution matrix of order m at wavelength λ.
The third step is to discretize the analysis waveband according to the wavelength interval Δλ and the detector pixel size S to select the feature wavelengths. Applying the first and second steps to each feature wavelength yields PSF_λ for all feature wavelengths at all analyzed orders. Then, the weight coefficient ω(λ_i) of each feature wavelength is obtained. The PSF_λ of the different feature wavelengths in the analysis waveband are accumulated according to their weights and normalized to obtain the PSF model as follows:

PSF = (Σ_i ω(λ_i) PSF_{λ_i}) / ‖Σ_i ω(λ_i) PSF_{λ_i}‖,    (3)

where ‖·‖ denotes the total energy (the sum of all matrix elements), so that the resulting PSF is normalized to unit energy.
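The weighted accumulation and normalization in the third step can be sketched as follows; the two per-wavelength PSFs and their weights are hypothetical placeholders, not values from the design example:

```python
import numpy as np

def accumulate_psf(per_wavelength_psfs, weights):
    """Weighted sum of per-wavelength PSFs, normalized to unit energy."""
    total = sum(w * p for w, p in zip(weights, per_wavelength_psfs))
    return total / total.sum()

# Two hypothetical feature-wavelength PSFs on a 7x7 pixel grid.
psf_a = np.zeros((7, 7))
psf_a[3, 3] = 1.0                    # sharp: energy in the central pixel
psf_b = np.full((7, 7), 1.0 / 49.0)  # spread: stray-order energy over the grid
psf = accumulate_psf([psf_a, psf_b], [0.6, 0.4])
```

The normalization makes the model usable directly as a deconvolution kernel, since restoration algorithms expect a PSF of unit total energy.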
3. PSF model construction
The following subsections determine the parameters used in the model-construction steps.
3.1 Diffraction efficiency
The diffraction efficiency of an SLDOE can be expressed as follows [12]:

η_m(λ) = sinc²(λ₀/λ − m),    (4)

where m is the diffraction order, λ is the wavelength, and λ₀ is the design wavelength.

3.2 Blurred spot radius affected by the location of the stop
The blurred spot radius is mainly related to the focal length of the SLDOE, which is determined by the phase delay, expressed as follows [12]:

φ(r) = A r²,    (5)

where A is a coefficient that also serves as the SLDOE phase optimization variable, and r is the radial coordinate on the SLDOE. The SLDOE focal length is then expressed as

f(m, λ) = −π / (m A λ).    (6)
A change in the order m or the wavelength λ causes the focal length of the SLDOE to change. However, the focal length of the front optical system is fixed, so the overall focal length of the system and the system's back focal length change together. Through ray tracing, the back focal length l′ at the design focal length can be obtained. Owing to a change in either the order or the wavelength, the back focal length changes to l′_{m,λ}.
Infrared detectors are divided into cooled and uncooled detectors, and systems are correspondingly classified as cooled or uncooled infrared systems. Cooled infrared detectors must be placed in Dewars to reduce the effect of stray light from the optical system and to achieve 100% cold-stop efficiency. The stop is usually placed in the Dewar; in particular, it is placed at the back of the system to act as a cold stop. The cold stop also blocks part of the light diffracted into stray orders. At the same time, the last lens has different effective apertures at different focal points so that all chief rays across the field of view pass through the center of the cold stop; this requires a corresponding aperture weight ω. An uncooled infrared system usually does not need a rear-mounted stop, so the influence of the stop on the stray-order light is not considered and the aperture weight is 1. The distribution of the stray-order light therefore needs to be analyzed separately for the general case and for the rear-mounted (cold) stop, as shown in Fig. 4.
In the general case, a ray is traced from the new focal point after the focal-length change to the edge of the last lens, of radius R_L. The blurred spot radius of the diffracted stray light on the image surface is then given as follows:

r_{m,λ} = R_L |l′_{m,λ} − l′| / l′_{m,λ},    (7)

where l′ and l′_{m,λ} are the design and shifted back focal lengths, respectively.
For the rear-mounted stop, the radius of the cold stop is R_s, and the distance from the last lens of the system to the cold stop is d. Again taking the new focal point after the focal-length change as the base point, a ray is generated that passes through the aperture edge of the cold stop and intersects the diffraction surface at a point. In this scenario, the blurred spot radius formed by the diffracted stray light on the image surface is given as follows:

r_{m,λ} = R_s |l′_{m,λ} − l′| / (l′_{m,λ} − d).    (8)
Aperture weight ω: because of the rear-mounted stop, the cold stop blocks the stray orders of light at different wavelengths, so the last lens has different effective clear apertures. Assuming that the incident energy is 100%, the total light energy of a stray order is the diffraction efficiency multiplied by the aperture weight ω. The clear-aperture radius of the analyzed order and wavelength at the last lens can be obtained as follows:

R_{m,λ} = R_s l′_{m,λ} / (l′_{m,λ} − d).    (9)
Because the optical system has a circular aperture, the aperture weight is given by the ratio of the aperture areas:

ω_{m,λ} = (R_{m,λ} / R_L)².    (10)
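Under a similar-triangle reading of the geometry described above, the blur radius and aperture weight for a rear (cold) stop can be sketched as below; the function names and all numerical values are illustrative assumptions, not the paper's design data:

```python
def blur_radius_cold_stop(r_stop, d_stop, bfl_design, bfl_new):
    """Blurred-spot radius on the image plane for a rear (cold) stop:
    the cone converging to the shifted focus is limited by the stop
    aperture (similar triangles between stop plane and image plane)."""
    return r_stop * abs(bfl_new - bfl_design) / (bfl_new - d_stop)

def aperture_weight(r_stop, d_stop, bfl_new, r_last_lens):
    """Fraction of the beam area passed by the cold stop for a stray
    order; circular apertures give a ratio of areas, capped at 1."""
    r_eff = r_stop * bfl_new / (bfl_new - d_stop)  # effective radius at last lens
    return min(1.0, (r_eff / r_last_lens) ** 2)

# Hypothetical values in mm: stop radius 5, stop 20 behind the last lens,
# design back focal length 40, zeroth-order focus shifted to 55.
r_blur = blur_radius_cold_stop(5.0, 20.0, 40.0, 55.0)
w = aperture_weight(5.0, 20.0, 55.0, 12.0)
```

All distances are measured from the last lens; the cap at 1 reflects that a stray order cannot pass more energy than fills the full beam aperture.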
Assuming that the pixel size of the detector is S, the number of pixels N covered by the blurred radius on the detector is

N = [r_{m,λ} / S],    (11)

where [·] represents rounding.

3.3 Energy distribution construction
Because the optical system has a rotationally symmetric circular aperture, when the field of view is not large, each field point is imaged as a circular blurred spot on the image surface, and the energy fraction per pixel can be given as follows:

e_{m,λ} = η_m(λ) ω_{m,λ} / (π N²),    (12)

where η_m(λ) ω_{m,λ} is the total stray-order energy passed by the stop and πN² approximates the number of pixels in the blurred spot.

The energy distribution E_m(λ) can be expressed as a matrix. The maximum size of the matrix corresponds to the size of the largest blurred spot. The blurred spot is represented in the matrix as an approximately circular region of radius N (in pixels); each value inside the circle is e_{m,λ}, and the rest of the matrix is 0. The matrix is expressed as:

E_m(λ)_{ij} = e_{m,λ} if i² + j² ≤ N² (indices measured from the matrix center), and 0 otherwise.    (13)

For the first-order diffraction, the central element of the matrix equals the first-order diffraction efficiency η₁(λ) of the wavelength, and the rest of the matrix is 0. This matrix is expressed as follows:

E₁(λ)_{ij} = η₁(λ) at the matrix center, and 0 otherwise.    (14)
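The two matrix types and their accumulation for one feature wavelength can be sketched as follows; the efficiency values (η₁ = 0.40, η₀ = 0.44) and the 8-pixel spot radius are illustrative assumptions rather than values from the design example:

```python
import numpy as np

def stray_order_matrix(size, n_pixels, eta):
    """Disk of radius n_pixels carrying total energy eta, spread
    uniformly over the pixels inside the disk."""
    c = size // 2
    yy, xx = np.mgrid[:size, :size]
    disk = (yy - c) ** 2 + (xx - c) ** 2 <= n_pixels ** 2
    m = np.zeros((size, size))
    m[disk] = eta / disk.sum()
    return m

def first_order_matrix(size, eta):
    """All first-order energy concentrated in the central pixel."""
    m = np.zeros((size, size))
    m[size // 2, size // 2] = eta
    return m

# Hypothetical single-wavelength accumulation of first and zeroth orders.
psf_lambda = first_order_matrix(21, 0.40) + stray_order_matrix(21, 8, 0.44)
```

Dividing by the exact pixel count of the disk (rather than πN²) keeps the matrix energy exactly equal to the order's efficiency, which simplifies the later normalization.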
3.4 Feature wavelength
The second step is to repeat the first step and calculate PSF_λ at each feature wavelength. This requires selecting feature wavelengths in a manner related to the pixel size of the detector. When a change in wavelength changes the blurred spot by one pixel on the detector, all wavelengths within that interval are considered to have the same PSF. When the wavelength λ increases by Δλ, N increases by 1, which is expressed as follows:

N(λ + Δλ) = N(λ) + 1,    (15)

where Δλ is the wavelength interval.

3.5 Wavelength weight
The third step involves performing a weighted superposition and normalization of the results obtained in the previous two steps, for which the weight coefficient of each wavelength must be calculated. For an infrared optical system, the infrared detector responds to the radiant flux M(λ, T), which can be obtained from Planck's formula for blackbody radiation:

M(λ, T) = c₁ / {λ⁵ [exp(c₂/(λT)) − 1]},    (16)

where T is the temperature, c₁ is the first radiation constant, and c₂ is the second radiation constant. At a certain temperature, different wavelengths in a given waveband have different radiant fluxes, and the weight of each feature wavelength is defined by the contribution of that discrete wavelength to the overall waveband interval, i.e.,

ω(λ_i) = M(λ_i, T) / Σ_j M(λ_j, T).    (17)
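The wavelength weights follow directly from Planck's formula; the sketch below uses an illustrative uniform sampling of the 7.7–10 µm band at 300 K rather than the actual feature-wavelength segmentation of the design example:

```python
import numpy as np

C1 = 3.7418e-16  # first radiation constant, W*m^2
C2 = 1.4388e-2   # second radiation constant, m*K

def planck_exitance(lam, T):
    """Blackbody spectral exitance M(lambda, T) in W/m^3."""
    return C1 / (lam ** 5 * (np.exp(C2 / (lam * T)) - 1.0))

# Illustrative feature wavelengths across the 7.7-10 um band.
lams = np.linspace(7.7e-6, 10e-6, 10)
flux = planck_exitance(lams, 300.0)
weights = flux / flux.sum()
```

At 300 K the Wien peak lies near 9.7 µm, so within this band the weights grow toward the long-wavelength end, matching the intuition that the detector receives more flux there.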
4. Design example
4.1 Optical system design
We designed a cooled MWIR and LWIR system with a working temperature of 300 K and working wavebands ranging from 3.7 to 4.8 µm and from 7.7 to 10 µm. The system is based on a cooled infrared detector with a pixel size of 30 µm and an array size of 320 × 256 pixels. The initial configuration adopts the infrared objective lens system of [13]. The system-specific parameters are listed in Table 1.
The optical system is optimized for medium and long wavelengths. After optimization, the optical system includes a four-lens design with one piece of Ge, one piece of ZnS, and two pieces of ZnSe. The stop is located behind the optical system, with the last surface being the diffraction surface and the rest being spherical surfaces as shown in Fig. 5.
After optimization, the MTFs of the system are as shown in Fig. 6. The MTFs of the MWIR are higher than 0.77 at 17 lp/mm, and the MTFs of the LWIR are higher than 0.54 at 17 lp/mm.
4.2 PSF model construction
The base material of the diffraction surface is Ge, and the design wavelength is 4.2 µm. The diffraction efficiency is calculated using Eq. (4), as shown in Fig. 7. From Fig. 7, it can be deduced that the first-order diffraction efficiency in the 3.7–4.8 µm waveband is higher than 94%, and thus good image quality can be achieved. The average first-order diffraction efficiency in the 7.7–10 µm waveband is only ~37.548%; this efficiency is too low, which reduces the image contrast and resolution and renders this waveband unusable for direct imaging. In the long waveband, apart from the first-order (design-order) diffraction, the zeroth order has the highest diffraction efficiency; the efficiency of every other order is less than 4.3%. To simplify the calculation, only the effect of zeroth-order diffraction on imaging is discussed.
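The efficiency figures quoted above can be checked against the scalar sinc² expression of Eq. (4); this sketch simply averages the first- and zeroth-order efficiencies over a dense uniform sampling of the 7.7–10 µm band:

```python
import numpy as np

def eta(m, lam, lam0=4.2e-6):
    """Scalar diffraction efficiency of an SLDOE at order m;
    np.sinc(x) is the normalized sinc, sin(pi x)/(pi x)."""
    return np.sinc(lam0 / lam - m) ** 2

lams = np.linspace(7.7e-6, 10e-6, 2001)
avg_first = eta(1, lams).mean()   # ~0.37 over the 7.7-10 um band
avg_zeroth = eta(0, lams).mean()  # ~0.44 over the same band
```

A uniform average reproduces the reported ~37.5% (first order) and ~43.9% (zeroth order) to within a few tenths of a percent; a weighted average over the feature wavelengths would shift these values only slightly.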
At zeroth-order diffraction, the diffractive element provides no optical power. In this case, the back focal length of the optical system is obtained by ray tracing, and the blurred radius of each wavelength at zeroth-order diffraction is obtained from Eq. (8). According to the segmentation principle, the 7.7–10 µm waveband is segmented to obtain a number of feature wavelengths. Equation (10) is used to obtain the aperture weight for each wavelength, and Eq. (12) is used to obtain the intensity at each pixel of the detector. Additionally, the weights of the segmented wavelengths are determined by Eq. (17), where the target temperature is assumed to be 300 K. The actual weight values are very small and thus inconvenient for calculation; because the weights enter only as proportionality coefficients, scaling them does not affect the result, and the weight coefficients are therefore enlarged by a constant factor to facilitate subsequent calculations. The results are summarized in Table 2.
The PSFs at the discrete wavelengths are superposed and normalized according to Eq. (3) to obtain the overall PSF, as shown in Fig. 8(a). It can be clearly seen that the first-order diffraction energy is higher than that of the other diffraction orders. Figure 8(b) is a partial enlargement of Fig. 8(a), in which it can be observed that the effects of different wavelengths at the same order differ.
5. Results and discussion
The PSF model of the infrared dual-waveband optical system is constructed through the preceding analysis and example. In the example, the average first-order diffraction efficiency in the long waveband is 37.548% and the average zeroth-order diffraction efficiency is 43.931%, so the average diffraction efficiency of the stray order is higher than that of the imaging order. However, the stray-order light is spread over a large blurred spot: by calculation, the normalized average pixel energy of the zeroth order within the spot is 8.1633e-04. The optical system is optimized and its performance is close to the diffraction limit; thus, the first-order diffracted light is focused into one pixel with an energy of 0.5145, which is 630.26 times the single-pixel energy of the zeroth-order diffraction. This means that although the first-order diffraction efficiency is low when the SLDOE works in the long waveband, the image can still maintain a certain contrast, which makes subsequent restoration possible.
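The quoted single-pixel contrast follows directly from the two reported energies:

```python
# Values reported in the text: normalized first-order single-pixel energy
# and normalized zeroth-order average pixel energy within the blurred spot.
first_order_pixel = 0.5145
zeroth_order_pixel = 8.1633e-4

contrast_ratio = first_order_pixel / zeroth_order_pixel  # ~630
```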
The imaging process is affected by various noise sources. Although many factors produce noise, the noise can be classified by distribution as Gaussian, Rayleigh, or salt-and-pepper noise. Gaussian noise is the main noise in the imaging process [11]; thus, we add two Gaussian noises, each with a mean of 0 and with variances of 0.001 and 0.003, to the simulation results.
This paper uses Matlab to simulate imaging in the 7.7–10 µm waveband and uses the model constructed herein to restore the images through the Richardson-Lucy algorithm, which accounts for the effects of noise [14]. The results are divided into outdoor and indoor scenes, as shown in Fig. 9.
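A minimal NumPy sketch of the basic Richardson-Lucy iteration is given below; the paper's Matlab implementation additionally accounts for noise, which this plain version omits, and the test scene and PSF are invented for illustration:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30):
    """Basic Richardson-Lucy deconvolution (multiplicative updates,
    no explicit noise regularization)."""
    psf = psf / psf.sum()
    psf_flip = psf[::-1, ::-1]
    estimate = np.full_like(image, 0.5)
    for _ in range(n_iter):
        denom = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(denom, 1e-12)
        estimate *= fftconvolve(ratio, psf_flip, mode="same")
    return estimate

# Synthetic check: blur a bar target with a uniform PSF, then deconvolve.
scene = np.zeros((64, 64))
scene[28:36, 16:48] = 1.0
psf = np.ones((7, 7)) / 49.0
blurred = fftconvolve(scene, psf, mode="same")
restored = richardson_lucy(blurred, psf)
```

In the noiseless case the iteration steadily sharpens the blurred bar toward the original target; with real noise, the iteration count must be limited or a noise-aware variant used, which matches the paper's observation that detail recovery fails when noise exceeds the image contrast.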
Using the method proposed herein, images can be improved after being analyzed and restored, and the final imaging results need to be assessed to determine the change in quality. Assessment methods are mainly subjective or objective: subjective assessment relies primarily on the human eye, whereas objective assessment requires assessment functions. This study employs Gray Mean Grads (GMG) [15], Laplacian Sum (LS) [15], and the two-step framework for constructing Blind Image Quality Indices (BIQI) [16] as the image quality assessment functions. The higher the values of the GMG and LS functions, the better the image quality; on the contrary, the lower the BIQI value, the better the image quality. These assessment functions are used to compare the blurred and restored images of Scenes 1 and 2, as presented in Tables 3 and 4.
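GMG and LS can be sketched as gradient- and Laplacian-based sharpness measures; the formulations below are common ones and may differ in detail from the exact definitions in [15], and the stripe pattern used for the check is hypothetical:

```python
import numpy as np
from scipy.signal import fftconvolve

def gray_mean_grads(img):
    """Mean gradient magnitude over the image (one common GMG form)."""
    dx = np.diff(img, axis=1)[:-1, :]
    dy = np.diff(img, axis=0)[:, :-1]
    return np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0))

def laplacian_sum(img):
    """Total absolute response of the 4-neighbor Laplacian."""
    lap = (4.0 * img[1:-1, 1:-1] - img[:-2, 1:-1] - img[2:, 1:-1]
           - img[1:-1, :-2] - img[1:-1, 2:])
    return np.abs(lap).sum()

# Sharp stripes versus their box-blurred version: both metrics
# should rank the sharp pattern higher.
sharp = np.zeros((32, 32))
sharp[:, ::2] = 1.0
blurred = fftconvolve(sharp, np.ones((3, 3)) / 9.0, mode="same")
```

Both measures reward high-frequency content, which is why they increase after successful restoration in Tables 3 and 4.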
Through an objective analysis, it is evident from the data in Tables 3 and 4 that the GMG and LS values are smaller for the blurred image than for the restored image in the different scenes, whereas the BIQI value is larger for the blurred image than for the restored image. This objectively confirms that low diffraction efficiency does cause a decrease in image quality and that the proposed method can improve the image quality.
Through a subjective analysis of Figs. 9(a-1), 9(b-1), 9(c-1), 9(d-1), 9(e-1) and 9(f-1), we can observe that when the scenes pass through this optical system, the considerably low diffraction efficiency in the long waveband results in blurred images. After analysis and restoration, the images without added noise are clearly resolved, as shown in Figs. 9(a-2) and 9(d-2); in the local enlargements, the originally blurred lines show a marked increase in contrast after processing, and the high-frequency parts are strengthened. The clarity of the restored images affected by Gaussian noise with a variance of 0.001 is also noticeably better, as shown in Figs. 9(b-2) and 9(e-2); comparing their partial enlargements, more detailed lines are evidently restored and the image quality is improved. The outlines of the restored images affected by Gaussian noise with a variance of 0.003 are clearer overall, as shown in Figs. 9(c-2) and 9(f-2); however, their partial enlargements show that the detailed lines cannot be restored in the presence of this noise, mainly because the noise is higher than the contrast of the image.
The method proposed in this paper can evidently improve the image quality and restore fine image details when the contrast is higher than the noise; that is, it compensates for the low first-order diffraction efficiency of the system and solves the problem of blurred images caused by low diffraction efficiency.
6. Conclusion
This paper analyzed the reasons behind the decrease in the image quality of SLDOEs when the wavelength deviates from the design wavelength. According to the position of the stop, the diffraction energy distribution of each wavelength and order, and the pixel size of the detector, the feature wavelengths were selected. The energies of all orders and feature wavelengths were calculated, accumulated, and normalized according to their weights to construct the PSF model. This model was used to restore images at wavelengths deviating from the design wavelength to improve the image quality. A dual-waveband infrared optical system with an SLDOE was designed; the design wavelength of the SLDOE was 4.2 µm, which yields high diffraction efficiency in the medium waveband and ensures that the image-quality requirement is satisfied there. In the long waveband, the proposed method was used to restore images with different levels of noise to satisfy the imaging requirements. The results revealed that, when the contrast was higher than the noise, the quality of the restored image was better than that of the blurred image according to subjective analysis as well as the GMG, LS, and BIQI assessment functions. Thus, the SLDOE can be applied to the dual-waveband infrared optical system without being limited by the diffraction efficiency.
Funding
China Government (51-H34D01-8358-13/16).
References and links
1. G. J. Swanson, “Binary optics technology: the theory and design of multilevel diffractive optical elements,” MIT Lincoln Laboratory Rep. 854 (MIT, Cambridge, Mass., 1989).
2. M. J. Riedl, “Design example for the use of hybrid optical elements in the infrared,” Appl. Opt. 35(34), 6833–6834 (1996).
3. N. Davidson, A. A. Friesem, and E. Hasman, “Analytic design of hybrid diffractive-refractive achromats,” Appl. Opt. 32(25), 4770–4774 (1993).
4. D. Elkind, Z. Zalevsky, U. Levy, and D. Mendlovic, “Optical transfer function shaping and depth of focus by using a phase only filter,” Appl. Opt. 42(11), 1925–1931 (2003).
5. D. W. Sweeney and G. E. Sommargren, “Harmonic diffractive lenses,” Appl. Opt. 34(14), 2469–2475 (1995).
6. A. Wood, M.-S. L. Lee, and S. Cassette, “Infrared hybrid optics with high broadband efficiency,” Proc. SPIE 5874, 58740G (2005).
7. C. Xue, Q. Cui, T. Liu, L. Yang, and B. Fei, “Optimal design of a multilayer diffractive optical element for dual wavebands,” Opt. Lett. 35(24), 4157–4159 (2010).
8. Y. Peng, Q. Fu, H. Amata, S. Su, F. Heide, and W. Heidrich, “Computational imaging using lightweight diffractive-refractive optics,” Opt. Express 23(24), 31393–31407 (2015).
9. Y. Peng, Q. Fu, F. Heide, and W. Heidrich, “The diffractive achromat: full spectrum computational imaging with diffractive optics,” ACM Trans. Graph. 35(4), 31 (2016).
10. M. J. Riedl, Optical Design Fundamentals for Infrared Systems (SPIE, 2001).
11. R. C. Gonzalez and R. E. Woods, Digital Image Processing (Pearson Education, 2008).
12. D. C. O’Shea, T. J. Suleski, and A. D. Kathman, Diffractive Optics: Design, Fabrication, and Test (SPIE, 2004).
13. I. A. Neil, “Infrared objective lens systems,” U.S. Patent 4,505,535 (1985).
14. D. Fish, A. Brinicombe, E. Pike, and J. Walker, “Blind deconvolution by means of the Richardson–Lucy algorithm,” J. Opt. Soc. Am. A 12(1), 58–65 (1995).
15. Y. Zeng, J. Lan, B. Ran, Q. Wang, and J. Gao, “Restoration of motion-blurred image based on border deformation detection: a traffic sign restoration model,” PLoS One 10(4), e0120885 (2015).
16. A. K. Moorthy and A. C. Bovik, “A two-step framework for constructing blind image quality indices,” IEEE Signal Process. Lett. 17(5), 513–516 (2010).