Abstract
A coherent-dispersion stereo-imaging spectrometer is presented, which combines three-view stereo imaging, interferometric spectroscopy and dispersive spectroscopy. Three area-array detectors record three spectral images of each scene unit from three views. For each of the three views, each scene unit is imaged on a given column of one area-array detector, and different wavelengths are dispersed across different rows of that column. For each scene unit, multiple interferograms are simultaneously generated at each view, each interferogram covering a separate wavelength range and located in a separate pixel. The orthographic view image is used to create a two-dimensional orthophoto image. The front view and back view images are used to reconstruct the three-dimensional stereoscopic image. Preliminary theoretical calculations are given. The instrument is a unique concept for obtaining three-dimensional spatial information and one-dimensional spectral information while achieving high spectral resolution measurement over an ultraviolet-visible broadband spectral range (e.g., 0.05 nm at 450 nm together with 0.1 nm at 700 nm). It will be suitable for ultraviolet-visible hyperspectral remote sensing.
© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
1. Introduction
All types of existing imaging spectrometers acquire two-dimensional spatial information and one-dimensional spectral information (i.e., a three-dimensional data cube) of an object or a scene [1–3]. The spectral image can be obtained by the following methods: (1) Wavelength-scan methods that measure the images one wavelength at a time. A typical representative of this method is the color filter imaging spectrometer, which mainly uses either a circular-variable filter [4], a liquid-crystal tunable filter [5–8] or an acousto-optical tunable filter [9–15] to capture a full spectral image by measuring one image at a time, each time at a different wavelength. (2) Spatial-scan methods that measure the whole spectrum of a portion of the image at a time and scan the image (e.g., line by line). A typical representative of this method is the dispersive imaging spectrometer, which uses a prism, a grating, or both to disperse the light and collects the entire spectral image by measuring only one line of the object at a time and scanning the object line by line (e.g., whisk-broom scanning [16], push-broom scanning [17]). (3) Time-scan methods that measure a set of images, each of which is a superposition of spectral or spatial image information, and therefore require a mathematical transformation of the acquired data to derive the spectral image (e.g., by Fourier transform or Hadamard transform [18]). A typical representative of this method is the interferometric imaging spectrometer (i.e., the Fourier transform imaging spectrometer), which is based on temporal interferometry or spatial interferometry [19–34]. (4) Compromise methods that measure the whole spectral image simultaneously, but compromise on the number of points in the spectrum, the field of view (FOV) or the spatial resolution. Typical representatives of this method are the computed-tomography imaging spectrometer [35–41] and snapshot spectral imagers [42–49].
(5) Spatial-time joint-scan methods that use a time-scan method to measure the spectral image of each line of the object and also use the spatial-scan method to obtain the entire spectral image by scanning the object line by line (i.e., push-broom scanning). A typical representative of this method is the coherent-dispersion imaging spectrometer [50].
When measuring an infrared broad-band source or an ultraviolet-visible narrow-band source, interferometric spectrometry (especially temporal interferometry based on linear scanning, such as a Michelson-type interferometer) can perform high spectral resolution measurements. Nevertheless, when measuring an ultraviolet-visible broad-band source, the performance of interferometric spectrometry is weakened [27,51–60]. That is, interferometric spectrometry is only suitable for low and medium spectral resolution measurements of a broad-band source in the ultraviolet-visible spectral region. In contrast, coherent-dispersion spectrometry is an effective and unique method for performing high spectral resolution measurements of a broadband source in the ultraviolet-visible spectral region [50,59,60].
On the other hand, three-dimensional imaging technologies can potentially capture the three-dimensional structure, range, and texture information of an object. According to the characteristics of the reconstructed images, three-dimensional imaging technologies can be roughly classified into four categories: binocular, multi-view, volumetric imaging, and spatial imaging [61,62]. Typical representatives of spatial imaging are holographic imaging and integral imaging [63–68]. However, all existing imaging spectrometers are based on two-dimensional imaging technologies.
This paper presents a coherent-dispersion stereo-imaging spectrometer, which can not only obtain a three-dimensional stereoscopic image, but also perform high spectral resolution measurements of a broadband source in the ultraviolet-visible spectral region. After a detailed description of the principle, preliminary numerical calculations are illustrated by an example for the spectral range from 400 nm to 700 nm. Finally, the conclusion is given.
2. Principle
Figure 1 shows the equivalent orthographic view for the optics of the three-area-array coherent-dispersion stereo-imaging spectrometer (CDSIS), and Fig. 2 shows the equivalent light path diagram of the CDSIS. The CDSIS consists of a moving corner-cube-mirror interferometer, a plane transmission grating, an objective lens, a collimating lens, a collecting lens, three entrance slits, and three area-array detectors. The moving corner-cube-mirror interferometer comprises one moving corner-cube mirror (CCM), one fixed CCM, and one beam splitter. The optical path difference is created by the linear scanning of the moving CCM driven by a linear actuator. The interferometer has neither tilt nor shearing problems. The back focal plane of the objective lens is coincident with the front focal plane of the collimating lens. Three entrance slits (S1, S2 and S3) are located at the front focal plane of the collimating lens (i.e., back focal plane of the objective lens) and are parallel to each other. The entrance slit S2 is located in the center of the field of view of the optical system, so it intersects the optical axis of the optical system. The entrance slits S1 and S3 are symmetric about the entrance slit S2. The three entrance slits are perpendicular to the flight direction of the instrument platform (e.g., satellite), but parallel to the linear scanning direction of the moving CCM. Three area-array detectors (A, B and C) are located at the back focal plane of the collecting lens. The area-array detector B is located in the center of the field of view of the optical system, so it receives the orthographic view light passing through the entrance slit S2. The area-array detector A receives the front view light passing through the entrance slit S1. The area-array detector C receives the back view light passing through the entrance slit S3. 
In one scan period of the moving CCM, the CDSIS simultaneously records the spectral images of three scan lines on the scene: the detector A records the front view spectral image of one scan line on the scene, the detector B records the orthographic view spectral image of another scan line on the scene, and the detector C records the back view spectral image of another scan line on the scene. When the CDSIS is spatially scanned perpendicular to the entrance slits: the detector A records the front view consecutive spectral image of the scene, the detector B records the orthographic view consecutive spectral image of the scene, and the detector C records the back view consecutive spectral image of the scene. Namely, the CDSIS records three consecutive spectral images of front view, orthographic view and back view of the scene when it is spatially scanned perpendicular to the entrance slits.
The CDSIS combines interferometric spectroscopy with dispersive spectroscopy to obtain the spectral information of an object or a scene, integrating a moving corner-cube-mirror interferometer and a plane transmission grating. For each unit of the scene, the CDSIS simultaneously produces multiple interferograms in one scan period of the moving CCM, each interferogram with a separate wavelength range and located in a separate pixel of a given column of one area-array detector. More specifically, for each scene unit, the transmission grating spreads the spectral information onto a given column of one area-array detector, while the interferometer simultaneously generates multiple interferograms in one scan period of the moving CCM, each interferogram covering a separate wavelength range controlled by the transmission grating and detector geometry and located in a separate pixel of that column. In other words, the transmission grating divides a broad-band spectrum into a number of narrow-band spectra, each narrow-band spectrum located on a separate pixel of a given column of the detector, and the interferometer simultaneously produces multiple interferograms for the broad-band spectrum in one scan period of the moving CCM, each interferogram corresponding to a different narrow-band spectrum. Thus any noise present in an optical signal is limited to the pixels on which this noisy signal impinges, and has no effect on the interferograms of the other narrow-band spectra. Moreover, the center fringes are spread over multiple separate pixels (interferograms), each with a separate narrow-band spectrum, so the dynamic range requirement of the detector is reduced [69,70].
Therefore, the CDSIS has the advantage of providing the high spectral resolution of interferometry, while avoiding the multiplex disadvantage of interferometry for a broadband source in the ultraviolet-visible spectral region, and reducing the dynamic range requirement of the detector. The CDSIS can perform high spectral resolution measurements of a broadband source in the ultraviolet-visible spectral region.
Throughout this paper, rows of each area-array detector are parallel to the y-axis of the detector plane, and columns of each area-array detector are parallel to the x-axis of the detector plane. λ_c is the center wavelength of a source spectrum covering a wavelength range from λ_min to λ_max (i.e., a wavenumber range from σ_min to σ_max, where σ = 1/λ), x is the displacement of the moving CCM from the zero path difference (ZPD) position, f_1 is the focal length of the objective lens, f_2 is the focal length of the collimating lens, and f_3 is the focal length of the collecting lens.
In Fig. 1, typical orthographic-view light rays emerging from two representative object points on a scan line (in the Y-axis direction of the object coordinate system O-XYZ) of the scene are drawn. One object point is located on the optical axis of the objective lens; the second object point is displaced by a distance from the optical axis of the objective lens. Moreover, six representative image points are shown: the image points of the two object points on the back focal plane of the objective lens, and, for each object point, the image points formed on area-array detector B by two representative wavelengths. It can be obtained that
In Fig. 2, typical front-view, orthographic-view and back-view light rays are received by the area-array detectors A, B and C, respectively. For each of the three views, three x-axis coordinates on the detector plane are defined for three representative wavelengths of that view's light. In addition, the x-axis coordinates of the entrance slits S1, S2 and S3 on the back focal plane of the objective lens are defined, together with the angle between the grating normal and the optical axis of the optical system. It can be obtained that
Figure 3 shows the equivalent light path diagram from the transmission grating to the detector plane for the orthographic view light. The grating equation for a plane transmission grating is given by [71]
where m is the diffraction order (an integer), λ is the wavelength of light, d is the groove spacing of the grating, n is the refractive index of the plane transmission grating for wavelength λ, θ_i is the incidence angle measured from the grating normal, and θ_m is the m-order diffraction angle measured from the grating normal for wavelength λ. Let the optical axis of the collecting lens overlap with the first-order diffracted light ray from the grating at the center wavelength of the orthographic-view light; this choice fixes the x-axis coordinate on the detector plane of the center wavelength of the orthographic-view light.
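Since the worked values of the dispersion equations are not reproduced here, the wavelength-to-detector-coordinate mapping can be sketched numerically. The sketch below assumes a grism-type form of the grating equation, n·sin θ_i − sin θ_m = mλ/d, a constant refractive index n ≈ 1.46 (fused silica mid-band), θ_i = 20°, m = 1 and a collecting-lens focal length of 100 mm; all of these numbers are illustrative assumptions, not the paper's design values.

```python
import numpy as np

# Illustrative assumptions only (not the paper's design values):
d_um = 1000.0 / 150                      # groove spacing of a 150 grooves/mm grating [um]
n, theta_i, m = 1.46, np.radians(20.0), 1

def theta_m(lam_um):
    """First-order diffraction angle, assuming the grism-type form
    n*sin(theta_i) - sin(theta_m) = m*lam/d of the grating equation."""
    return np.arcsin(n * np.sin(theta_i) - m * lam_um / d_um)

f3_mm = 100.0                            # assumed collecting-lens focal length [mm]

def x_det(lam_um, lam_c=0.55):
    """Detector-plane x coordinate relative to the assumed 550 nm center [mm]."""
    return f3_mm * (np.tan(theta_m(lam_um)) - np.tan(theta_m(lam_c)))
```

With these numbers, the 400 nm and 700 nm ends of the band land roughly 3 mm on either side of the 550 nm reference, i.e. the broadband spectrum is indeed spread over a few hundred detector rows in one column.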
According to the grating equation, the characteristics of the lens and the geometry, it can be obtained that
From Eqs. (6)-(10), it can be obtained that

where

For the front view light, with the incidence angle measured from the grating normal as shown in Fig. 2, it can be obtained that
where

For the back view light, with the incidence angle measured from the grating normal as shown in Fig. 2, it can be obtained that
where the terms are defined analogously to the orthographic-view case. According to Eq. (2), it can be obtained that

The CDSIS is based on three-view stereo imaging to obtain three-dimensional spatial information (i.e., a three-dimensional stereoscopic image) of a scene or an object. The CDSIS images each scene unit on a separate column of one area-array detector at each of the three views. More specifically, for each of the three views, each scene unit is imaged on a given column of one area-array detector, and different wavelengths are dispersed across different rows of that column. When the CDSIS is spatially scanned perpendicular to the entrance slits, it records three images of the front view, orthographic view and back view of each scene unit. Figure 4 shows the equivalent schematic diagram of the three-view stereo imaging for the CDSIS. The instrument platform (e.g., satellite) flies from left to right along the X-axis direction of the object coordinate system O-XYZ. At time t_1, an object point on a scan line (i.e., the green thick line parallel to the Y-axis direction in Fig. 4) of the scene is imaged on the entrance slit S1, giving the front-view image point. At time t_2, the same object point is imaged on the entrance slit S2, giving the orthographic-view image point. At time t_3, the same object point is imaged on the entrance slit S3, giving the back-view image point. With the flight of the instrument platform (e.g., satellite), three consecutive two-dimensional track images of the front view, orthographic view and back view of the scene are recorded, respectively, by the three area-array detectors (A, B and C). The orthographic view image is used to create a two-dimensional orthophoto image. The front view and back view images are used to reconstruct the three-dimensional stereoscopic image.
H is the flight height of the instrument platform (e.g., satellite), α is the angle between the front view axis (or back view axis) and the orthographic view axis, and L is the ground distance of the scene corresponding to the front view and the orthographic view (i.e., the instantaneous imaging half-width of the scene in the flight direction). It can be obtained that
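The relation between flight height, view angle and imaging half-width can be checked with a short sketch; H = 500 km and L = 15 km are assumed example values standing in for the paper's elided numbers.

```python
import math

# Assumed example values (not the paper's elided numbers): H = 500 km, L = 15 km.
H = 500e3   # flight height of the instrument platform [m]
L = 15e3    # instantaneous imaging half-width in the flight direction [m]

alpha = math.atan(L / H)   # view angle, from L = H * tan(alpha)
# In standard stereo photogrammetry, a ground registration error of dp maps to an
# elevation error of roughly dp / (2 * tan(alpha)) for a symmetric fore/aft pair.
```

With these numbers the fore/aft view axes are tilted by only about 1.7°, which is why the elevation uncertainty is much larger than the ground registration error.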
The relationship between the coordinates of the image point on the back focal plane of the objective lens and the coordinates of the object point on the scene can be expressed by a matrix of image construction, if they are both defined in the object coordinate system O-XYZ. The matrix of image construction can be written as
where s is a scale factor, (X_S, Y_S, Z_S) are the coordinates of the instrument platform (e.g., satellite) in the object coordinate system O-XYZ, and R is a rotation matrix built from the direction cosines of the three axes of the instrument platform (e.g., satellite), where each axis has three azimuth angles. The matrix R is given by

where

Solving the matrix of image construction, the collinearity equations can be given by
According to Eq. (28) and Eq. (29) together with Eq. (13), Eq. (15), Eq. (17), Eq. (19), Eq. (21) and Eq. (22), for the front view and back view of an object point on the scene, we can obtain four equations and use the least-squares method to solve for the three unknowns X, Y and Z.
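The least-squares solution of this over-determined system can be sketched as a ray triangulation: each view constrains the object point to lie on its viewing ray, and stacking two views gives four independent conditions for the three unknowns. The station positions and viewing directions below are synthetic illustrations, not the paper's collinearity equations themselves.

```python
import numpy as np

def triangulate(stations, directions):
    """Least-squares intersection of viewing rays P = S + t*d.
    Each ray contributes (I - d d^T) P = (I - d d^T) S, the condition that
    P has zero perpendicular offset from the ray; stacking two views gives
    an overdetermined linear system for the three unknowns (X, Y, Z)."""
    A_rows, b_rows = [], []
    for S, d in zip(stations, directions):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)
        A_rows.append(M)
        b_rows.append(M @ np.asarray(S, float))
    A, b = np.vstack(A_rows), np.hstack(b_rows)
    P, *_ = np.linalg.lstsq(A, b, rcond=None)
    return P
```

For example, two satellite stations 30 km apart at 500 km altitude viewing the same ground point recover that point's (X, Y, Z) exactly when the rays intersect, and in the least-squares sense when measurement noise makes them skew.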
Assume that the pixel size of each area-array detector is a, and the number of pixels in each row (in the y-axis direction) of each area-array detector is N_y. Based on Eqs. (22) and (23), the ground sample distance (GSD) across the flight direction (i.e., the spatial resolution of the CDSIS in the Y-axis direction) is given by
the field of view (FOV) across the flight direction is given by

the ground imaging width across the flight direction is

and the length of the entrance slit is

The elevation uncertainty in the Z-axis direction is given by
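A minimal numerical sketch of these geometric quantities, under assumed values (26 µm pixels, 500 km flight height, 0.5 m objective focal length, 1024 pixels per row) that stand in for the paper's elided design numbers:

```python
import math

# Assumed example values (standing in for the paper's elided numbers):
a, H, f1, Ny = 26e-6, 500e3, 0.5, 1024   # pixel size [m], flight height [m],
                                         # objective focal length [m], pixels per row

GSD = a * H / f1        # ground sample distance across the flight direction [m]
swath = Ny * GSD        # ground imaging width across the flight direction [m]
# Across-track FOV, assuming unit magnification between slit and detector (f2 = f3):
FOV = 2 * math.degrees(math.atan(Ny * a / (2 * f1)))   # [deg]
```

These assumptions give a 26 m GSD, a swath of roughly 26.6 km and an across-track FOV of about 3°, which is the kind of trade space the equations above describe.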
where k is a coefficient related to the registration accuracy of the front view and back view images when reconstructing the stereoscopic image. According to Eq. (11) and Eq. (12), the size of each column (in the x-axis direction) of the area-array detector B must be greater than the dispersion width of the orthographic-view light, and the number of pixels in each column of the area-array detector B used to acquire multiple interferograms should be
According to Eq. (14) and Eq. (16), the size of each column (in the x-axis direction) of the area-array detector A must be greater than the dispersion width of the front-view light, and the number of pixels in each column of the area-array detector A used to acquire multiple interferograms should be
According to Eq. (18) and Eq. (20), the size of each column (in the x-axis direction) of the area-array detector C must be greater than the dispersion width of the back-view light, and the number of pixels in each column of the area-array detector C used to acquire multiple interferograms should be
Namely, for each scene unit, multiple separate interferograms are simultaneously generated and recorded by a separate column of each of the detectors B, A and C. Furthermore, for each scene unit, the interferogram q recorded by row q of a given column of the detector B covers a wavelength range whose bounds are determined by the x-axis coordinates on the detector plane of the corresponding wavelengths of the orthographic-view light. The optical path difference (OPD) for each interferogram recorded by area-array detector B and generated by the light emerging from the object point is given by [21,28]
The interferogram recorded by area-array detector B and generated by the light emerging from the object point is given by
where σ is the wavenumber, B(σ) is the input spectral intensity at wavenumber σ, and x is the displacement of the moving CCM from the zero path difference (ZPD) position. For an on-axis object point, the optical path difference and Eq. (41) reduce to simpler forms. The OPD for each interferogram recorded by area-array detector A (or C) and generated by the light emerging from the object point is given by
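The interferogram model can be illustrated for a single spectral line; the scan step, scan length and the relation OPD = 2x for the translated corner cube are assumptions of this sketch rather than values taken from the paper.

```python
import numpy as np

# Monochromatic sketch of an Eq.-(41)-type interferogram; step size and scan
# length are illustrative, and OPD = 2x is assumed for the moving corner cube.
sigma = 1 / 450e-7                      # wavenumber of a 450 nm line [cm^-1]
x = np.arange(20000) * 1e-6             # CCM displacement from ZPD [cm]
opd = 2 * x                             # optical path difference [cm]
I = 0.5 * (1 + np.cos(2 * np.pi * sigma * opd))

# Recover the wavenumber from the fringe frequency in the OPD domain.
spec = np.abs(np.fft.rfft(I - I.mean()))
freq = np.fft.rfftfreq(len(I), d=opd[1] - opd[0])   # [cm^-1]
sigma_rec = freq[np.argmax(spec)]
```

The spectral peak falls at the line's wavenumber to within one frequency bin, which is the basic inversion the Fourier transform of each pixel's interferogram performs.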
The interferogram recorded by area-array detector A (or C) and generated by the light emerging from the object point of the scene is given by
The theoretical spectral resolution for the orthographic view spectral image of the three-area-array CDSIS can be calculated by
where x_max is the maximum displacement of the moving CCM from the ZPD position. Furthermore, in order to improve the signal-to-noise ratio of the recovered spectrum, the interferogram is truncated and apodized (e.g., with the triangular function), which also reduces the spectral resolution. In practice, the designed spectral resolution for the orthographic view spectral image of the CDSIS can be given by

The theoretical spectral resolution for the front view (or back view) spectral image of the three-area-array CDSIS can be calculated by
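Under the common Fourier-spectroscopy relation δσ = 1/(2·OPD_max) together with OPD = 2x (both assumed here, since the paper's Eqs. (44)-(46) are not reproduced), a designed resolution of 2 cm⁻¹, which meets the 0.1 nm target at 700 nm, implies the following scan length:

```python
# Assumed relations: OPD = 2x for the translated corner cube and
# dsigma = 1/(2*OPD_max); dsigma = 2 cm^-1 is chosen to satisfy the
# 0.1 nm-at-700 nm target (the paper's own designed values are not shown here).
dsigma = 2.0                        # designed spectral resolution [cm^-1]
opd_max = 1 / (2 * dsigma)          # maximum optical path difference [cm]
x_max = opd_max / 2                 # maximum CCM displacement [cm] -> 1.25 mm

# Equivalent wavelength resolution dlam = lam^2 * dsigma:
dlam_450 = (450e-7) ** 2 * dsigma * 1e7   # [nm]
dlam_700 = (700e-7) ** 2 * dsigma * 1e7   # [nm]
```

Notably, a constant 2 cm⁻¹ wavenumber resolution corresponds to about 0.04 nm at 450 nm and 0.10 nm at 700 nm, consistent with the resolution figures quoted in the abstract.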
In practice, the designed spectral resolution for the front view (or back view) spectral image of the CDSIS can be given by
For a source spectrum covering a wavelength range from λ_min to λ_max, the maximum wavenumber of the source spectrum is σ_max = 1/λ_min. According to the Nyquist criterion, for convenience, the sampling interval for each interferogram produced by the CDSIS can be set by σ_max. The number of sampling points for each unilateral interferogram recorded by the area-array detector B and generated by the light emerging from the object point can be given by
and the number of sampling points for each unilateral interferogram recorded by the area-array detector A (or C) and generated by the light emerging from the object point can be given by

Therefore, the resolving power for each interferogram recorded by the area-array detector B, and likewise by the area-array detector A (or C), is determined by the number of sampling points of the corresponding unilateral interferogram. Table 1 shows comparisons of the three-area-array CDSIS, the coherent-dispersion imaging spectrometer (CDIS) in [50] and the coherent-dispersion spectrometer (CDS) in [59]. Table 2 shows comparisons of the CDSIS, the interferometric imaging spectrometer, the dispersive imaging spectrometer, the color filter imaging spectrometer, and the snapshot spectral imager. Table 3 shows comparisons of the CDSIS, the broadband snapshot imaging spectrometer (BSIS) in [46], the infrared broadband snapshot imaging spectrometer (IBSIS) in [47], and the orthogonal-dispersion device in [72].
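The Nyquist bookkeeping can be sketched numerically; σ_max = 25000 cm⁻¹ corresponds to 400 nm, while OPD_max = 0.25 cm and δσ = 2 cm⁻¹ are assumed example values rather than the paper's elided numbers.

```python
# Nyquist sampling sketch with assumed example values.
sigma_max = 25000.0                 # [cm^-1], i.e. lambda_min = 400 nm
d_opd = 1 / (2 * sigma_max)         # Nyquist OPD sampling interval [cm] (0.2 um)
opd_max = 0.25                      # assumed maximum OPD [cm]
N = int(round(opd_max / d_opd))     # samples per unilateral interferogram
R_450 = (1e7 / 450.0) / 2.0         # resolving power sigma/dsigma at 450 nm,
                                    # for an assumed dsigma = 2 cm^-1
```

Under these assumptions each unilateral interferogram carries 12 500 samples and the resolving power at 450 nm is on the order of 10^4, which is the sense in which the resolving power is set by the number of sampling points.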
Both the Andor iDus and Newton CCD spectroscopy detectors from Oxford Instruments are suitable for the CDSIS, for example the Andor iDus 420 series 1024 x 255 pixel spectroscopy CCD, the Andor Newton 920 series 1024 x 255 pixel spectroscopy CCD, and the Andor Newton 940 series 2048 x 512 pixel spectroscopy CCD [73,74].
The light is dispersed with a plane transmission grating instead of a prism mainly because the grating can provide higher and more linear dispersion with wavelength than the prism [60]. However, the prism possesses a wider spectral range than the grating [75].
3. Preliminary theoretical calculation and numerical simulation
Assume that the source spectrum covers a wavelength range from 400 nm to 700 nm, i.e., a wavenumber range from 14285.7 cm−1 to 25000 cm−1. Table 4 shows the wavelength difference versus wavenumber difference for several wavelengths. If the desired spectral resolution of the CDSIS is 0.05 nm at 450 nm together with 0.1 nm at 700 nm, the designed spectral resolution (in wavenumber) of the CDSIS follows from these targets. From Eq. (45), the required maximum displacement of the moving CCM from the ZPD position is then obtained, and from Eq. (48) the number of sampling points for each unilateral interferogram recorded by the detector B and generated by the light emerging from the object point follows.
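The wavelength-to-wavenumber conversions behind these figures are easy to verify; the helpers below use σ = 1/λ.

```python
def nm_to_wavenumber(lam_nm):
    """sigma = 1/lambda, in cm^-1 for lambda given in nm."""
    return 1e7 / lam_nm

def wavenumber_interval(lam_nm, dlam_nm):
    """Wavenumber interval corresponding to a wavelength interval at lam,
    dsigma = dlam / lambda^2."""
    return 1e7 * dlam_nm / lam_nm ** 2
```

For instance, 400 nm maps to 25000 cm⁻¹ and 700 nm to 14285.7 cm⁻¹, while 0.05 nm at 450 nm is about 2.47 cm⁻¹ and 0.1 nm at 700 nm about 2.04 cm⁻¹, so a single designed wavenumber resolution of roughly 2 cm⁻¹ satisfies both targets.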
Suppose that the flight height of the instrument platform (e.g., satellite), the instantaneous imaging half-width in the flight direction, the ground imaging width across the flight direction, the ground sample distance, the pixel size of each area-array detector, the focal length of the objective lens and the focal length of the collecting lens are specified. The focal length of the collimating lens, the angle between the front view (or back view) axis and the orthographic view axis, the field of view across the flight direction, the number of pixels in each row (in the y-axis direction) of each area-array detector, and the length of the entrance slit then follow from the equations in Section 2.
Fused silica is a proper material for the plane transmission grating, and its refractive index formula is given by [76]
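Since Eq. (50) itself is not reproduced here, the sketch below uses Malitson's widely used three-term Sellmeier coefficients for fused silica as a stand-in; whether the paper's Eq. (50) uses exactly these coefficients is an assumption.

```python
import math

def n_fused_silica(lam_um):
    """Three-term Sellmeier relation for fused silica using Malitson's
    coefficients (a standard choice; assumed equivalent to Eq. (50)).
    lam_um is the wavelength in micrometres."""
    l2 = lam_um ** 2
    n2 = (1
          + 0.6961663 * l2 / (l2 - 0.0684043 ** 2)
          + 0.4079426 * l2 / (l2 - 0.1162414 ** 2)
          + 0.8974794 * l2 / (l2 - 9.896161 ** 2))
    return math.sqrt(n2)
```

Across the 400-700 nm band of the example, the index falls from about 1.470 to about 1.455, a variation the dispersion model of Eqs. (4)-(21) has to account for.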
Assume that the angle between the grating normal and the optical axis of the optical system and the center wavelength are given, and that the transmission grating has 150 grooves/mm. According to Eq. (4), Eqs. (11)-(21) and Eq. (50), the x-axis coordinates on the detector plane of the wavelengths from 400 nm to 700 nm for the front view, orthographic view and back view light are shown in Fig. 5. The number of pixels in each column of area-array detector B follows accordingly; namely, for each scene unit, that many separate interferograms are simultaneously generated and recorded by a separate column of the detector B. The numbers of pixels in each column (in the x-axis direction) of the area-array detectors A and C follow in the same way.
According to Eqs. (11)-(13) and Eq. (41), three representative interferograms recorded simultaneously by a separate column of area-array detector B in one scan period of the moving CCM and generated by the light emerging from an object point are shown in Fig. 6. The first representative interferogram contains only the wavelengths 449.95 nm, 450 nm, 450.05 nm and 450.1 nm. The second representative interferogram contains only the wavelengths 549.93 nm, 550 nm, 550.07 nm and 550.14 nm. The third representative interferogram contains only the wavelengths 699.7 nm, 699.8 nm, 699.9 nm and 700 nm. Figure 7 shows the spectrum obtained from the Fourier transform of the three interferograms shown in Fig. 6 and its distribution in a given column of area-array detector B.
Figure 8 shows the detailed spectrum obtained from the Fourier transform of the three interferograms shown in Fig. 6. In order to better show the detailed spectrum, there are three pictures in Fig. 8: the top picture is obtained from the Fourier transform of the first interferogram (in Fig. 6), which contains only the wavelengths 449.95 nm, 450 nm, 450.05 nm and 450.1 nm; the middle picture is obtained from the Fourier transform of the second interferogram (in Fig. 6), which contains only the wavelengths 549.93 nm, 550 nm, 550.07 nm and 550.14 nm; and the bottom picture is obtained from the Fourier transform of the third interferogram (in Fig. 6), which contains only the wavelengths 699.7 nm, 699.8 nm, 699.9 nm and 700 nm. In the top picture of Fig. 8, from left to right, the four peaks correspond to the spectra of wavelengths 449.95 nm, 450 nm, 450.05 nm and 450.1 nm, respectively. In the middle picture of Fig. 8, from left to right, the four peaks correspond to the spectra of wavelengths 549.93 nm, 550 nm, 550.07 nm and 550.14 nm, respectively. In the bottom picture of Fig. 8, from left to right, the four peaks correspond to the spectra of wavelengths 699.7 nm, 699.8 nm, 699.9 nm and 700 nm, respectively. Twelve spectral peaks are clearly visible. It can easily be seen that the spectral resolution of the CDSIS for the orthographic view spectral image is better than 0.05 nm at 450 nm and 0.07 nm at 550 nm together with 0.1 nm at 700 nm. Figures 6-8 show simulation results obtained with MATLAB.
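A toy analogue of this resolution test can be reproduced directly: two lines 0.1 nm apart near 450 nm, sampled over an assumed 0.25 cm of OPD at 0.2 µm steps (illustrative values, not the paper's), are resolved by a Fourier transform of the synthetic interferogram.

```python
import numpy as np

# Two-line resolution sketch near 450 nm under assumed scan parameters.
d_opd = 2e-5                                     # OPD sampling interval [cm]
opd = np.arange(12500) * d_opd                   # OPD up to 0.25 cm
sig1, sig2 = 1 / 450.0e-7, 1 / 450.1e-7          # line wavenumbers [cm^-1]
I = np.cos(2 * np.pi * sig1 * opd) + np.cos(2 * np.pi * sig2 * opd)

n_pad = 4 * len(opd)                             # zero-pad to interpolate the spectrum
spec = np.abs(np.fft.rfft(I, n=n_pad))
freq = np.fft.rfftfreq(n_pad, d=d_opd)           # wavenumber axis [cm^-1]

i1 = np.argmax(spec)                             # strongest peak
i2 = np.argmax(np.where(np.abs(freq - freq[i1]) > 3.0, spec, 0.0))  # other peak
found = sorted([freq[i1], freq[i2]])
```

The two recovered peaks land at roughly 22217.3 cm⁻¹ and 22222.2 cm⁻¹, i.e. the 0.1 nm pair near 450 nm is cleanly separated, mirroring the behaviour reported for Fig. 8.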
4. Conclusion
A three-area-array coherent-dispersion stereo-imaging spectrometer (CDSIS) was presented and preliminary theoretical calculations were given. The CDSIS records three consecutive spectral images of the front view, orthographic view and back view of the scene when it is spatially scanned perpendicular to the entrance slits. The orthographic view image is used to create a two-dimensional orthophoto image. The front view and back view images are used to reconstruct the three-dimensional stereoscopic image. Compared with all existing imaging spectrometers that only acquire two-dimensional spatial information and one-dimensional spectral information, the most prominent and important advantage of the CDSIS is that the CDSIS can acquire both three-dimensional spatial information and one-dimensional spectral information. The CDSIS has another important advantage of providing the high spectral resolution of interferometry, while avoiding the multiplex disadvantage of interferometry for a broadband source in the ultraviolet-visible spectral region. The third advantage of the CDSIS, compared with traditional interferometry, is that the dynamic range requirement of the area-array detector is reduced. There is also a tradeoff for the CDSIS, which mainly includes three aspects: (1) the spatial-time joint scan makes the data collection time long, (2) the presence of a moving part will reduce the stability against various disturbances, and (3) the presence of the entrance slit means the instrument does not benefit from the throughput advantage. The CDSIS is a unique concept that not only acquires three-dimensional spatial information and one-dimensional spectral information, but also performs high spectral resolution measurements of a broadband source in the ultraviolet-visible spectral region. The CDSIS will be suitable for hyperspectral earth remote sensing or space exploration.
Funding
National Natural Science Foundation of China (NSFC) (61605151).
References
1. A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for Earth remote sensing,” Science 228(4704), 1147–1153 (1985). [CrossRef] [PubMed]
2. A. F. H. Goetz, “Three decades of hyperspectral remote sensing of the Earth: a personal view,” Remote Sens. Environ. 113, S5–S16 (2009). [CrossRef]
3. Q. Li, X. He, Y. Wang, H. Liu, D. Xu, and F. Guo, “Review of spectral imaging technology in biomedical engineering: achievements and challenges,” J. Biomed. Opt. 18(10), 100901 (2013).
4. C. L. Wyatt, “Infrared spectrometer: liquid-helium-cooled rocketborne circular-variable filter,” Appl. Opt. 14(12), 3086–3091 (1975).
5. H. R. Morris, C. C. Hoyt, and P. J. Treado, “Imaging spectrometers for fluorescence and Raman microscopy: acousto-optic and liquid crystal tunable filters,” Appl. Spectrosc. 48(7), 857–866 (1994).
6. S. C. Gebhart, R. C. Thompson, and A. Mahadevan-Jansen, “Liquid-crystal tunable filter spectral imaging for brain tumor demarcation,” Appl. Opt. 46(10), 1896–1910 (2007).
7. Z. Zheng, G. Yang, H. Li, and X. Liu, “Three-stage Fabry-Perot liquid crystal tunable filter with extended spectral range,” Opt. Express 19(3), 2158–2164 (2011).
8. M. Abuleil and I. Abdulhalim, “Narrowband multispectral liquid crystal tunable filter,” Opt. Lett. 41(9), 1957–1960 (2016).
9. D. A. Glenar, J. J. Hillman, B. Saif, and J. Bergstralh, “Acousto-optic imaging spectropolarimetry for remote sensing,” Appl. Opt. 33(31), 7412–7424 (1994).
10. N. Gupta and V. Voloshinov, “Hyperspectral imager, from ultraviolet to visible, with a KDP acousto-optic tunable filter,” Appl. Opt. 43(13), 2752–2759 (2004).
11. N. Gupta and V. Voloshinov, “Hyperspectral imaging performance of a TeO2 acousto-optic tunable filter in the ultraviolet region,” Opt. Lett. 30(9), 985–987 (2005).
12. P. Wang and Z. Zhang, “Double-filtering method based on two acousto-optic tunable filters for hyperspectral imaging application,” Opt. Express 24(9), 9888–9895 (2016).
13. H. Zhao, Z. Ji, G. Jia, Y. Zhang, Y. Li, and D. Wang, “MWIR thermal imaging spectrometer based on the acousto-optic tunable filter,” Appl. Opt. 56(25), 7269–7276 (2017).
14. S. Manohar and D. Razansky, “Photoacoustics: a historical review,” Adv. Opt. Photonics 8(4), 586–617 (2016).
15. O. I. Korablev, D. A. Belyaev, Y. S. Dobrolenskiy, A. Y. Trokhimovskiy, and Y. K. Kalinnikov, “Acousto-optic tunable filter spectrometers in space missions [Invited],” Appl. Opt. 57(10), C103–C119 (2018).
16. W. M. Porter and H. T. Enmark, “A system overview of the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS),” Proc. SPIE 834, 22–31 (1987).
17. R. W. Basedow, D. C. Carmer, and M. E. Anderson, “HYDICE system: implementation and performance,” Proc. SPIE 2480, 258–267 (1995).
18. R. D. Swift, R. B. Wattson, J. A. Decker Jr., R. Paganetti, and M. Harwit, “Hadamard transform imager and imaging spectrometer,” Appl. Opt. 15(6), 1595–1609 (1976).
19. W. H. Steel, Interferometry (Cambridge University, 1983).
20. J. Kauppinen and V. M. Horneman, “Large aperture cube corner interferometer with a resolution of 0.001 cm⁻¹,” Appl. Opt. 30(18), 2575–2578 (1991).
21. C. L. Bennett, M. R. Carter, D. J. Fields, and J. A. M. Hernandez, “Imaging Fourier transform spectrometer,” Proc. SPIE 1937, 191–200 (1993).
22. G. Durry and G. Guelachvili, “High-information time-resolved step-scan Fourier interferometer,” Appl. Opt. 34(12), 1971–1981 (1995).
23. C. M. Snively, S. Katzenberger, G. Oskarsdottir, and J. Lauterbach, “Fourier-transform infrared imaging using a rapid-scan spectrometer,” Opt. Lett. 24(24), 1841–1843 (1999).
24. C. Zhang, B. Xiangli, B. Zhao, and X. Yuan, “A static polarization imaging spectrometer based on a Savart polariscope,” Opt. Commun. 203(1), 21–26 (2002).
25. J. Kauppinen, J. Heinonen, and I. Kauppinen, “Interferometers based on the rotational motion,” Appl. Spectrosc. Rev. 39(1), 99–130 (2004).
26. R. K. Chan, P. K. Lim, X. Wang, and M. H. Chan, “Fourier transform ultraviolet-visible spectrometer based on a beam-folding technique,” Opt. Lett. 31(7), 903–905 (2006).
27. P. R. Griffiths and J. A. de Haseth, Fourier Transform Infrared Spectrometry (Wiley-Interscience, 2007).
28. Q. Yang, “Moving corner-cube mirror interferometer and reflection characteristic of corner-cube mirror,” Appl. Opt. 49(21), 4088–4095 (2010).
29. Y. Ferrec, J. Taboury, H. Sauer, P. Chavel, P. Fournet, C. Coudrain, J. Deschamps, and J. Primot, “Experimental results from an airborne static Fourier transform imaging spectrometer,” Appl. Opt. 50(30), 5894–5904 (2011).
30. Q. Yang, B. Zhao, and D. Wen, “Principle and analysis of a moving double-sided mirror interferometer,” Opt. Laser Technol. 44(5), 1256–1260 (2012).
31. Q. Yang, L. Liu, and P. Lv, “Principle of a two-output-difference interferometer for removing the most important interference distortions,” J. Mod. Opt. 65(19), 2234–2242 (2018).
32. M. W. Kudenov and E. L. Dereniak, “Compact real-time birefringent imaging spectrometer,” Opt. Express 20(16), 17973–17986 (2012).
33. S. Pacheco and R. Liang, “Snapshot, reconfigurable multispectral and multi-polarization telecentric imaging system,” Opt. Express 22(13), 16377–16385 (2014).
34. P. Wang and R. Menon, “Computational multispectral video imaging [Invited],” J. Opt. Soc. Am. A 35(1), 189–199 (2018).
35. T. Okamoto and I. Yamaguchi, “Simultaneous acquisition of spectral image information,” Opt. Lett. 16(16), 1277–1279 (1991).
36. T. Okamoto, A. Takahashi, and I. Yamaguchi, “Simultaneous acquisition of spectral and spatial intensity distribution,” Appl. Spectrosc. 47(8), 1198–1202 (1993).
37. M. Descour and E. Dereniak, “Computed-tomography imaging spectrometer: experimental calibration and reconstruction results,” Appl. Opt. 34(22), 4817–4826 (1995).
38. M. R. Descour, C. E. Volin, E. L. Dereniak, K. J. Thome, A. B. Schumacher, D. W. Wilson, and P. D. Maker, “Demonstration of a high-speed nonscanning imaging spectrometer,” Opt. Lett. 22(16), 1271–1273 (1997).
39. J. M. Mooney, V. E. Vickers, M. An, and A. K. Brodzik, “High-throughput hyperspectral infrared camera,” J. Opt. Soc. Am. A 14(11), 2951–2961 (1997).
40. F. D. Shepherd, J. M. Mooney, T. E. Reeves, P. Dumont, M. M. Weeks, and S. DiSalvo, “Adaptive MWIR spectral imaging sensor,” Proc. SPIE 7055, 705506 (2008).
41. W. Bao, Z. Ding, P. Li, Z. Chen, Y. Shen, and C. Wang, “Orthogonal dispersive spectral-domain optical coherence tomography,” Opt. Express 22(8), 10081–10090 (2014).
42. A. A. Wagadarikar, N. P. Pitsianis, X. Sun, and D. J. Brady, “Video rate spectral imaging using a coded aperture snapshot spectral imager,” Opt. Express 17(8), 6368–6388 (2009).
43. Y. Wu, I. O. Mirza, G. R. Arce, and D. W. Prather, “Development of a digital-micromirror-device-based multishot snapshot spectral imaging system,” Opt. Lett. 36(14), 2692–2694 (2011).
44. Y. Murakami, M. Yamaguchi, and N. Ohyama, “Hybrid-resolution multispectral imaging using color filter array,” Opt. Express 20(7), 7173–7183 (2012).
45. N. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng. 52(9), 090901 (2013).
46. Q. Yang, “Static broadband snapshot imaging spectrometer,” Opt. Eng. 52(5), 053003 (2013).
47. Q. Yang, “Compact static infrared broadband snapshot imaging spectrometer,” Chin. Opt. Lett. 12(3), 031201 (2014).
48. Y. Murakami, K. Nakazaki, and M. Yamaguchi, “Hybrid-resolution spectral video system using low-resolution spectral sensor,” Opt. Express 22(17), 20311–20325 (2014).
49. H. Rueda, D. Lau, and G. R. Arce, “Multi-spectral compressive snapshot imaging using RGB image sensors,” Opt. Express 23(9), 12207–12221 (2015).
50. Q. Yang, “Broadband high-spectral-resolution ultraviolet-visible coherent-dispersion imaging spectrometer,” Opt. Express 26(16), 20777–20791 (2018).
51. P. Connes and G. Michel, “Astronomical Fourier spectrometer,” Appl. Opt. 14(9), 2067–2084 (1975).
52. T. Hirschfeld, “Fellgett’s advantage in UV-VIS multiplex spectroscopy,” Appl. Spectrosc. 30(1), 68–69 (1976).
53. P. Luc and S. Gerstenkorn, “Fourier transform spectroscopy in the visible and ultraviolet range,” Appl. Opt. 17(9), 1327–1331 (1978).
54. E. Voigtman and J. D. Winefordner, “The multiplex disadvantage and excess low-frequency noise,” Appl. Spectrosc. 41(7), 1182–1184 (1987).
55. L. W. Schumann and T. S. Lomheim, “Infrared hyperspectral imaging Fourier transform and dispersive spectrometers: comparison of signal-to-noise based performance [Invited],” Proc. SPIE 4480, 1–14 (2002).
56. P. B. Fellgett, “The nature and origin of multiplex Fourier spectrometry,” Notes Rec. R. Soc. 60(1), 91–93 (2006).
57. A. Barducci, D. Guzzi, C. Lastri, V. Nardino, P. Marcoionni, and I. Pippi, “Radiometric and signal-to-noise ratio properties of multiplex dispersive spectrometry,” Appl. Opt. 49(28), 5366–5373 (2010).
58. A. Barducci, D. Guzzi, C. Lastri, P. Marcoionni, V. Nardino, and I. Pippi, “Theoretical aspects of Fourier transform spectrometry and common path triangular interferometers,” Opt. Express 18(11), 11622–11649 (2010).
59. Q. Yang, “Coherent-dispersion spectrometer for the ultraviolet and visible regions,” Opt. Express 26(10), 12372–12386 (2018).
60. Q. Yang, “Ultrahigh-resolution rapid-scan ultraviolet-visible spectrometer,” OSA Continuum 1(3), 812–821 (2018).
61. T. Okoshi, Three-Dimensional Imaging Techniques (Academic, 1976).
62. B. Javidi, F. Okano, and J. Y. Son, Three-Dimensional Imaging, Visualization, and Display (Springer, 2009).
63. N. T. Shaked, B. Katz, and J. Rosen, “Review of three-dimensional holographic imaging by multiple-viewpoint-projection based methods,” Appl. Opt. 48(34), H120–H136 (2009).
64. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52(4), 546–560 (2013).
65. M. A. Preciado, G. Carles, and A. R. Harvey, “Video-rate computational super-resolution and integral imaging at longwave-infrared wavelengths,” OSA Continuum 1(1), 170–180 (2018).
66. S. Komatsu, A. Markman, A. Mahalanobis, K. Chen, and B. Javidi, “Three-dimensional integral imaging and object detection using long-wave infrared imaging,” Appl. Opt. 56(9), D120–D126 (2017).
67. A. Markman, X. Shen, and B. Javidi, “Three-dimensional object visualization and detection in low light illumination using integral imaging,” Opt. Lett. 42(16), 3068–3071 (2017).
68. M. Martínez-Corral and B. Javidi, “Fundamentals of 3D imaging and displays: a tutorial on integral imaging, light-field, and plenoptic systems,” Adv. Opt. Photonics 10(3), 512–566 (2018).
69. J. V. Sweedler, R. D. Jalkian, G. R. Sims, and M. B. Denton, “Crossed interferometric dispersive spectroscopy,” Appl. Spectrosc. 44(1), 14–20 (1990).
70. J. V. Sweedler, “The use of charge transfer device detectors and spatial interferometry for analytical spectroscopy,” https://arizona.openrepository.com/handle/10150/184683.
71. C. Palmer and E. Loewen, Diffraction Grating Handbook (Newport Corporation, 2005).
72. Q. Yang and W. Wang, “Compact orthogonal-dispersion device using a prism and a transmission grating,” J. Eur. Opt. Soc. 14(1), 8 (2018).
73. T. Huen, “Reflectance of thinly oxidized silicon at normal incidence,” Appl. Opt. 18(12), 1927–1932 (1979).
74. K. Xu, “Monolithically integrated Si gate-controlled light-emitting device: science and properties,” J. Opt. 20(2), 024014 (2018).
75. E. V. Chandler, C. G. Durfee, and J. A. Squier, “Integrated spectrometer design with application to multiphoton microscopy,” Opt. Express 19(1), 118–127 (2011).
76. M. Bass, G. Li, and E. V. Stryland, Handbook of Optics (McGraw-Hill, 2010).