Optica Publishing Group

Three-area-array coherent-dispersion stereo-imaging spectrometer

Open Access

Abstract

A coherent-dispersion stereo-imaging spectrometer is presented, which combines three-view stereo imaging, interferometric spectroscopy and dispersive spectroscopy. Three area-array detectors record three spectral images of each scene unit from three views. For each of the three views, each scene unit is imaged on a given column of one area-array detector, and different wavelengths are dispersed across different rows of that column. For each scene unit, multiple interferograms are simultaneously generated at each view, each interferogram covering a separate wavelength range and located in a separate pixel. The orthographic view image is used to create a two-dimensional orthophoto image, while the front view and back view images are used to reconstruct the three-dimensional stereoscopic image. Preliminary theoretical calculations are given. The instrument offers a unique way to obtain three-dimensional spatial information and one-dimensional spectral information while achieving high spectral resolution over an ultraviolet-visible broadband spectral range (e.g., 0.05 nm at 450 nm together with 0.1 nm at 700 nm). It will be suitable for ultraviolet-visible hyperspectral remote sensing.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

All types of existing imaging spectrometers acquire two-dimensional spatial information and one-dimensional spectral information (i.e., a three-dimensional data cube) of an object or a scene [1–3]. The spectral image can be obtained by the following methods:
(1) Wavelength-scan methods, which measure the image one wavelength at a time. A typical representative is the color filter imaging spectrometer, which mainly uses a circular-variable filter [4], a liquid-crystal tunable filter [5–8], or an acousto-optical tunable filter [9–15] to capture a full spectral image by measuring one image at a time, each time at a different wavelength.
(2) Spatial-scan methods, which measure the whole spectrum of a portion of the image at a time and scan the image (e.g., line by line). A typical representative is the dispersive imaging spectrometer, which uses a prism, a grating, or both to disperse the light and collects the entire spectral image by measuring only one line of the object at a time and scanning the object line by line (e.g., whisk-broom scanning [16], push-broom scanning [17]).
(3) Time-scan methods, which measure a set of images, each a superposition of spectral or spatial image information, and therefore require a mathematical transformation of the acquired data to derive the spectral image (e.g., by Fourier transform or Hadamard transform [18]). A typical representative is the interferometric imaging spectrometer (i.e., the Fourier transform imaging spectrometer), which is based on temporal or spatial interferometry [19–34].
(4) Compromise methods, which measure the whole spectral image simultaneously but compromise on the number of points in the spectrum, the field of view (FOV), or the spatial resolution. Typical representatives are the computed-tomography imaging spectrometer [35–41] and snapshot spectral imagers [42–49].
(5) Spatial-time joint scan methods, which use a time-scan method to measure the spectral image of each line of the object and a spatial-scan method to obtain the entire spectral image by scanning the object line by line (i.e., push-broom scanning). A typical representative is the coherent-dispersion imaging spectrometer [50].

When measuring an infrared broad-band source or an ultraviolet-visible narrow-band source, interferometric spectrometry (especially temporal interferometry based on linear scanning, such as the Michelson-type interferometer) can perform high spectral resolution measurements. Nevertheless, when measuring an ultraviolet-visible broad-band source, the performance of interferometric spectrometry is weakened [27,51–60]. That is, interferometric spectrometry is only suitable for low and medium spectral resolution measurements of a broad-band source in the ultraviolet-visible spectral region. By contrast, coherent-dispersion spectrometry is an effective and unique method for high spectral resolution measurements of a broadband source in the ultraviolet-visible spectral region [50,59,60].

On the other hand, three-dimensional imaging technologies can potentially capture the three-dimensional structure, range, and texture information of an object. According to the characteristics of the reconstructed images, three-dimensional imaging technologies can be roughly classified into four categories: binocular, multi-view, volumetric imaging, and spatial imaging [61,62]. Typical representatives of spatial imaging are holographic imaging and integral imaging [63–68]. However, all existing imaging spectrometers are based on two-dimensional imaging technologies.

This paper presents a coherent-dispersion stereo-imaging spectrometer, which can not only obtain three-dimensional stereoscopic image, but also perform high spectral resolution measurements of a broadband source in the ultraviolet-visible spectral region. After a detailed description of the principle, preliminary numerical calculations are illustrated by an example for the spectral range from 400 nm to 700 nm. Finally, the conclusion is given.

2. Principle

Figure 1 shows the equivalent orthographic view for the optics of the three-area-array coherent-dispersion stereo-imaging spectrometer (CDSIS), and Fig. 2 shows the equivalent light path diagram of the CDSIS. The CDSIS consists of a moving corner-cube-mirror interferometer, a plane transmission grating, an objective lens, a collimating lens, a collecting lens, three entrance slits, and three area-array detectors. The moving corner-cube-mirror interferometer comprises one moving corner-cube mirror (CCM), one fixed CCM, and one beam splitter. The optical path difference is created by the linear scanning of the moving CCM driven by a linear actuator. The interferometer has neither tilt nor shearing problems. The back focal plane of the objective lens is coincident with the front focal plane of the collimating lens. Three entrance slits (S1, S2 and S3) are located at the front focal plane of the collimating lens (i.e., back focal plane of the objective lens) and are parallel to each other. The entrance slit S2 is located in the center of the field of view of the optical system, so it intersects the optical axis of the optical system. The entrance slits S1 and S3 are symmetric about the entrance slit S2. The three entrance slits are perpendicular to the flight direction of the instrument platform (e.g., satellite), but parallel to the linear scanning direction of the moving CCM. Three area-array detectors (A, B and C) are located at the back focal plane of the collecting lens. The area-array detector B is located in the center of the field of view of the optical system, so it receives the orthographic view light passing through the entrance slit S2. The area-array detector A receives the front view light passing through the entrance slit S1. The area-array detector C receives the back view light passing through the entrance slit S3. 
In one scan period of the moving CCM, the CDSIS simultaneously records the spectral images of three scan lines on the scene: the detector A records the front view spectral image of one scan line, the detector B records the orthographic view spectral image of a second scan line, and the detector C records the back view spectral image of a third scan line. When the CDSIS is spatially scanned perpendicular to the entrance slits, the detector A records a consecutive front view spectral image of the scene, the detector B a consecutive orthographic view spectral image, and the detector C a consecutive back view spectral image. Namely, the CDSIS records three consecutive spectral images of the scene, from the front view, orthographic view and back view, when it is spatially scanned perpendicular to the entrance slits.


Fig. 1 Equivalent orthographic view for the optics of the three-area-array coherent-dispersion stereo-imaging spectrometer (CDSIS). ZPD: zero path difference.



Fig. 2 Equivalent light path diagram of the three-area-array CDSIS. A, B and C: area-array detectors located in the back focal plane of the collecting lens. S1, S2 and S3: entrance slits located at the front focal plane of the collimating lens (back focal plane of the objective lens).


The CDSIS combines interferometric spectroscopy with dispersive spectroscopy to obtain the spectral information of an object or a scene: it integrates a moving corner-cube-mirror interferometer with a plane transmission grating. For each scene unit, the CDSIS simultaneously produces multiple interferograms in one scan period of the moving CCM, each interferogram covering a separate wavelength range and located in a separate pixel of a given column of one area-array detector. More specifically, for each scene unit, the transmission grating spreads the spectral information along a given column of one area-array detector, while the interferometer simultaneously generates multiple interferograms in one scan period of the moving CCM; each interferogram covers a separate wavelength range, controlled by the transmission grating and the detector geometry, and is located in a separate pixel of that column. In other words, the transmission grating divides a broad-band spectrum into a number of narrow-band spectra, each narrow-band spectrum falling on a separate pixel of a given column of the detector, and the interferometer simultaneously produces multiple interferograms for the broad-band spectrum in one scan period of the moving CCM, each interferogram corresponding to a different narrow-band spectrum. Thus any noise present in an optical signal is confined to the pixels on which that noisy signal impinges, and it has no effect on the interferograms of the other narrow-band spectra. Moreover, the center fringes are spread over multiple separate pixels (interferograms), each with a separate narrow-band spectrum, so the dynamic range requirement of the detector is reduced [69,70].
Therefore, the CDSIS has the advantage of providing the high spectral resolution of interferometry, while avoiding the multiplex disadvantage of interferometry for a broadband source in the ultraviolet-visible spectral region, and reducing the dynamic range requirement of the detector. The CDSIS can perform high spectral resolution measurements of a broadband source in the ultraviolet-visible spectral region.

Throughout this paper, rows of each area-array detector are parallel to the y-axis of the detector plane, columns of each area-array detector are parallel to the x-axis of the detector plane, λo is the center wavelength of a source spectrum covering a wavelength range from λ1 to λk (i.e., a wavenumber range from σmin = 1/λk to σmax = 1/λ1, with λ1 < λk), l is the displacement of the moving CCM from the zero path difference (ZPD) position, f is the focal length of the objective lens, f1 is the focal length of the collimating lens, and f2 is the focal length of the collecting lens.

In Fig. 1, typical orthographic view light rays emerging from two representative object points on a scan line (in Y-axis direction of the object coordinate system O-XYZ) of the scene are drawn. One point Pi(0,0,Zi) is located on the optical axis of the objective lens. The second point Pj(0,Y,Zj) is displaced by a distance Y from the optical axis of the objective lens. Moreover, six representative image points are shown: Pi(0,0) is the image point of object point Pi(0,0,Zi) on the back focal plane of the objective lens, Pj(0,y) is the image point of object point Pj(0,Y,Zj) on the back focal plane of the objective lens, Pi1(xB1,0) is the image point on area-array detector B of Pi(0,0,Zi) formed by wavelength λ1, Pik(xBk,0) is the image point on area-array detector B of Pi(0,0,Zi) formed by wavelength λk, Pj1(xB1,y) is the image point on area-array detector B of Pj(0,Y,Zj) formed by wavelength λ1, Pjk(xBk,y) is the image point on area-array detector B of Pj(0,Y,Zj) formed by wavelength λk. It can be obtained that

$$\tan\theta = \frac{y}{f},\tag{1}$$
$$\tan\eta = \frac{y}{f_1} = \frac{y'}{f_2}.\tag{2}$$

In Fig. 2, typical front view, orthographic view and back view light rays are received by the area-array detectors A, B and C, respectively. xA1, xAo and xAk are the x-axis coordinates on the detector plane for the wavelengths λ1, λo and λk of the front view light. xB1, xBo and xBk are the x-axis coordinates on the detector plane for the wavelengths λ1, λo and λk of the orthographic view light. xC1, xCo and xCk are the x-axis coordinates on the detector plane for the wavelengths λ1, λo and λk of the back view light. xA, xB and xC are the x-axis coordinates of the entrance slits S1, S2 and S3, respectively, on the back focal plane of the objective lens. ψ is the angle between the grating normal and the optical axis of the optical system. xB = 0 and xA − xB = xB − xC, so xA = −xC. It can be obtained that

$$\tan\phi_0 = \frac{x_A - x_B}{f} = \frac{x_A}{f},\tag{3}$$
$$\tan\rho = \frac{x_A - x_B}{f_1} = \frac{x_A}{f_1} = \frac{f}{f_1}\tan\phi_0.\tag{4}$$

Figure 3 shows the equivalent light path diagram from the transmission grating to the detector plane for the orthographic view light. The grating equation for a plane transmission grating is given by [71]

$$m\lambda_i = g\left[n(\lambda_i)\sin\alpha + \sin\theta_m(\lambda_i)\right],\tag{5}$$
where m is the diffraction order (an integer), λi is the wavelength of light, g is the groove spacing of the grating, n(λi) is the refractive index of the plane transmission grating at wavelength λi, α is the incidence angle measured from the grating normal, and θm(λi) is the m-th order diffraction angle measured from the grating normal at wavelength λi.
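As a quick numerical illustration (a sketch, not part of the paper; Python is used here rather than the MATLAB mentioned in Section 3), the grating equation can be solved directly for the first-order diffraction angle, assuming the Section 3 parameters (fused-silica grating, 150 grooves/mm, ψ = 10°) and the Sellmeier index given in Section 3:

```python
import numpy as np

def n_fused_silica(lam_um):
    # Sellmeier relation for fused silica (Section 3), wavelength in micrometers
    l2 = lam_um ** 2
    return np.sqrt(1.0
                   + 0.6961663 * l2 / (l2 - 0.0684043 ** 2)
                   + 0.4079426 * l2 / (l2 - 0.1162414 ** 2)
                   + 0.8974794 * l2 / (l2 - 9.896161 ** 2))

def diffraction_angle_deg(lam_um, grooves_per_mm=150.0, psi_deg=10.0, m=1):
    # Grating equation m*lambda = g*[n(lambda)*sin(alpha) + sin(theta_m)],
    # solved for theta_m with the incidence angle alpha = psi
    g_um = 1000.0 / grooves_per_mm                    # groove spacing, micrometers
    s = m * lam_um / g_um - n_fused_silica(lam_um) * np.sin(np.radians(psi_deg))
    return np.degrees(np.arcsin(s))
```

With these values the diffraction angle increases monotonically with wavelength, which is what spreads the spectrum along a detector column.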

 figure: Fig. 3

Fig. 3 Equivalent light path diagram from grating to detector for the orthographic view light.

Download Full Size | PDF

Let the optical axis of the collecting lens coincide with the first-order diffracted ray from the grating at the center wavelength λo of the orthographic view light; thus the x-axis coordinate on the detector plane of the center wavelength λo of the orthographic view light is xBo = 0.

According to the grating equation, the characteristics of the lens and the geometry, it can be obtained that

$$\lambda_1 = g\left[n(\lambda_1)\sin\psi + \sin\theta_1(\lambda_1)\right],\tag{6}$$
$$\lambda_o = g\left[n(\lambda_o)\sin\psi + \sin\theta_1(\lambda_o)\right],\tag{7}$$
$$\lambda_k = g\left[n(\lambda_k)\sin\psi + \sin\theta_1(\lambda_k)\right],\tag{8}$$
$$\tan\beta(\lambda_1) = \tan\left[\theta_1(\lambda_o) - \theta_1(\lambda_1)\right] = \frac{x_{B1}}{f_2},\tag{9}$$
$$\tan\beta(\lambda_k) = \tan\left[\theta_1(\lambda_k) - \theta_1(\lambda_o)\right] = -\frac{x_{Bk}}{f_2}.\tag{10}$$
From Eqs. (6)-(10), it can be obtained that
$$x_{B1} = f_2\,\frac{\tau(\lambda_o)\sqrt{1-\tau^2(\lambda_1)} - \tau(\lambda_1)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\tau^2(\lambda_1)} + \tau(\lambda_o)\tau(\lambda_1)},\tag{11}$$
$$x_{Bk} = f_2\,\frac{\tau(\lambda_o)\sqrt{1-\tau^2(\lambda_k)} - \tau(\lambda_k)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\tau^2(\lambda_k)} + \tau(\lambda_o)\tau(\lambda_k)},\tag{12}$$
where
$$\tau(\lambda) = \frac{\lambda}{g} - n(\lambda)\sin\psi.\tag{13}$$
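Since τ(λ) = sin θ1(λ) by the grating equation, Eqs. (11) and (12) are simply the tangent of the diffraction-angle difference expanded with the sine-difference identity. A minimal sketch (assuming the Section 3 values: fused-silica grating, 150 grooves/mm, ψ = 10°, λo = 550 nm, f2 = 100 mm) can confirm this identity numerically:

```python
import numpy as np

def n_fs(lam_um):
    # Sellmeier index of fused silica (Section 3), wavelength in micrometers
    l2 = lam_um ** 2
    return np.sqrt(1 + 0.6961663*l2/(l2 - 0.0684043**2)
                     + 0.4079426*l2/(l2 - 0.1162414**2)
                     + 0.8974794*l2/(l2 - 9.896161**2))

g_um, psi = 1000.0/150.0, np.radians(10.0)   # 150 grooves/mm, psi = 10 deg
f2_mm = 100.0                                 # collecting-lens focal length

def tau(lam_um):
    # Eq. (13): tau(lambda) = sin(theta_1(lambda))
    return lam_um/g_um - n_fs(lam_um)*np.sin(psi)

def x_closed(lam_um, lam_o=0.55):
    # Closed form of Eqs. (11)-(12)
    t, to = tau(lam_um), tau(lam_o)
    num = to*np.sqrt(1 - t**2) - t*np.sqrt(1 - to**2)
    den = np.sqrt(1 - to**2)*np.sqrt(1 - t**2) + to*t
    return f2_mm * num/den

def x_angle(lam_um, lam_o=0.55):
    # Same quantity as f2 * tan(theta_1(lambda_o) - theta_1(lambda))
    return f2_mm * np.tan(np.arcsin(tau(lam_o)) - np.arcsin(tau(lam_um)))
```

The two forms agree to machine precision, and the center wavelength maps to xBo = 0 as required.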

For the front view light, the incidence angle measured from the grating normal is α = ψ − ρ (see Fig. 2), and it can be obtained that

$$x_{A1} = f_2\,\frac{\tau(\lambda_o)\sqrt{1-\varsigma^2(\lambda_1)} - \varsigma(\lambda_1)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\varsigma^2(\lambda_1)} + \tau(\lambda_o)\varsigma(\lambda_1)},\tag{14}$$
$$x_{Ao} = f_2\,\frac{\tau(\lambda_o)\sqrt{1-\varsigma^2(\lambda_o)} - \varsigma(\lambda_o)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\varsigma^2(\lambda_o)} + \tau(\lambda_o)\varsigma(\lambda_o)},\tag{15}$$
$$x_{Ak} = f_2\,\frac{\tau(\lambda_o)\sqrt{1-\varsigma^2(\lambda_k)} - \varsigma(\lambda_k)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\varsigma^2(\lambda_k)} + \tau(\lambda_o)\varsigma(\lambda_k)},\tag{16}$$
where
$$\varsigma(\lambda) = \frac{\lambda}{g} - n(\lambda)\sin(\psi-\rho) = \frac{\lambda}{g} - n(\lambda)\sin\left(\psi - \arctan\frac{x_A}{f_1}\right).\tag{17}$$

For the back view light, the incidence angle measured from the grating normal is α = ψ + ρ (see Fig. 2), and it can be obtained that

$$x_{C1} = f_2\,\frac{\tau(\lambda_o)\sqrt{1-\xi^2(\lambda_1)} - \xi(\lambda_1)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\xi^2(\lambda_1)} + \tau(\lambda_o)\xi(\lambda_1)},\tag{18}$$
$$x_{Co} = f_2\,\frac{\tau(\lambda_o)\sqrt{1-\xi^2(\lambda_o)} - \xi(\lambda_o)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\xi^2(\lambda_o)} + \tau(\lambda_o)\xi(\lambda_o)},\tag{19}$$
$$x_{Ck} = f_2\,\frac{\tau(\lambda_o)\sqrt{1-\xi^2(\lambda_k)} - \xi(\lambda_k)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\xi^2(\lambda_k)} + \tau(\lambda_o)\xi(\lambda_k)},\tag{20}$$
where
$$\xi(\lambda) = \frac{\lambda}{g} - n(\lambda)\sin(\psi+\rho) = \frac{\lambda}{g} - n(\lambda)\sin\left(\psi + \arctan\frac{x_A}{f_1}\right).\tag{21}$$
According to Eq. (2), it can be obtained that

$$y' = \frac{f_2}{f_1}\,y.\tag{22}$$

The CDSIS is based on three-view stereo imaging to obtain three-dimensional spatial information (i.e., three-dimensional stereoscopic image) of a scene or an object. The CDSIS images each scene unit on a separate column of one area-array detector at each of the three views. More specifically, for each of the three views, each scene unit is imaged on a given column of one area-array detector, and different wavelengths are dispersed across different rows of that column. When the CDSIS is spatially scanned perpendicular to the entrance slits, it records three images of front view, orthographic view and back view of each scene unit. Figure 4 shows the equivalent schematic diagram of the three-view stereo imaging for the CDSIS. The instrument platform (e.g., satellite) flies from left to right along the X-axis direction of the object coordinate system O-XYZ. At time tA, an object point Pi(X,Y,Z) on a scan line (i.e., green thick line parallel to the Y-axis direction in Fig. 4) of the scene is imaged on the entrance slit S1, and the coordinates of this front view image point are (xA,yA). At time tB, the same object point Pi(X,Y,Z) is imaged on the entrance slit S2, and the coordinates of this orthographic view image point are (xB,yB). At time tC, the same object point Pi(X,Y,Z) is imaged on the entrance slit S3, and the coordinates of this back view image point are (xC,yC). With the flight of the instrument platform (e.g., satellite), three consecutive two-dimensional track images of front view, orthographic view and back view of the scene are recorded, respectively, by three area-array detectors (A, B and C). The orthographic view image is used to create a two-dimensional orthophoto image. The front view and back view images are used to reconstruct the three-dimensional stereoscopic image.


Fig. 4 Equivalent schematic diagram of the three-view stereo imaging for the CDSIS.


H is the flight height of the instrument platform (e.g., satellite), ϕ0 is the angle between the front view axis (back view axis) and the orthographic view axis, and d is the ground distance of the scene corresponding to the front view and the orthographic view (i.e., the instantaneous imaging half width of the scene in the flight direction). It can be obtained that

$$\tan\theta = \frac{Y}{H} = \frac{y}{f},\tag{23}$$
$$\tan\phi_0 = \frac{d}{H} = \frac{x_A}{f}.\tag{24}$$

The relationship between the coordinates of the image point on the back focal plane of the objective lens and the coordinates of the object point on the scene can be expressed by a matrix of image construction, if they are both defined in the object coordinate system O-XYZ. The matrix of image construction can be written as

$$\begin{bmatrix} \cos\phi_0 & 0 & \sin\phi_0 \\ 0 & 1 & 0 \\ -\sin\phi_0 & 0 & \cos\phi_0 \end{bmatrix}\begin{bmatrix} 0 \\ y \\ -f \end{bmatrix} = \begin{bmatrix} -f\sin\phi_0 \\ y \\ -f\cos\phi_0 \end{bmatrix} = \gamma R^{\mathrm{T}}\begin{bmatrix} X - X_S \\ Y - Y_S \\ Z - Z_S \end{bmatrix},\tag{25}$$
where γ is a scale factor, (XS, YS, ZS) are the coordinates of the instrument platform (e.g., satellite) in the object coordinate system O-XYZ, and R^T is the transpose of the rotation matrix R built from the direction cosines of the three axes of the instrument platform (e.g., satellite), whose attitude is described by the azimuth angles φ, ω and κ. Let ak, bk, ck (k = 1, 2, 3) denote the direction cosines of the three axes of the instrument platform; the matrix R is given by
$$R = \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix},\tag{26}$$
where
$$\left.\begin{aligned} a_1 &= \cos\varphi\cos\kappa - \sin\varphi\sin\omega\sin\kappa \\ a_2 &= -\cos\varphi\sin\kappa - \sin\varphi\sin\omega\cos\kappa \\ a_3 &= -\sin\varphi\cos\omega \\ b_1 &= \cos\omega\sin\kappa \\ b_2 &= \cos\omega\cos\kappa \\ b_3 &= -\sin\omega \\ c_1 &= \sin\varphi\cos\kappa + \cos\varphi\sin\omega\sin\kappa \\ c_2 &= -\sin\varphi\sin\kappa + \cos\varphi\sin\omega\cos\kappa \\ c_3 &= \cos\varphi\cos\omega \end{aligned}\right\}\tag{27}$$
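The direction cosines of Eq. (27) form the standard photogrammetric rotation matrix. A minimal sketch (with illustrative attitude angles, not values from the paper) can verify that R built this way is orthonormal with unit determinant, as a rotation matrix must be:

```python
import numpy as np

def rotation_matrix(phi, omega, kappa):
    # Rotation matrix R of Eqs. (26)-(27); angles in radians
    sp, cp = np.sin(phi), np.cos(phi)
    so, co = np.sin(omega), np.cos(omega)
    sk, ck = np.sin(kappa), np.cos(kappa)
    return np.array([
        [ cp*ck - sp*so*sk, -cp*sk - sp*so*ck, -sp*co],
        [ co*sk,             co*ck,            -so   ],
        [ sp*ck + cp*so*sk, -sp*sk + cp*so*ck,  cp*co],
    ])

R = rotation_matrix(0.1, -0.2, 0.3)   # illustrative attitude angles (rad)
```

Orthonormality (R Rᵀ = I, det R = 1) is what allows R to be inverted by simple transposition in Eqs. (30) and (31).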

Solving the matrix equation of image construction yields the collinearity equations
$$x = -f\,\frac{a_1(X-X_S) + b_1(Y-Y_S) + c_1(Z-Z_S)}{a_3(X-X_S) + b_3(Y-Y_S) + c_3(Z-Z_S)},\tag{28}$$
$$y = -f\,\frac{a_2(X-X_S) + b_2(Y-Y_S) + c_2(Z-Z_S)}{a_3(X-X_S) + b_3(Y-Y_S) + c_3(Z-Z_S)},\tag{29}$$
$$X = \frac{a_1 x + a_2 y - a_3 f}{c_1 x + c_2 y - c_3 f}\,(Z - Z_S) + X_S,\tag{30}$$
$$Y = \frac{b_1 x + b_2 y - b_3 f}{c_1 x + c_2 y - c_3 f}\,(Z - Z_S) + Y_S.\tag{31}$$

According to Eqs. (28) and (29) together with Eqs. (13), (15), (17), (19), (21) and (22), for the front view and back view of an object point Pi(X,Y,Z) on the scene, we obtain four equations and use the least-squares method to solve for the three unknowns X, Y and Z.
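The four-equation least-squares solution can be sketched as follows. Each view contributes the two collinearity equations (28)-(29), which become linear in (X, Y, Z) after cross-multiplying; two views give a 4 x 3 system. The station positions, attitude model and ground point below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def project(P, S, R, f):
    # Collinearity Eqs. (28)-(29): image coordinates of ground point P seen
    # from station S with rotation matrix R (rows [a1 a2 a3], [b1 b2 b3], [c1 c2 c3])
    q = R.T @ (P - S)
    return -f * q[0] / q[2], -f * q[1] / q[2]

def triangulate(observations, f):
    # Stack the collinearity equations of all views in linear form
    # and solve the overdetermined system for (X, Y, Z) by least squares
    rows, rhs = [], []
    for x, y, S, R in observations:
        Rt = R.T
        rows += [x * Rt[2] + f * Rt[0], y * Rt[2] + f * Rt[1]]
        rhs += [rows[-2] @ S, rows[-1] @ S]
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return sol

def tilt_y(a):
    # Rotation by angle a about the Y-axis (assumed simple attitude model)
    return np.array([[np.cos(a), 0.0, np.sin(a)],
                     [0.0, 1.0, 0.0],
                     [-np.sin(a), 0.0, np.cos(a)]])

# Illustrative two-view geometry (units: mm)
f, phi0, H = 50.0, np.radians(16.7), 200e6
P_true = np.array([1000.0, 2000.0, 500.0])           # ground point
S_A = np.array([-H * np.tan(phi0), 0.0, H])          # station for the front view
S_C = np.array([ H * np.tan(phi0), 0.0, H])          # station for the back view
views = [(S_A, tilt_y(phi0)), (S_C, tilt_y(-phi0))]
obs = [(*project(P_true, S, R, f), S, R) for S, R in views]
P_est = triangulate(obs, f)
```

With noise-free observations the two rays intersect exactly, so the least-squares solution recovers the ground point; with real registration noise the residual drives the elevation uncertainty of Eq. (36).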

Assume that the pixel size of each area-array detector is b, and the number of pixels in each row (in y-axis direction) of each area-array detector is N. Based on Eqs. (22) and (23), the ground sample distance (GSD) across the flight direction (i.e., the spatial resolution of the CDSIS in Y-axis direction) is given by

$$\mathrm{GSD} = \frac{H b f_1}{f f_2},\tag{32}$$
the field of view (FOV) across the flight direction is given by
$$\mathrm{FOV}_Y = 2\theta_{\max} = 2\arctan\left(\frac{N b f_1}{2 f f_2}\right),\tag{33}$$
the ground imaging width across the flight direction is
$$W = N\cdot\mathrm{GSD} = \frac{H N b f_1}{f f_2},\tag{34}$$
and the length of the entrance slit is

$$L_S = \frac{N b f_1}{f_2}.\tag{35}$$

The elevation uncertainty in Z-axis direction is given by

$$\Delta Z = \frac{k\cdot\mathrm{GSD}}{2d/H},\tag{36}$$
where k is a coefficient that is related to the registration accuracy of the front view and back view images when reconstructing the stereoscopic image.

According to Eqs. (11) and (12), the size of each column (in x-axis direction) of the area-array detector B must be greater than |xB1 − xBk|, and the number of pixels in each column of the area-array detector B used to acquire multiple interferograms should be

$$M_B \geq \left|x_{B1} - x_{Bk}\right|/b.\tag{37}$$

According to Eqs. (14) and (16), the size of each column (in x-axis direction) of the area-array detector A must be greater than |xA1 − xAk|, and the number of pixels in each column of the area-array detector A used to acquire multiple interferograms should be

$$M_A \geq \left|x_{A1} - x_{Ak}\right|/b.\tag{38}$$

According to Eqs. (18) and (20), the size of each column (in x-axis direction) of the area-array detector C must be greater than |xC1 − xCk|, and the number of pixels in each column of the area-array detector C used to acquire multiple interferograms should be

$$M_C \geq \left|x_{C1} - x_{Ck}\right|/b.\tag{39}$$
Namely, for each scene unit, MB separate interferograms are simultaneously generated and recorded by a separate column of the detector B, MA separate interferograms are simultaneously generated and recorded by a separate column of the detector A, and MC separate interferograms are simultaneously generated and recorded by a separate column of the detector C. Furthermore, for each scene unit, the interferogram q recorded by row q of a given column of the detector B covers a wavelength range from λq1 to λqn, which is determined by the condition |xBq1 − xBqn| ≤ b, where xBq1 and xBqn are the x-axis coordinates on the detector plane for the wavelengths λq1 and λqn of the orthographic view light.

The optical path difference (OPD) for each interferogram recorded by area-array detector B and generated by the light emerging from the object point Pj(0,Y,Zj) is given by [21,28]

$$\mathrm{OPD}_B(l, y') = 2l\cos\eta = \frac{2l}{\sqrt{1+\tan^2\eta}} = \frac{2l}{\sqrt{1+(y'/f_2)^2}}.\tag{40}$$

The interferogram IB(l, y′) recorded by area-array detector B and generated by the light emerging from the object point Pj(0,Y,Zj) is given by

$$I_B(l, y') = \int_0^{\infty} B(\sigma)\left[1 + \cos\left(\frac{4\pi\sigma l}{\sqrt{1+(y'/f_2)^2}}\right)\right]\mathrm{d}\sigma,\tag{41}$$
where σ is the wavenumber, B(σ) is the input spectral intensity at wavenumber σ, and l is the displacement of the moving CCM from the zero path difference (ZPD) position. For the object point Pi(0,0,Zi), y′ = 0, the optical path difference simplifies to OPD_B(l,0) = 2l, and Eq. (41) simplifies to $I_B(l,0) = \int_0^{\infty} B(\sigma)\left[1 + \cos(4\pi\sigma l)\right]\mathrm{d}\sigma$.
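A minimal numerical sketch of the Eq. (41) measurement for the on-axis point (y′ = 0), using the Section 3 sampling parameters (σmax = 25000 cm⁻¹, χ = 1/(2σmax), 25000 samples over 2lmax = 0.5 cm); the single 450 nm test line is an assumption for illustration:

```python
import numpy as np

sigma_max = 25000.0             # cm^-1 (corresponds to lambda_1 = 400 nm)
chi = 1.0 / (2.0 * sigma_max)   # OPD sampling interval, cm (= lambda_1 / 2)
K = 25000                       # samples per unilateral interferogram
opd = np.arange(K) * chi        # OPD grid from 0 to 2*l_max = 0.5 cm

sigma0 = 1e7 / 450.0            # a single test line at 450 nm, in cm^-1
interferogram = 1.0 + np.cos(2.0 * np.pi * sigma0 * opd)   # I_B(l, 0), OPD = 2l

spectrum = np.abs(np.fft.rfft(interferogram))
k_peak = int(np.argmax(spectrum[1:])) + 1   # skip the DC term
sigma_rec = k_peak / (K * chi)              # recovered wavenumber, cm^-1
```

The Fourier transform recovers the line position to well within the designed 2 cm⁻¹ resolution, mirroring the MATLAB simulation of Figs. 6-8.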

The OPD for each interferogram recorded by area-array detector A (or C) and generated by the light emerging from the object point Pk(d,Y,Zk) is given by

$$\mathrm{OPD}_A(l, y') = 2\sqrt{\left(\frac{l}{\cos\rho}\right)^2 + \left(l\tan\eta\right)^2} = 2l\sqrt{1+\left(\frac{f}{f_1}\tan\phi_0\right)^2 + \left(\frac{y'}{f_2}\right)^2}.\tag{42}$$

The interferogram IA(l, y′) recorded by area-array detector A (or C) and generated by the light emerging from the object point Pk(d,Y,Zk) of the scene is given by

$$I_A(l, y') = \int_0^{\infty} B(\sigma)\left[1 + \cos\left(4\pi\sigma l\sqrt{1+\left(\frac{f}{f_1}\tan\phi_0\right)^2 + \left(\frac{y'}{f_2}\right)^2}\right)\right]\mathrm{d}\sigma.\tag{43}$$

The theoretical spectral resolution δσB for the orthographic view spectral image of the three-area-array CDSIS can be calculated by

$$\delta\sigma_B = \frac{1}{2\,\mathrm{OPD}_{B\max}(l,0)} = \frac{1}{4 l_{\max}},\tag{44}$$
where lmax is the maximum displacement of the moving CCM from the ZPD position. Furthermore, truncation and apodization of the interferogram (e.g., with a triangular function), applied to improve the signal-to-noise ratio of the recovered spectrum, also reduce the spectral resolution. In practice, the designed spectral resolution for the orthographic view spectral image of the CDSIS can be given by

$$\delta\sigma_B = \frac{1}{2 l_{\max}}.\tag{45}$$

The theoretical spectral resolution δσA for the front view (or back view) spectral image of the three-area-array CDSIS can be calculated by

$$\delta\sigma_A = \frac{1}{2\,\mathrm{OPD}_{A\max}(l,0)} = \frac{f_1}{4 l_{\max}\sqrt{f_1^2 + (f\tan\phi_0)^2}}.\tag{46}$$

In practice, the designed spectral resolution for the front view (or back view) spectral image of the CDSIS can be given by

$$\delta\sigma_A = \frac{f_1}{2 l_{\max}\sqrt{f_1^2 + (f\tan\phi_0)^2}}.\tag{47}$$

For a source spectrum covering a wavelength range from λ1 to λk (λ1 < λk), the maximum wavenumber of the source spectrum is σmax = 1/λ1. According to the Nyquist criterion, the sampling interval (in optical path difference) for each interferogram produced by the CDSIS can, for convenience, be chosen as χ = 1/(2σmax) = λ1/2. The number of sampling points for each unilateral interferogram recorded by the area-array detector B and generated by the light emerging from the object point Pj(0,Y,Zj) can be given by

$$K_B(y') = \frac{\mathrm{OPD}_{B\max}(l,y')}{\chi} = \frac{4\sigma_{\max} l_{\max}}{\sqrt{1+(y'/f_2)^2}},\tag{48}$$
and the number of sampling points for each unilateral interferogram recorded by the area-array detector A (or C) and generated by the light emerging from the object point Pk(d,Y,Zk) can be given by
$$K_A(y') = \frac{\mathrm{OPD}_{A\max}(l,y')}{\chi} = 4\sigma_{\max} l_{\max}\sqrt{1+\left(\frac{f}{f_1}\tan\phi_0\right)^2 + \left(\frac{y'}{f_2}\right)^2}.\tag{49}$$
Therefore, the resolving power for each interferogram recorded by the area-array detector B and generated by the light emerging from the object point Pj(0,Y,Zj) is determined by RB=σmax/δσB=KB(y), and the resolving power for each interferogram recorded by the area-array detector A (or C) and generated by the light emerging from the object point Pk(d,Y,Zk) is determined by RA=σmax/δσA=KA(y). That is, the resolving power is determined by the number of sampling points for each unilateral interferogram.

Table 1 shows the comparisons of the three-area-array CDSIS, the coherent-dispersion imaging spectrometer (CDIS) in [50] and the coherent-dispersion spectrometer (CDS) in [59]. Table 2 shows the comparisons of the CDSIS, the interferometric imaging spectrometer, the dispersive imaging spectrometer, the color filter imaging spectrometer, and the snapshot spectral imager. Table 3 shows the comparisons of the CDSIS, the broadband snapshot imaging spectrometer (BSIS) in [46], the infrared broadband snapshot imaging spectrometer (IBSIS) in [47], and the orthogonal-dispersion device in [72].


Table 1. Comparisons of the CDSIS, the CDIS in [50], the CDS in [59], and the UVS in [60]


Table 2. Comparisons of the CDSIS, interferometric, dispersive, and color filter imaging spectrometers


Table 3. Comparisons of the CDSIS, the BSIS in [46], the IBSIS in [47], and the device in [72]

Both the Andor iDus and Newton CCD spectroscopy detectors from Oxford Instruments are suitable for the CDSIS, for example the Andor iDus 420 Series 1024 × 255 pixel spectroscopy CCD, the Andor Newton 920 Series 1024 × 255 pixel spectroscopy CCD, and the Andor Newton 940 Series 2048 × 512 pixel spectroscopy CCD [73,74].

The light is dispersed by a plane transmission grating rather than a prism mainly because the grating provides higher and more linear wavelength dispersion than the prism [60]. On the other hand, the prism offers a wider spectral range than the grating [75].

3. Preliminary theoretical calculation and numerical simulation

Assume that the source spectrum covers a wavelength range from 400 nm to 700 nm, i.e., a wavenumber range from 14285.7 cm−1 to 25000 cm−1. Table 4 shows the wavelength difference versus wavenumber difference for several wavelengths. If the desired spectral resolution of the CDSIS is 0.05 nm at 450 nm together with 0.1 nm at 700 nm, the designed spectral resolution (in wavenumber) of the CDSIS should be δσ = 2 cm−1. From Eq. (45), the maximum displacement of the moving CCM from the ZPD position should be lmax = 1/(2δσB) = 2.5 mm. Based on Eq. (48), the number of sampling points for each unilateral interferogram recorded by the detector B and generated by the light emerging from the object point Pi(0,0,Zi) is KB(0) = 4σmaxlmax = 4 × 25000 cm−1 × 0.25 cm = 25000.
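As a quick cross-check (a sketch, not part of the paper), these design numbers follow from σ = 1/λ, which gives the wavelength resolution dλ = λ²·dσ:

```python
d_sigma = 2.0                                        # designed resolution, cm^-1
sigma_max = 25000.0                                  # cm^-1 (400 nm)
l_max = 1.0 / (2.0 * d_sigma)                        # Eq. (45): 0.25 cm = 2.5 mm
K_B0 = 4.0 * sigma_max * l_max                       # Eq. (48) at y' = 0: 25000 samples

def dlam_nm(lam_nm, d_sigma_cm=d_sigma):
    # Wavelength resolution implied by a wavenumber resolution: dlam = lam^2 * dsigma
    lam_cm = lam_nm * 1e-7
    return lam_cm ** 2 * d_sigma_cm * 1e7            # back to nm
```

A uniform 2 cm⁻¹ wavenumber grid gives about 0.04 nm at 450 nm and about 0.098 nm at 700 nm, which meets the stated 0.05 nm and 0.1 nm targets.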


Table 4. Wavelength difference versus Wavenumber difference for several wavelengths

Suppose that the flight height of the instrument platform (e.g., satellite) is H = 200 km, the instantaneous imaging half width in the flight direction is d = 60 km, the ground imaging width across the flight direction is W = 120 km, the ground sample distance is GSD = 120 m, the pixel size of each area-array detector is b = 0.02 mm, the focal length of the objective lens is f = 50 mm, and the focal length of the collecting lens is f2 = 100 mm. Thus the focal length of the collimating lens is f1 = 150 mm, the angle between the front view axis (back view axis) and the orthographic view axis is ϕ0 ≈ 16.7°, the field of view across the flight direction is FOVY = 2arctan(0.3) ≈ 33.4°, the number of pixels in each row (in y-axis direction) of each area-array detector is N = W/GSD = 1000, and the length of the entrance slit is LS = 30 mm.
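These geometry numbers can be cross-checked against Eqs. (24) and (32)-(35) (a sketch; all lengths converted to millimeters):

```python
import numpy as np

H, d = 200e6, 60e6             # flight height and imaging half width, mm
W, GSD = 120e6, 120e3          # ground imaging width and ground sample distance, mm
b, f, f2 = 0.02, 50.0, 100.0   # pixel size, objective and collecting focal lengths, mm

f1 = GSD * f * f2 / (H * b)    # Eq. (32) solved for the collimating focal length
N = W / GSD                    # pixels per row, from Eq. (34)
L_S = N * b * f1 / f2          # entrance-slit length, Eq. (35)
phi0_deg = np.degrees(np.arctan(d / H))   # Eq. (24)
```

The computation reproduces f1 = 150 mm, N = 1000, LS = 30 mm and ϕ0 ≈ 16.7°.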

Fused silica is a proper material for the plane transmission grating, and its refractive index formula is given by [76]

$$n^2 = 1 + \frac{0.6961663\,\lambda^2}{\lambda^2 - 0.0684043^2} + \frac{0.4079426\,\lambda^2}{\lambda^2 - 0.1162414^2} + \frac{0.8974794\,\lambda^2}{\lambda^2 - 9.896161^2},\tag{50}$$
where λ is in micrometers.
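Equation (50) transcribes directly to code (wavelength in micrometers; the ≈1.46 check value below is standard fused-silica data, not a number from the paper):

```python
import numpy as np

def n_fused_silica(lam_um):
    # Eq. (50): Sellmeier refractive index of fused silica, wavelength in micrometers
    l2 = np.asarray(lam_um, dtype=float) ** 2
    n2 = (1.0
          + 0.6961663 * l2 / (l2 - 0.0684043 ** 2)
          + 0.4079426 * l2 / (l2 - 0.1162414 ** 2)
          + 0.8974794 * l2 / (l2 - 9.896161 ** 2))
    return np.sqrt(n2)
```

Over 400-700 nm the index decreases slowly with wavelength, which is why n(λ) must be kept inside the grating equation rather than treated as a constant.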

Assume that the angle between the grating normal and the optical axis of the optical system is ψ = 10°, the center wavelength is λo = 550 nm, and the transmission grating has 150 grooves/mm. According to Eq. (4), Eqs. (11)–(21) and Eq. (50), the x-axis coordinates on the detector plane of the wavelengths from λ1 = 400 nm to λk = 700 nm for the front view light, orthographic view light and back view light are shown in Fig. 5. The number of pixels in each column of area-array detector B should be MB ≥ |xB1 − xBk|/b = |2.3136 − (−2.3009)|/0.02 = 230.725. Namely, for each scene unit, MB = 231 separate interferograms are simultaneously generated and recorded by a separate column of the detector B. The number of pixels in each column of area-array detector A should be MA ≥ |xA1 − xAk|/b = |(−15.1632) − (−19.8078)|/0.02 = 232.23. The number of pixels in each column (in x-axis direction) of the area-array detector C should be MC ≥ |xC1 − xCk|/b = |20.4176 − 15.3769|/0.02 = 252.035.
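The pixel-count arithmetic of Eqs. (37)-(39) can be reproduced directly from the Fig. 5 end-point coordinates (signs as implied by the absolute-difference values quoted above):

```python
import math

b = 0.02                            # pixel size, mm
x_B1, x_Bk = 2.3136, -2.3009        # 400 nm and 700 nm coordinates, detector B (mm)
x_A1, x_Ak = -15.1632, -19.8078     # detector A (mm)
x_C1, x_Ck = 20.4176, 15.3769       # detector C (mm)

M_B = abs(x_B1 - x_Bk) / b          # Eq. (37)
M_A = abs(x_A1 - x_Ak) / b          # Eq. (38)
M_C = abs(x_C1 - x_Ck) / b          # Eq. (39)
```

Rounding MB up gives the 231 interferograms per column quoted for detector B.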


Fig. 5 The x-axis coordinates on the detector plane of different wavelengths for the front view light, orthographic view light and back view light.


According to Eqs. (11)–(13) and Eq. (41), three representative interferograms, recorded simultaneously by a given column of area-array detector B in one scan period of the moving CCM and generated by the light emerging from object point Pi(0,0,Zi), are shown in Fig. 6. The first representative interferogram contains only the wavelengths 449.95 nm, 450 nm, 450.05 nm and 450.1 nm. The second contains only the wavelengths 549.93 nm, 550 nm, 550.07 nm and 550.14 nm. The third contains only the wavelengths 699.7 nm, 699.8 nm, 699.9 nm and 700 nm. Figure 7 shows the spectrum obtained from the Fourier transform of the three interferograms shown in Fig. 6 and its distribution in a given column of area-array detector B.


Fig. 6 Several interferograms recorded simultaneously by area-array detector B in one scan period of the moving CCM for object point Pi(0,0,Zi) when the spectral resolution is 2 cm−1.



Fig. 7 Spectrum obtained from Fourier transform of the three interferograms in Fig. 6 and its distribution in a given column of area-array detector B.


Figure 8 shows the detailed spectrum obtained from the Fourier transform of the three interferograms shown in Fig. 6. To show the details, Fig. 8 contains three pictures: the top picture is obtained from the Fourier transform of the first interferogram in Fig. 6 (449.95 nm, 450 nm, 450.05 nm and 450.1 nm); the middle picture from the second interferogram (549.93 nm, 550 nm, 550.07 nm and 550.14 nm); and the bottom picture from the third interferogram (699.7 nm, 699.8 nm, 699.9 nm and 700 nm). In each picture, from left to right, the four peaks correspond to the spectra of the four wavelengths listed above, and all twelve spectral peaks are clearly resolved. It follows that the spectral resolution of the CDSIS for the orthographic view spectral image is better than 0.05 nm at 450 nm and 0.07 nm at 550 nm, together with 0.1 nm at 700 nm. Figures 6–8 show simulation results obtained with MATLAB.


Fig. 8 Detailed spectra obtained from the Fourier transforms of the three interferograms in Fig. 6.


4. Conclusion

A three-area-array coherent-dispersion stereo-imaging spectrometer (CDSIS) was presented and preliminary theoretical calculations were given. The CDSIS records three consecutive spectral images of the front view, orthographic view and back view of the scene as it is spatially scanned perpendicular to the entrance slits. The orthographic view image is used to create a two-dimensional orthophoto image; the front view and back view images are used to reconstruct the three-dimensional stereoscopic image. Compared with existing imaging spectrometers, which acquire only two-dimensional spatial information and one-dimensional spectral information, the most important advantage of the CDSIS is that it acquires both three-dimensional spatial information and one-dimensional spectral information. A second advantage is that it provides the high spectral resolution of interferometry while avoiding the multiplex disadvantage of interferometry for a broadband source in the ultraviolet-visible region. A third advantage, compared with traditional interferometry, is that the dynamic range requirement on the area-array detector is reduced. These benefits come with tradeoffs in three respects: (1) the joint spatial-temporal scan lengthens data collection, (2) the moving part reduces stability against various disturbances, and (3) the entrance slits mean the instrument does not benefit from the throughput advantage. The CDSIS is a unique concept that not only acquires three-dimensional spatial information and one-dimensional spectral information, but also performs high-spectral-resolution measurements of a broadband source in the ultraviolet-visible region. The CDSIS will be suitable for hyperspectral earth remote sensing or space exploration.

Funding

National Natural Science Foundation of China (NSFC) (61605151).

References

1. A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for Earth remote sensing,” Science 228(4704), 1147–1153 (1985). [CrossRef]   [PubMed]  

2. A. F. H. Goetz, “Three decades of hyperspectral remote sensing of the Earth: a personal view,” Remote Sens. Environ. 113, S5–S16 (2009). [CrossRef]  

3. Q. Li, X. He, Y. Wang, H. Liu, D. Xu, and F. Guo, “Review of spectral imaging technology in biomedical engineering: achievements and challenges,” J. Biomed. Opt. 18(10), 100901 (2013). [CrossRef]   [PubMed]  

4. C. L. Wyatt, “Infrared spectrometer: liquid-helium-cooled rocketborne circular-variable filter,” Appl. Opt. 14(12), 3086–3091 (1975). [CrossRef]   [PubMed]  

5. H. R. Morris, C. C. Hoyt, and P. J. Treado, “Imaging spectrometers for fluorescence and Raman microscopy: acousto-optic and liquid crystal tunable filters,” Appl. Spectrosc. 48(7), 857–866 (1994). [CrossRef]  

6. S. C. Gebhart, R. C. Thompson, and A. Mahadevan-Jansen, “Liquid-crystal tunable filter spectral imaging for brain tumor demarcation,” Appl. Opt. 46(10), 1896–1910 (2007). [CrossRef]   [PubMed]  

7. Z. Zheng, G. Yang, H. Li, and X. Liu, “Three-stage Fabry-Perot liquid crystal tunable filter with extended spectral range,” Opt. Express 19(3), 2158–2164 (2011). [CrossRef]   [PubMed]  

8. M. Abuleil and I. Abdulhalim, “Narrowband multispectral liquid crystal tunable filter,” Opt. Lett. 41(9), 1957–1960 (2016). [CrossRef]   [PubMed]  

9. D. A. Glenar, J. J. Hillman, B. Saif, and J. Bergstralh, “Acousto-optic imaging spectropolarimetry for remote sensing,” Appl. Opt. 33(31), 7412–7424 (1994). [CrossRef]   [PubMed]  

10. N. Gupta and V. Voloshinov, “Hyperspectral imager, from ultraviolet to visible, with a KDP acousto-optic tunable filter,” Appl. Opt. 43(13), 2752–2759 (2004). [CrossRef]   [PubMed]  

11. N. Gupta and V. Voloshinov, “Hyperspectral imaging performance of a TeO2 acousto-optic tunable filter in the ultraviolet region,” Opt. Lett. 30(9), 985–987 (2005). [CrossRef]   [PubMed]  

12. P. Wang and Z. Zhang, “Double-filtering method based on two acousto-optic tunable filters for hyperspectral imaging application,” Opt. Express 24(9), 9888–9895 (2016). [CrossRef]   [PubMed]  

13. H. Zhao, Z. Ji, G. Jia, Y. Zhang, Y. Li, and D. Wang, “MWIR thermal imaging spectrometer based on the acousto-optic tunable filter,” Appl. Opt. 56(25), 7269–7276 (2017). [CrossRef]   [PubMed]  

14. S. Manohar and D. Razansky, “Photoacoustics: a historical review,” Adv. Opt. Photonics 8(4), 586–617 (2016). [CrossRef]  

15. O. I. Korablev, D. A. Belyaev, Y. S. Dobrolenskiy, A. Y. Trokhimovskiy, and Y. K. Kalinnikov, “Acousto-optic tunable filter spectrometers in space missions [Invited],” Appl. Opt. 57(10), C103–C119 (2018). [CrossRef]   [PubMed]  

16. W. M. Porter and H. T. Enmark, “A System Overview of the Airborne Visible/Infrared Imaging Spectrometer (Aviris),” Proc. SPIE 834, 22–31 (1987). [CrossRef]  

17. R. W. Basedow, D. C. Carmer, and M. E. Anderson, “HYDICE system: implementation and performance,” Proc. SPIE 2480, 258–267 (1995). [CrossRef]  

18. R. D. Swift, R. B. Wattson, J. A. Decker Jr., R. Paganetti, and M. Harwit, “Hadamard transform imager and imaging spectrometer,” Appl. Opt. 15(6), 1595–1609 (1976). [CrossRef]   [PubMed]  

19. W. H. Steel, Interferometry (Cambridge University, 1983).

20. J. Kauppinen and V. M. Horneman, “Large aperture cube corner interferometer with a resolution of 0.001 cm−1,” Appl. Opt. 30(18), 2575–2578 (1991). [CrossRef]   [PubMed]  

21. C. L. Bennett, M. R. Carter, D. J. Fields, and J. A. M. Hernandez, “Imaging Fourier transform spectrometer,” Proc. SPIE 1937, 191–200 (1993). [CrossRef]  

22. G. Durry and G. Guelachvili, “High-information time-resolved step-scan Fourier interferometer,” Appl. Opt. 34(12), 1971–1981 (1995). [CrossRef]   [PubMed]  

23. C. M. Snively, S. Katzenberger, G. Oskarsdottir, and J. Lauterbach, “Fourier-transform infrared imaging using a rapid-scan spectrometer,” Opt. Lett. 24(24), 1841–1843 (1999). [CrossRef]   [PubMed]  

24. C. Zhang, B. Xiangli, B. Zhao, and X. Yuan, “A static polarization imaging spectrometer based on a Savart polariscope,” Opt. Commun. 203(1), 21–26 (2002). [CrossRef]  

25. J. Kauppinen, J. Heinonen, and I. Kauppinen, “Interferometers Based on the Rotational Motion,” Appl. Spectrosc. Rev. 39(1), 99–130 (2004). [CrossRef]  

26. R. K. Chan, P. K. Lim, X. Wang, and M. H. Chan, “Fourier transform ultraviolet-visible spectrometer based on a beam-folding technique,” Opt. Lett. 31(7), 903–905 (2006). [CrossRef]   [PubMed]  

27. P. R. Griffiths and J. A. de Haseth, Fourier Transform Infrared Spectrometry (Wiley-Interscience, 2007).

28. Q. Yang, “Moving corner-cube mirror interferometer and reflection characteristic of corner-cube mirror,” Appl. Opt. 49(21), 4088–4095 (2010). [CrossRef]   [PubMed]  

29. Y. Ferrec, J. Taboury, H. Sauer, P. Chavel, P. Fournet, C. Coudrain, J. Deschamps, and J. Primot, “Experimental results from an airborne static Fourier transform imaging spectrometer,” Appl. Opt. 50(30), 5894–5904 (2011). [CrossRef]   [PubMed]  

30. Q. Yang, B. Zhao, and D. Wen, “Principle and analysis of a moving double-sided mirror interferometer,” Opt. Laser Technol. 44(5), 1256–1260 (2012). [CrossRef]  

31. Q. Yang, L. Liu, and P. Lv, “Principle of a two-output-difference interferometer for removing the most important interference distortions,” J. Mod. Opt. 65(19), 2234–2242 (2018). [CrossRef]  

32. M. W. Kudenov and E. L. Dereniak, “Compact real-time birefringent imaging spectrometer,” Opt. Express 20(16), 17973–17986 (2012). [CrossRef]   [PubMed]  

33. S. Pacheco and R. Liang, “Snapshot, reconfigurable multispectral and multi-polarization telecentric imaging system,” Opt. Express 22(13), 16377–16385 (2014). [CrossRef]   [PubMed]  

34. P. Wang and R. Menon, “Computational multispectral video imaging [Invited],” J. Opt. Soc. Am. A 35(1), 189–199 (2018). [CrossRef]   [PubMed]  

35. T. Okamoto and I. Yamaguchi, “Simultaneous acquisition of spectral image information,” Opt. Lett. 16(16), 1277–1279 (1991). [CrossRef]   [PubMed]  

36. T. Okamoto, A. Takahashi, and I. Yamaguchi, “Simultaneous Acquisition of Spectral and Spatial Intensity Distribution,” Appl. Spectrosc. 47(8), 1198–1202 (1993). [CrossRef]  

37. M. Descour and E. Dereniak, “Computed-tomography imaging spectrometer: experimental calibration and reconstruction results,” Appl. Opt. 34(22), 4817–4826 (1995). [CrossRef]   [PubMed]  

38. M. R. Descour, C. E. Volin, E. L. Dereniak, K. J. Thome, A. B. Schumacher, D. W. Wilson, and P. D. Maker, “Demonstration of a high-speed nonscanning imaging spectrometer,” Opt. Lett. 22(16), 1271–1273 (1997). [CrossRef]   [PubMed]  

39. J. M. Mooney, V. E. Vickers, M. An, and A. K. Brodzik, “High-throughput hyperspectral infrared camera,” J. Opt. Soc. Am. A 14(11), 2951–2961 (1997). [CrossRef]  

40. F. D. Shepherd, J. M. Mooney, T. E. Reeves, P. Dumont, M. M. Weeks, and S. DiSalvo, “Adaptive MWIR spectral imaging sensor,” Proc. SPIE 7055, 705506 (2008). [CrossRef]  

41. W. Bao, Z. Ding, P. Li, Z. Chen, Y. Shen, and C. Wang, “Orthogonal dispersive spectral-domain optical coherence tomography,” Opt. Express 22(8), 10081–10090 (2014). [CrossRef]   [PubMed]  

42. A. A. Wagadarikar, N. P. Pitsianis, X. Sun, and D. J. Brady, “Video rate spectral imaging using a coded aperture snapshot spectral imager,” Opt. Express 17(8), 6368–6388 (2009). [CrossRef]   [PubMed]  

43. Y. Wu, I. O. Mirza, G. R. Arce, and D. W. Prather, “Development of a digital-micromirror-device-based multishot snapshot spectral imaging system,” Opt. Lett. 36(14), 2692–2694 (2011). [CrossRef]   [PubMed]  

44. Y. Murakami, M. Yamaguchi, and N. Ohyama, “Hybrid-resolution multispectral imaging using color filter array,” Opt. Express 20(7), 7173–7183 (2012). [CrossRef]   [PubMed]  

45. N. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng. 52(9), 090901 (2013). [CrossRef]  

46. Q. Yang, “Static broadband snapshot imaging spectrometer,” Opt. Eng. 52(5), 053003 (2013). [CrossRef]  

47. Q. Yang, “Compact static infrared broadband snapshot imaging spectrometer,” Chin. Opt. Lett. 12(3), 031201 (2014). [CrossRef]  

48. Y. Murakami, K. Nakazaki, and M. Yamaguchi, “Hybrid-resolution spectral video system using low-resolution spectral sensor,” Opt. Express 22(17), 20311–20325 (2014). [CrossRef]   [PubMed]  

49. H. Rueda, D. Lau, and G. R. Arce, “Multi-spectral compressive snapshot imaging using RGB image sensors,” Opt. Express 23(9), 12207–12221 (2015). [CrossRef]   [PubMed]  

50. Q. Yang, “Broadband high-spectral-resolution ultraviolet-visible coherent-dispersion imaging spectrometer,” Opt. Express 26(16), 20777–20791 (2018). [CrossRef]   [PubMed]  

51. P. Connes and G. Michel, “Astronomical Fourier spectrometer,” Appl. Opt. 14(9), 2067–2084 (1975). [CrossRef]   [PubMed]  

52. T. Hirschfeld, “Fellgett’s Advantage in UV-VIS Multiplex Spectroscopy,” Appl. Spectrosc. 30(1), 68–69 (1976). [CrossRef]  

53. P. Luc and S. Gerstenkorn, “Fourier transform spectroscopy in the visible and ultraviolet range,” Appl. Opt. 17(9), 1327–1331 (1978). [CrossRef]   [PubMed]  

54. E. Voigtman and J. D. Winefordner, “The multiplex disadvantage and excess low-frequency noise,” Appl. Spectrosc. 41(7), 1182–1184 (1987). [CrossRef]  

55. L. W. Schumann and T. S. Lomheim, “Infrared hyperspectral imaging Fourier transform and dispersive spectrometers: comparison of signal-to-noise based performance [Invited],” Proc. SPIE 4480, 1–14 (2002). [CrossRef]  

56. P. B. Fellgett, “The nature and origin of multiplex Fourier spectrometry,” Notes Rec. R. Soc. 60(1), 91–93 (2006). [CrossRef]  

57. A. Barducci, D. Guzzi, C. Lastri, V. Nardino, P. Marcoionni, and I. Pippi, “Radiometric and signal-to-noise ratio properties of multiplex dispersive spectrometry,” Appl. Opt. 49(28), 5366–5373 (2010). [CrossRef]   [PubMed]  

58. A. Barducci, D. Guzzi, C. Lastri, P. Marcoionni, V. Nardino, and I. Pippi, “Theoretical aspects of Fourier Transform Spectrometry and common path triangular interferometers,” Opt. Express 18(11), 11622–11649 (2010). [CrossRef]   [PubMed]  

59. Q. Yang, “Coherent-dispersion spectrometer for the ultraviolet and visible regions,” Opt. Express 26(10), 12372–12386 (2018). [CrossRef]   [PubMed]  

60. Q. Yang, “Ultrahigh-resolution rapid-scan ultraviolet-visible spectrometer,” OSA Continuum 1(3), 812–821 (2018). [CrossRef]  

61. T. Okoshi, Three-Dimensional Imaging Techniques (Academic, 1976).

62. B. Javidi, F. Okano, and J. Y. Son, Three-Dimensional Imaging, Visualization, and Display (Springer, 2009).

63. N. T. Shaked, B. Katz, and J. Rosen, “Review of three-dimensional holographic imaging by multiple-viewpoint-projection based methods,” Appl. Opt. 48(34), H120–H136 (2009). [CrossRef]   [PubMed]  

64. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52(4), 546–560 (2013). [CrossRef]   [PubMed]  

65. M. A. Preciado, G. Carles, and A. R. Harvey, “Video-rate computational super-resolution and integral imaging at longwave-infrared wavelengths,” OSA Continuum 1(1), 170–180 (2018). [CrossRef]  

66. S. Komatsu, A. Markman, A. Mahalanobis, K. Chen, and B. Javidi, “Three-dimensional integral imaging and object detection using long-wave infrared imaging,” Appl. Opt. 56(9), D120–D126 (2017). [CrossRef]   [PubMed]  

67. A. Markman, X. Shen, and B. Javidi, “Three-dimensional object visualization and detection in low light illumination using integral imaging,” Opt. Lett. 42(16), 3068–3071 (2017). [CrossRef]   [PubMed]  

68. M. Martínez-Corral and B. Javidi, “Fundamentals of 3D imaging and displays: a tutorial on integral imaging, light-field, and plenoptic systems,” Adv. Opt. Photonics 10(3), 512–566 (2018). [CrossRef]  

69. J. V. Sweedler, R. D. Jalkian, G. R. Sims, and M. B. Denton, “Crossed Interferometric Dispersive Spectroscopy,” Appl. Spectrosc. 44(1), 14–20 (1990). [CrossRef]  

70. J. V. Sweedler, “The use of charge transfer device detectors and spatial interferometry for analytical spectroscopy,” https://arizona.openrepository.com/handle/10150/184683.

71. C. Palmer and E. Loewen, Diffraction Grating Handbook (Newport Corporation, 2005).

72. Q. Yang and W. Wang, “Compact orthogonal-dispersion device using a prism and a transmission grating,” J. Eur. Opt. Soc. 14(1), 8 (2018). [CrossRef]  

73. T. Huen, “Reflectance of thinly oxidized silicon at normal incidence,” Appl. Opt. 18(12), 1927–1932 (1979). [CrossRef]   [PubMed]  

74. K. Xu, “Monolithically integrated Si gate-controlled light-emitting device: science and properties,” J. Opt. 20(2), 024014 (2018). [CrossRef]  

75. E. V. Chandler, C. G. Durfee, and J. A. Squier, “Integrated spectrometer design with application to multiphoton microscopy,” Opt. Express 19(1), 118–127 (2011). [CrossRef]   [PubMed]  

76. M. Bass, G. Li, and E. V. Stryland, Handbook of Optics (McGraw-Hill, 2010).



Figures (8)

Fig. 1 Equivalent orthographic view for the optics of the three-area-array coherent-dispersion stereo-imaging spectrometer (CDSIS). ZPD: zero path difference.
Fig. 2 Equivalent light path diagram of the three-area-array CDSIS. A, B and C: area-array detectors located in the back focal plane of the collecting lens. S1, S2 and S3: entrance slits located at the front focal plane of the collimating lens (back focal plane of the objective lens).
Fig. 3 Equivalent light path diagram from grating to detector for the orthographic view light.
Fig. 4 Equivalent schematic diagram of the three-view stereo imaging for the CDSIS.
Fig. 5 The x-axis coordinates on the detector plane of different wavelengths for the front view light, orthographic view light and back view light.
Fig. 6 Several interferograms recorded simultaneously by area-array detector B in one scan period of the moving CCM for object point Pi(0,0,Zi) when the spectral resolution is 2 cm−1.
Fig. 7 Spectrum obtained from Fourier transform of the three interferograms in Fig. 6 and its distribution in a given column of area-array detector B.
Fig. 8 Detailed spectra obtained from the Fourier transforms of the three interferograms in Fig. 6.

Tables (4)

Table 1 Comparisons of the CDSIS, the CDIS in [50], the CDS in [59], and the UVS in [60]
Table 2 Comparisons of the CDSIS, interferometric, dispersive, and color filter imaging spectrometers
Table 3 Comparisons of the CDSIS, the BSIS in [46], the IBSIS in [47], and the device in [72]
Table 4 Wavelength difference versus wavenumber difference for several wavelengths

Equations (50)


(1) $\tan\theta = \dfrac{y}{f}$
(2) $\tan\eta = \dfrac{y'}{f_1} = \dfrac{y''}{f_2}$
(3) $\tan\phi_0 = \dfrac{x_A - x_B}{f} = \dfrac{x_A}{f}$
(4) $\tan\rho = \dfrac{x_A - x_B}{f_1} = \dfrac{x_A}{f_1} = \dfrac{f}{f_1}\tan\phi_0$
(5) $m\lambda_i = g\left[\, n(\lambda_i)\sin\alpha + \sin\theta_m(\lambda_i) \,\right]$
(6) $\lambda_1 = g\left[\, n(\lambda_1)\sin\psi + \sin\theta_1(\lambda_1) \,\right]$
(7) $\lambda_o = g\left[\, n(\lambda_o)\sin\psi + \sin\theta_1(\lambda_o) \,\right]$
(8) $\lambda_k = g\left[\, n(\lambda_k)\sin\psi + \sin\theta_1(\lambda_k) \,\right]$
(9) $\tan\beta(\lambda_1) = \tan\left[\theta_1(\lambda_o) - \theta_1(\lambda_1)\right] = \dfrac{x_{B1}}{f_2}$
(10) $\tan\beta(\lambda_k) = \tan\left[\theta_1(\lambda_k) - \theta_1(\lambda_o)\right] = \dfrac{x_{Bk}}{f_2}$
(11) $x_{B1} = f_2\,\dfrac{\tau(\lambda_o)\sqrt{1-\tau^2(\lambda_1)} - \tau(\lambda_1)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\tau^2(\lambda_1)} + \tau(\lambda_o)\tau(\lambda_1)}$
(12) $x_{Bk} = f_2\,\dfrac{\tau(\lambda_o)\sqrt{1-\tau^2(\lambda_k)} - \tau(\lambda_k)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\tau^2(\lambda_k)} + \tau(\lambda_o)\tau(\lambda_k)}$
(13) $\tau(\lambda) = \dfrac{\lambda}{g} - n(\lambda)\sin\psi$
(14) $x_{A1} = f_2\,\dfrac{\tau(\lambda_o)\sqrt{1-\varsigma^2(\lambda_1)} - \varsigma(\lambda_1)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\varsigma^2(\lambda_1)} + \tau(\lambda_o)\varsigma(\lambda_1)}$
(15) $x_{Ao} = f_2\,\dfrac{\tau(\lambda_o)\sqrt{1-\varsigma^2(\lambda_o)} - \varsigma(\lambda_o)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\varsigma^2(\lambda_o)} + \tau(\lambda_o)\varsigma(\lambda_o)}$
(16) $x_{Ak} = f_2\,\dfrac{\tau(\lambda_o)\sqrt{1-\varsigma^2(\lambda_k)} - \varsigma(\lambda_k)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\varsigma^2(\lambda_k)} + \tau(\lambda_o)\varsigma(\lambda_k)}$
(17) $\varsigma(\lambda) = \dfrac{\lambda}{g} - n(\lambda)\sin(\psi-\rho) = \dfrac{\lambda}{g} - n(\lambda)\sin\!\left(\psi - \arctan\dfrac{x_A}{f_1}\right)$
(18) $x_{C1} = f_2\,\dfrac{\tau(\lambda_o)\sqrt{1-\xi^2(\lambda_1)} - \xi(\lambda_1)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\xi^2(\lambda_1)} + \tau(\lambda_o)\xi(\lambda_1)}$
(19) $x_{Co} = f_2\,\dfrac{\tau(\lambda_o)\sqrt{1-\xi^2(\lambda_o)} - \xi(\lambda_o)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\xi^2(\lambda_o)} + \tau(\lambda_o)\xi(\lambda_o)}$
(20) $x_{Ck} = f_2\,\dfrac{\tau(\lambda_o)\sqrt{1-\xi^2(\lambda_k)} - \xi(\lambda_k)\sqrt{1-\tau^2(\lambda_o)}}{\sqrt{1-\tau^2(\lambda_o)}\sqrt{1-\xi^2(\lambda_k)} + \tau(\lambda_o)\xi(\lambda_k)}$
(21) $\xi(\lambda) = \dfrac{\lambda}{g} - n(\lambda)\sin(\psi+\rho) = \dfrac{\lambda}{g} - n(\lambda)\sin\!\left(\psi + \arctan\dfrac{x_A}{f_1}\right)$
(22) $y'' = \dfrac{f_2}{f_1}\,y'$
(23) $\tan\theta = \dfrac{Y}{H} = \dfrac{y}{f}$
(24) $\tan\phi_0 = \dfrac{d}{H} = \dfrac{x_A}{f}$
(25) $\begin{bmatrix} \cos\phi_0 & 0 & -\sin\phi_0 \\ 0 & 1 & 0 \\ \sin\phi_0 & 0 & \cos\phi_0 \end{bmatrix}\begin{bmatrix} 0 \\ y \\ -f \end{bmatrix} = \begin{bmatrix} f\sin\phi_0 \\ y \\ -f\cos\phi_0 \end{bmatrix} = \gamma\,R^{T}\begin{bmatrix} X - X_S \\ Y - Y_S \\ Z - Z_S \end{bmatrix}$
(26) $R = \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix}$
(27) $\left.\begin{aligned} a_1 &= \cos\varphi\cos\kappa - \sin\varphi\sin\omega\sin\kappa \\ a_2 &= -\cos\varphi\sin\kappa - \sin\varphi\sin\omega\cos\kappa \\ a_3 &= -\sin\varphi\cos\omega \\ b_1 &= \cos\omega\sin\kappa \\ b_2 &= \cos\omega\cos\kappa \\ b_3 &= -\sin\omega \\ c_1 &= \sin\varphi\cos\kappa + \cos\varphi\sin\omega\sin\kappa \\ c_2 &= -\sin\varphi\sin\kappa + \cos\varphi\sin\omega\cos\kappa \\ c_3 &= \cos\varphi\cos\omega \end{aligned}\right\}$
(28) $x' = -f\,\dfrac{a_1(X-X_S)+b_1(Y-Y_S)+c_1(Z-Z_S)}{a_3(X-X_S)+b_3(Y-Y_S)+c_3(Z-Z_S)}$
(29) $y' = -f\,\dfrac{a_2(X-X_S)+b_2(Y-Y_S)+c_2(Z-Z_S)}{a_3(X-X_S)+b_3(Y-Y_S)+c_3(Z-Z_S)}$
(30) $X = \left[\dfrac{a_1 x' + a_2 y' - a_3 f}{c_1 x' + c_2 y' - c_3 f}\right](Z - Z_S) + X_S$
(31) $Y = \left[\dfrac{b_1 x' + b_2 y' - b_3 f}{c_1 x' + c_2 y' - c_3 f}\right](Z - Z_S) + Y_S$
(32) $\mathrm{GSD} = \dfrac{H b f_1}{f f_2}$
(33) $\mathrm{FOV}_Y = 2\theta_{\max} = 2\arctan\!\left(\dfrac{N b f_1}{2 f f_2}\right)$
(34) $W = N\cdot\mathrm{GSD} = \dfrac{H N b f_1}{f f_2}$
(35) $L_S = \dfrac{N b f_1}{f_2}$
(36) $\Delta Z = k\times\dfrac{\mathrm{GSD}}{2d/H}$
(37) $M_B \geq \left| x_{B1} - x_{Bk} \right| / b$
(38) $M_A \geq \left| x_{A1} - x_{Ak} \right| / b$
(39) $M_C \geq \left| x_{C1} - x_{Ck} \right| / b$
(40) $\mathrm{OPD}_B(l,y) = \dfrac{2l}{\cos\eta} = 2l\sqrt{1+\tan^2\eta} = 2l\sqrt{1+\left(\dfrac{y}{f_2}\right)^2}$
(41) $I_B(l,y) = \displaystyle\int_0^{\infty} B(\sigma)\left[1+\cos\!\left(4\pi\sigma l\sqrt{1+\left(\dfrac{y}{f_2}\right)^2}\right)\right]d\sigma$
(42) $\mathrm{OPD}_A(l,y) = 2\sqrt{\left(\dfrac{l}{\cos\rho}\right)^2 + \left(l\tan\eta\right)^2} = 2l\sqrt{1+\left(\dfrac{f}{f_1}\tan\phi_0\right)^2+\left(\dfrac{y}{f_2}\right)^2}$
(43) $I_A(l,y) = \displaystyle\int_0^{\infty} B(\sigma)\left[1+\cos\!\left(4\pi\sigma l\sqrt{1+\left(\dfrac{f}{f_1}\tan\phi_0\right)^2+\left(\dfrac{y}{f_2}\right)^2}\right)\right]d\sigma$
(44) $\delta\sigma_B = \dfrac{1}{2\,\mathrm{OPD}_{B\max}(l,0)} = \dfrac{1}{4 l_{\max}}$
(45) $\delta\sigma_B = \dfrac{1}{2 l_{\max}}$
(46) $\delta\sigma_A = \dfrac{1}{2\,\mathrm{OPD}_{A\max}(l,0)} = \dfrac{f_1}{4 l_{\max}\sqrt{f_1^2+\left(f\tan\phi_0\right)^2}}$
(47) $\delta\sigma_A = \dfrac{f_1}{2 l_{\max}\sqrt{f_1^2+\left(f\tan\phi_0\right)^2}}$
(48) $K_B(y) = \dfrac{\mathrm{OPD}_{B\max}(l,y)}{\chi} = 4\sigma_{\max} l_{\max}\sqrt{1+\left(\dfrac{y}{f_2}\right)^2}$
(49) $K_A(y) = \dfrac{\mathrm{OPD}_{A\max}(l,y)}{\chi} = 4\sigma_{\max} l_{\max}\sqrt{1+\left(\dfrac{f}{f_1}\tan\phi_0\right)^2+\left(\dfrac{y}{f_2}\right)^2}$
(50) $n^2 = 1 + \dfrac{0.6961663\,\lambda^2}{\lambda^2 - 0.0684043^2} + \dfrac{0.4079426\,\lambda^2}{\lambda^2 - 0.1162414^2} + \dfrac{0.8974794\,\lambda^2}{\lambda^2 - 9.896161^2}$
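The last equation above is the standard three-term Sellmeier dispersion formula for fused silica, with λ in micrometres. A quick numerical check at the band edges used in the text (the function name is illustrative):

```python
import math

def fused_silica_n(lambda_um: float) -> float:
    """Sellmeier refractive index of fused silica; lambda in micrometres."""
    l2 = lambda_um ** 2
    n2 = (1
          + 0.6961663 * l2 / (l2 - 0.0684043 ** 2)
          + 0.4079426 * l2 / (l2 - 0.1162414 ** 2)
          + 0.8974794 * l2 / (l2 - 9.896161 ** 2))
    return math.sqrt(n2)

# Index at 450 nm and 700 nm; the dispersion n(450 nm) > n(700 nm)
# is what the grating equations above rely on.
print(round(fused_silica_n(0.45), 4), round(fused_silica_n(0.70), 4))
```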