
Catadioptric sensor concept with interlaced beam paths for imaging and pinpoint spectroscopy


Abstract

This paper presents the concept, optical design, and implementation of a catadioptric sensor for simultaneous imaging of a scene and pinpoint spectroscopy of a selected position, with object distances ranging from tens of centimeters to infinity and from narrow to wide adjustable viewing angles. The use of reflective imaging elements allows the implementation of folded and interlaced beam paths for spectroscopy and image acquisition, which enables a compact setup with a footprint of approximately ${90}\;{\rm mm} \times {80}\;{\rm mm}$. Although the wavelength range addressed extends far beyond the visible spectrum and reaches into the near infrared (${\sim}{400}\;{\rm nm}$ to 1000 nm), only three spherical surfaces are needed to project the intermediate image onto the image detector. The anamorphic imaging introduced by the folded beam path with different magnification factors in the horizontal and vertical directions as well as distortion can be compensated by software-based image processing. The area of the scene to be spectrally analyzed is imaged onto the input of an integrated miniature spectrometer. The imaging properties and spectroscopic characteristics are demonstrated in scenarios close to potential applications such as product sorting and fruit quality control.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. INTRODUCTION

For many applications, it is important to obtain both spatial and spectral information from an observed scene. In this field of “spectral imaging,” enormous progress has been made in recent years and various methods have been developed. Important applications of spectral imaging include medical diagnosis [1,2], biomedical research [3], food safety inspection and control [4,5], agriculture and forestry [6–8], and forensic detection [9,10], to name only a few. Since the instrumental requirements for spectral imaging are very specific to the respective applications, a wide variety of different technical solutions have been developed. Particularly sophisticated hyperspectral solutions, offering a highly differentiated spectrum for each highly resolved field position, comprise time-sequential techniques such as point scanning (“whiskbroom”), line scanning (“pushbroom”), and spectral band sequential (“staring”) techniques as well as “snapshot” spectral imaging. On the other side of the complexity scale are simple filter-based multispectral imaging techniques that use a camera with little more than three different color or wavelength channels [11–16]. Here, the use of plasmonic color filters has proven to be particularly successful [17–20].

The variety of very different technical implementations shows that there is no single optimum solution for spectral imaging that is equally suitable for all application scenarios. Instead, the individual technical specifications tend to contradict each other. Important criteria to be weighed against each other include wavelength range, spatial and spectral resolution, as well as acquisition time and throughput. Over-fulfillment of one criterion that is not necessary for a particular application usually leads to limitations in another, with additional aspects such as a compact design, robustness, and acceptable acquisition costs further complicating system selection. In particular, it is not always necessary to record the complete spectral information for each individual pixel. Often it is sufficient to select a specific local position from the scene, which is then spectrally analyzed in detail. The preselection of the position to be analyzed can be determined, for example, by size, shape, or texture from the image information. For this approach of combining imaging and local spectroscopy, there are examples in the literature that are instrumentally very specifically adapted to the respective application, e.g., for microscopy-based biological tissue characterization [21] or for endoscopic systems [22–24]. In [25], a multimodal measurement platform combining high-resolution imaging and local spectroscopy is presented; the integrated camera and spectrometers cover a wavelength range from 190 to 1700 nm. The platform is only suitable for samples at very small distances (microscopy) and involves a complex and bulky setup. A major disadvantage of the platform is the separate paths for imaging and spectroscopy, so that simultaneous observation of the same sample position is not possible. A specific system for observing a distant scene, suitable for studying detonation interactions, is presented in [26]. The instrument offers high performance with a spectral resolution of 0.24 nm but requires a very large installation space (length scale larger than 300 mm) and includes complex and expensive optical subsystems: a sophisticated telephoto lens (300 mm) for spectral detection and a zoom lens (100–300 mm) for the imaging path.

In this contribution, we present a new catadioptric concept of such a tailored optical system that enables simultaneous imaging with pinpoint spectroscopy of a targeted position. The device is designed to be used flexibly. It allows object distances from a few tens of centimeters to infinity and the adjustable imaging of a scene from narrow to wide viewing angles. Catadioptric approaches have recently been used for spectral imaging, e.g., to take advantage of the combination of omnidirectional and hyperspectral imaging [27], or for the realization of a compact short-wavelength infrared spectrometer using a catadioptrically curved prism to compensate for high-order aberrations [28]. Here we use the incorporation of reflective imaging elements to implement folded and interlaced beam paths for spectroscopy and image acquisition, thus enabling a compact setup. It is particularly advantageous that only three simple spherical surfaces are required to project the intermediate image onto the image detector, even though the wavelength range extends from 400 nm to the near-infrared range at 1000 nm. Furthermore, this catadioptric image acquisition system achieves appropriate field correction (especially astigmatism, coma, and field curvature) which would require much more complex systems (e.g., Petzval or landscape lenses) in classical refractive approaches [29].

In the first section, the basic concept of the system is presented, and the overall optical design, including both the imaging and the spectrometer channels, is discussed. The following section covers the implementation aspects of the system. In the final section, the performance of the system is demonstrated based on test measurements.

2. BASIC CONCEPT AND OPTICAL DESIGN

Figure 1 shows an overview of the optical sensor design. The optical design includes the interlaced channels for imaging and spectroscopy. The observed scene is to the left of the objective lens O at distances from 250 mm to infinity. The objective lens creates an intermediate image of the scene on the entrance surface of a cubic beam splitter (BS). A reticle (R) is fixed in front of the beam splitter so that the structure of the reticle coincides with the position of the intermediate image. The beam splitter then separates the imaging path (rays in dark blue) and the spectroscopic path (rays in red). The spectroscopic channel images the central region of the intermediate image onto the end of an optical fiber (F) via a single mirror (M1). Here, a numerical aperture of 0.2 is achieved and the image scale is ${\beta _{\rm{spec}}} = - {1}$. The fiber guides the light to the input aperture of a miniature spectrometer.


Fig. 1. Optical design of the sensor’s interlaced spectroscopic (red) and imaging (blue) beam paths. The objective lens O creates an intermediate image at the entrance surface of a beam splitter cube BS which coincides with the target structure of a reticle R. The sensor is able to address very different object distances (${\sim}{250}\;{\rm mm}$ to infinity) and narrow to wide adjustable viewing angles.


In the imaging path, the rays from the intermediate image plane pass through the beam splitter without deflection. The optical system for the imaging path comprises a refractive fused-silica lens (L), which is cemented directly onto the exit surface of the beam splitter, and three successive mirrors (M2–M4). More specifically, M2 has a concave curvature and M4 a convex curvature, while M3 is a flat mirror with no optical power, used only for beam redirection. The imaging beam path from the intermediate image plane to the detector thus requires a remarkably small number of only three beam-forming spherical surfaces. Classical refractive systems offering comparable performance over the addressed angular range would require an achromatic or, e.g., a Petzval-type multilens solution even for the limited visible spectral range [29]. Extending the spectral range to the NIR with refractive solutions would demand further lenses, which would in turn require special materials, including optical materials with anomalous relative partial dispersion. Instead of relying exclusively on reflective surfaces, a refractive element is used to achieve an optical effect at the smallest possible distance from the intermediate image. This allows correction of field curvature and keeps the beam diameter small, resulting in smaller diameters of the subsequent optical elements. Furthermore, it supports the correction of other field aberrations such as coma and astigmatism. The influence of the lens on axial chromatic aberration is small because of the low-dispersion material fused silica. Replacing the lens with another reflective surface would introduce larger distances and thus increase the installation space. The imaging path is anamorphic, with slightly different lateral magnification factors in the horizontal (${\beta _{\rm{hori}}} = - {2.74}$) and vertical (${\beta _{\rm{vert}}} = - {2.41}$) directions. In the imaging path, the numerical aperture is 0.1 on the intermediate image side and 0.036 (horizontal) and 0.041 (vertical) in the detector plane. Advantageously, the use of mirrors drastically reduces chromatic aberrations and allows the implementation of a folded beam path, which enables a compact setup. The basic footprint of the optical beam path covers an area of approximately ${90}\;{\rm mm} \times {80}\;{\rm mm}$, which is roughly the size of a smartphone. The mirror for the spectral path has a coating that covers a wide wavelength range from 370 to 1100 nm. The reflective coating for the imaging path is designed for the visible spectral range (400–700 nm) with a reflectivity above 95% and also achieves a good reflectivity above 92% in the NIR range. Because the spectroscopic detection is the more light-limited channel, a splitting ratio of 10/90 between the imaging and spectral channels is selected for the beam splitter.
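As a quick plausibility check, the detector-side numerical apertures follow from the intermediate-image-side aperture and the anamorphic magnifications via ${\rm NA}' = {\rm NA}/|\beta|$. A minimal Python sketch (our own illustration, not part of the sensor software; values taken from the text) reproduces the quoted values:

```python
# Detector-side NA from intermediate-image NA and anamorphic magnification,
# using NA_image = NA_object / |beta| (paraxial relation, air on both sides).
na_intermediate = 0.1                 # NA at the intermediate image (from the text)
beta_hori, beta_vert = -2.74, -2.41   # design magnifications (from the text)

na_hori = na_intermediate / abs(beta_hori)  # -> ~0.036 (horizontal)
na_vert = na_intermediate / abs(beta_vert)  # -> ~0.041 (vertical)
print(f"NA horizontal: {na_hori:.3f}, NA vertical: {na_vert:.3f}")
```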


Table 1. Optical Parameters, Distances, and Dimensions for All Optical Elements Involved, for Both the Spectroscopic and Imaging Paths (Starting from the Intermediate Image)


Fig. 2. (a) Subdivision of the observed scene into different areas: s, central area used for spectroscopic selection; ${{d}_x} \times {{d}_y}$, area of high spatial resolution; ${{r}_x} \times {{r}_y}$, image area with reduced resolution; ${{c}_x} \times {{c}_y}$, size of detector chip. (b) Numbers 1–8 indicating different field positions. In between and below (a) and (b): spot diagrams for the respective field positions and the corresponding Airy disk for reference (Airy disk radius: 10.1 µm). The gray numbers below each spot diagram give the corresponding RMS spot radii. Spot diagrams are shown for four different wavelengths from the blue visible to the near-infrared range (486 nm, 587 nm, 656 nm, and 852 nm).


The details on optical parameters, distances, and dimensions for both the spectroscopic path and the imaging path are summarized in Table 1.

The optical design approach allows a high degree of flexibility in the choice of objectives, depending on the size and distance of the scenes and objects to be investigated. Specifically, different lenses from the M12 lens series (S-mount) can be used, ranging from wide-angle to telephoto lenses (focal lengths approximately between 2 and 50 mm). In the setup described here, an M12 lens with 3.4 mm focal length and $f/{2.8}$ is used as the standard lens [30]. Alternatively, a selection of lenses with focal lengths of 5.4 mm, 16 mm, 25 mm, 35 mm, and 50 mm was available. A 1/2 in. image field (${6.4} \times {4.8}\;{\rm mm}$) is specified for the M12 lenses.
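For orientation, the object-side angle of view accessible with each interchangeable lens can be estimated from the specified 1/2 in. image field and the focal length. The following Python sketch is a paraxial estimate under our own assumptions (distant object, distortion neglected) and illustrates the range from wide-angle to telephoto:

```python
import math

# Full object-side angle of view for the specified 1/2 in. image field
# (6.4 mm x 4.8 mm) as a function of the M12 lens focal length.
field_w, field_h = 6.4, 4.8  # mm, specified image field (from the text)

for f in (2.0, 3.4, 5.4, 16.0, 25.0, 35.0, 50.0):  # mm, available lenses
    aov_h = 2 * math.degrees(math.atan(field_w / (2 * f)))
    aov_v = 2 * math.degrees(math.atan(field_h / (2 * f)))
    print(f"f = {f:5.1f} mm: {aov_h:5.1f} deg x {aov_v:5.1f} deg")
```

For the 3.4 mm standard lens this yields a horizontal full angle of roughly 87°, consistent with its classification as a wide-angle objective.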

The observed scene can be divided into different sections, which have different geometric dimensions in the intermediate image plane and in the final image plane. Figure 2(a) shows schematically the different sections and indicates the characteristic axes and orientations. The corresponding dimensions for both the intermediate image plane and the final detection plane are given in Table 2.


Table 2. Schematic Classification of Different Areas of the Observed Scene for Both the Intermediate Image Plane and the Final Detection Plane [For Designations, See Fig. 2(a)]

The central region (s) used for spectroscopic analysis corresponds to a diameter of 750 µm in the final imaging plane. A surrounding field of ${2.74}\;{\rm mm} \times {2.41}\;{\rm mm}$ is imaged with high spatial resolution (largely diffraction limited). A peripheral area of approximately ${5.48}\;{\rm mm} \times {3.37}\;{\rm mm}$ does not provide high resolution but can still be used, for example, for coarse image recognition or motion perception. The camera’s chip size of ${7.5}\;{\rm mm} \times {4.9}\;{\rm mm}$ would allow an even larger image section, but the image information would then no longer be usable due to the significantly lower resolution. In addition, the full chip size of the camera cannot be used due to artificial vignetting, which is caused by the cropping of the field by the small diameters of the lens and mirrors. The artificial vignetting causes a decrease of the relative illumination down to ${\sim}{70}\%$ at the edge of the reduced-resolution area.
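To relate these image-plane dimensions to the scene, the detector-plane sizes can be propagated back through the relay and the objective. The Python sketch below is a rough paraxial estimate under our own assumptions (variable names are ours; it uses the 3.4 mm standard lens and the ${\sim}{76}\;{\rm cm}$ object distance of the calibration example in Section 3) for the object-space footprint of the spectroscopic selection area:

```python
# Rough object-space footprint of the spectroscopic selection area.
# The 750 um spot in the final image plane is demagnified by the relay
# (|beta|) to the intermediate image, then scaled by the objective
# (approximately object_size ~ intermediate_size * L / f for L >> f).
spot_detector = 0.750        # mm, diameter in the final image plane
beta_h, beta_v = 2.74, 2.41  # |magnification| of the relay (design values)
f_obj = 3.4                  # mm, standard objective focal length
L = 760.0                    # mm, object distance of the calibration example

spot_int_h = spot_detector / beta_h  # ~0.27 mm in the intermediate image
spot_int_v = spot_detector / beta_v  # ~0.31 mm
print(f"object footprint: ~{spot_int_h * L / f_obj:.0f} mm x "
      f"{spot_int_v * L / f_obj:.0f} mm at {L / 1000:.2f} m")
```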

Figure 2(b) shows once again the different sections of the observed scene, with selected field positions now marked with numbers (1–8). Spot diagrams for four different wavelengths (486 nm, 587 nm, 656 nm, and 852 nm) are evaluated for all selected field positions. The corresponding spot diagrams marked with the numbers are also displayed in Fig. 2. The spot diagrams also show the diffraction-limited Airy disk at 656 nm for reference (black circle), which is slightly elliptical due to the different scale factors in the horizontal and vertical directions. The Airy radius calculated from the system’s working $f$-number is 10.1 µm. The root mean square (RMS) spot radii are denoted below each spot diagram (cf. Fig. 2). For better visibility, the reference sizes of the spot diagrams vary for the different field positions (see individual scale).

The spot diagrams in the bottom row of Fig. 2 belong to the central horizontal axis of the detection plane. The spot diagram numbered 1 represents the imaging center. The size of this spot diagram is comparable to the diameter of the Airy disk. The Strehl ratios for the four wavelengths (486 nm/587 nm/656 nm/852 nm) are 80.9%/97.9%/90.4%/75.1%, indicating high image quality. A closer look at the spot distributions reveals a small asymmetry between the left and right areas, while the upper and lower areas appear mirror symmetric. This can be explained by the orientation of the optical axis, which is not perpendicular to the center of the image plane: the optical axis is inclined by 15° in the plane of incidence containing the horizontal image axis, while there is no inclination with respect to the vertical axis. Spot diagrams 2 and 3 belong to field positions on opposite sides of the horizontal axis, each at a distance of ${\sim}{1.35}\;{\rm mm}$ from the center (image side). The extent of the two spot distributions is, except for the blue wavelength, within the range of the diameter of the Airy disk, which indicates high imaging quality. The corresponding Strehl ratios are 54.6%/93.0%/94.0%/84.3% (spot 2) and 45.3%/91.4%/96.4%/89.3% (spot 3). The lower performance at the blue wavelength is caused by axial chromatic aberration. It should be noted that a simple refocus would allow an even better performance for the acquisition of standard RGB images (NIR wavelengths blocked by a filter) at the price of a lower NIR performance. The distributions of the two spot diagrams are mirrored and show a similar shape but differ in detail. In particular, the extent of the spot distribution on the left side is larger than on the right side at short and medium wavelengths; these differences are again due to the tilted optical axis. The centroid shift between the spot diagrams of different wavelengths at a given field position can be attributed to chromatic aberrations caused by the lens L cemented to the beam splitter. This shift is disadvantageous for full-spectrum images but can be corrected for RGB images by image processing, resulting in smaller RMS radii and thus better image quality. At more distant field positions on the horizontal axis (positions 6 and 7), a significant increase in the extent of the spot diagrams can be observed (see scale bar). In these areas, high resolution is not achieved and the image information can only be used for rough recognition.

Here, the difference in the quality of the image content between the left and right sides caused by the tilted optical axis becomes more apparent. The spot diagrams in the central part of Fig. 2 correspond to field positions 4, 5, and 8. Although these field positions lie on the vertical axis running through the center of the image, no corresponding mirror symmetry can be seen in the respective spot distributions. The oblique orientation of the spot diagrams with respect to the coordinate system can be attributed to the non-perpendicular optical axis in combination with the vertical field position. At field positions 4 and 5, the RMS spot radii for all four wavelengths are still comparable with the Airy disk radius, but at field position 8 the imaging quality deteriorates.
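The quoted Airy reference radius can be verified from the working $f$-number, $N_w = 1/(2\,{\rm NA})$, and $r_{\rm Airy} = 1.22\,\lambda N_w$. The short Python check below (our own calculation from the values given above) brackets the 10.1 µm value between the horizontal and vertical apertures of the anamorphic path:

```python
# Airy disk radius r = 1.22 * lambda * N_w with working f-number
# N_w = 1 / (2 * NA). The two detector-side NAs of the anamorphic path
# bracket the 10.1 um reference radius quoted in Fig. 2.
wavelength = 656e-6  # mm (reference wavelength of the Airy circle)

for label, na in (("horizontal", 0.036), ("vertical", 0.041)):
    n_w = 1.0 / (2.0 * na)            # working f-number
    r_airy = 1.22 * wavelength * n_w  # mm
    print(f"{label}: N_w = {n_w:.1f}, Airy radius = {r_airy * 1e3:.1f} um")
```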

To further assess the quality of the imaging beam path, the polychromatic modulation transfer functions (MTFs) computed for the four design wavelengths are shown in Fig. 3 for different field positions. Due to the slightly different $f$-numbers for the tangential and sagittal orientations, two corresponding curves are indicated in each case (dotted lines: sagittal; solid lines: tangential). In addition, the curves for the diffraction-limited cases are shown for comparison (black curves). For the center field position (position 1; blue curves), the contrast transfer is close to the diffraction limit. The contrast transfer functions for positions 2 and 3 on the horizontal axis differ only slightly, which again is due to the slightly tilted optical axis. A similar contrast transfer results for the vertical field position (dark yellow). Note that, in most cases, a higher contrast results for the sagittal orientation than for the tangential orientation. An exception occurs for the vertical field point 5; here the sagittal and tangential contrast curves are interchanged due to the asymmetry of the system.
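The diffraction-limited reference curves in Fig. 3 follow the standard incoherent MTF of a circular pupil with cutoff frequency $\nu_c = 2\,{\rm NA}/\lambda$. The Python sketch below (our own illustration, not the design software) shows the functional form:

```python
import math

def diffraction_mtf(nu, wavelength_mm, na):
    """Incoherent diffraction-limited MTF of a circular pupil.
    nu: spatial frequency in cycles/mm; cutoff is 2*NA/lambda."""
    nu_c = 2.0 * na / wavelength_mm
    s = min(nu / nu_c, 1.0)  # normalized frequency, clipped at cutoff
    return (2.0 / math.pi) * (math.acos(s) - s * math.sqrt(1.0 - s * s))

# Example: vertical detector-side NA at the 587 nm design wavelength.
for nu in (0, 25, 50, 100):  # cycles/mm
    print(nu, round(diffraction_mtf(nu, 587e-6, 0.041), 3))
```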


Fig. 3. Polychromatic modulation transfer functions (MTFs) for different field positions (number of field positions refers to Fig. 2). Both tangential (solid lines) and sagittal (dotted lines) orientations are considered for the different field positions. For reference, the diffraction-limited case is included (black line).


Due to the folded beam path with multiple directional changes of the optical axis as well as the non-coincidence of the optical axis with the symmetry axes of the different mirrors, keystone distortions occur in addition to the anamorphic imaging. Figure 4 shows the simulated distortion for the smaller high-resolution area (left) and for the slightly larger area with reduced resolution (right). The displayed axes and scales represent the alignment and distances including the anamorphic image transfer in the final image plane. Small blue dots are displayed to illustrate the local distortion, indicating the deviation from the underlying rectangular mesh structure. Especially when looking at the larger area, it becomes clear that the distortion is not symmetrical in the horizontal direction. Both horizontal and vertical magnification factors are larger on the left side of the image plane than on the right. Quantitatively, the optical design resulted in a maximum distortion of 6.53% for the larger area. For the smaller, high-resolution area, a maximum distortion of 3.04% was determined.
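One common way to quantify such a distortion figure is the maximum relative deviation of the imaged grid points from their ideal, anamorphically scaled positions; the exact convention used by optical design software may differ. A hedged Python sketch of such a metric:

```python
import numpy as np

def max_distortion(points_real, points_ideal):
    """Maximum local distortion in percent: deviation of each imaged
    grid point from its ideal (anamorphically scaled) position,
    relative to the ideal field radius measured from the grid center.
    Both point sets must be given in the same image coordinate frame."""
    points_real = np.asarray(points_real, dtype=float)
    points_ideal = np.asarray(points_ideal, dtype=float)
    center = points_ideal.mean(axis=0)
    radius = np.linalg.norm(points_ideal - center, axis=1)
    deviation = np.linalg.norm(points_real - points_ideal, axis=1)
    mask = radius > 0  # exclude the center point itself
    return 100.0 * np.max(deviation[mask] / radius[mask])
```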


Fig. 4. Simulated keystone distortion in the anamorphic imaging path for the area of high spatial resolution (left) and the larger area with reduced resolution (right). Small blue dots illustrate the local distortion as a deviation from the underlying rectangular mesh structure. A maximum distortion of 3.04% is found for the high-resolution area and 6.53% for the wider area.


3. MECHANICAL DESIGN, SYSTEM IMPLEMENTATION, AND QUALIFYING MEASUREMENTS

The optical design is followed by the mechanical design and the implementation of the overall system. Figure 5(a) shows a photo of the fully assembled sensor without the protective cover. For better recognition, Fig. 5(b) also shows a CAD drawing in which the individual components are highlighted in color. The primary goal of the implemented demonstrator was to provide a proof of concept. While the mechanical design generally aimed at a small installation space, it was also intended to offer a high degree of flexibility for assembly, component replacement, and adjustment. This means that for subsequent development steps there is high potential for a further significant size reduction of the sensor system.


Fig. 5. (a) Photo of the fully assembled sensor without the protective cover; (b) CAD drawing with the individual components highlighted in color. 1, objective lens; 2, reticle and entrance surface of beam splitter; 3, M1 mirror for spectroscopic path; 4, fiber to spectrometer; 5, 6, 7, mirrors (M2–M4) for imaging path; 8, detector; 9, MMS 1 spectrometer.


The sensor system is built up in two levels. The upper level [light blue plate in Fig. 5(b)] contains the optical components for both the imaging and spectral paths, including the image detector and fiber end as input to the spectrometer. The spectrometer itself is located in the lower level (base plate of the system). Parts of the electronics and cabling are also fixed there. We used a miniature spectrometer Zeiss MMS 1 [31] covering a wavelength range of 310–1100 nm and providing a specified spectral resolution of approximately 10 nm.

The mountings for the optical elements on the upper level are mechanically designed and manufactured in such a way that the accuracy specifications and tolerances from the optical design can be met, and no adjustment of these components is required after assembly. The only adjustable components of the system are the image detector and the input aperture (fiber end) of the spectrometer. With the xyz adjustment of these components, all design tolerances of the system can be compensated. As an imaging detector, a board camera DFM 37UX178ML with ${3072} \times {2048}$ pixels (pixel size ${2.4}\;{\unicode{x00B5}{\rm m}} \times {2.4}\;{\unicode{x00B5}{\rm m}}$) was used [32]. After assembling and aligning all components, the image sensor is adjusted first. Subsequently, the fiber end is positioned so that the central area of the image meets the entrance aperture of the fiber. To simplify integration and alignment, the fiber end of the spectrometer is placed slightly above the beam splitter. This means that the mirror M1 [no. “3” in Figs. 5(a) and 5(b)] deflects the back-reflected light slightly upwards (approximately 9° deflection angle). It should also be mentioned that the contour of the M4 mirror [no. “7” in Figs. 5(a) and 5(b)] was trimmed. This was necessary to prevent the spectroscopic beam path from being disturbed by the components of the imaging beam path.

In preparation for use of the sensor, both the imaging and spectroscopic channels must be adjusted, corrected, and calibrated. Especially for the imaging channel, the distortion and the anamorphic magnification factors must be compensated. For this purpose, a simple test scene is recorded and used for correction. The test image is a square grid displayed in full format on a computer screen. The calibration procedure is shown here exemplarily for an objective lens with a focal length of 3.4 mm and an $f$-number of 2.8. The distance between the screen (object) and the sensor was approximately 76 cm. Figure 6(a) shows such an uncorrected test image.


Fig. 6. Test measurements and correction of the imaging path. (a) Uncorrected test image of a square mesh structure. The central, high-resolution area is indicated (red triangles; blue boundary lines). (b) Distortion and anamorphically corrected image.


The screen is placed in an otherwise dark surrounding. In addition to the screen and the grid structure, the reticle from the intermediate image plane can also be seen. The boundaries of the central area with the highest resolution (see ${{d}_x}$ and ${{d}_y}$ in Table 2) are marked with red triangles and connecting lines. The different magnification factors in the horizontal and vertical directions, the decrease in resolution toward the outer regions, and the field-dependent distortion are clearly visible. The raw image, together with the known size of the reticle, was used to measure the lateral magnification factors in the imaging path (intermediate image to camera) in the horizontal and vertical directions. Due to the small size of the reticle, the measured values are a good approximation of the paraxial magnification. The measured lateral magnification factors are ${-}{2.98}\;{\pm}\;{0.1}$ in the horizontal direction and ${-}{2.48}\;{\pm}\;{0.1}$ in the vertical direction. Both are slightly larger (horizontal ${\sim}{9}\%$, vertical ${\sim}{3}\%$) than the values predicted by the optical design (${\beta _{\rm{hori}}} = - {2.74}$ and ${\beta _{\rm{vert}}} = - {2.41}$), which is most probably caused by optical and mechanical tolerances of the system. In a second step, a test target was used to determine the distortion of the imaging path. Here, we measured a maximum distortion of ${6.4}\;{\pm}\;{2}\%$ for the high-resolution area and ${7.2}\;{\pm}\;{2}\%$ for the region with reduced resolution. Again, these values are larger than the design values (3.04% and 6.53%, respectively), most likely also due to optical and mechanical tolerances of the system.
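A straightforward way to derive the software-based correction from such a grid recording is a projective (homography) fit from the detected grid crossings to an ideal square grid, which removes the anamorphic scale difference and the keystone component in one step. The sketch below (Python with OpenCV; file and variable names are hypothetical, and the actual processing chain of the sensor may use a different model) illustrates the idea:

```python
import cv2
import numpy as np

# Fit a projective transform from detected grid crossings (image_pts,
# pixel coordinates from the raw test image) to an ideal square grid
# (ideal_pts). Residual higher-order distortion would need a denser
# polynomial or mesh-based model on top of this.
image_pts = np.load("grid_detected.npy").astype(np.float32)  # (N, 2), hypothetical file
ideal_pts = np.load("grid_ideal.npy").astype(np.float32)     # (N, 2), hypothetical file

H, _ = cv2.findHomography(image_pts, ideal_pts, cv2.RANSAC)
raw = cv2.imread("raw_scene.png")
corrected = cv2.warpPerspective(raw, H, (raw.shape[1], raw.shape[0]))
cv2.imwrite("corrected_scene.png", corrected)
```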

Figure 6(b) shows the distortion- and anamorphically corrected image of the central area, marked with a blue square, within the larger area of reduced resolution (see ${{r}_x}$ and ${{r}_y}$ in Table 2). The central area is well corrected, but in the outer area, especially toward the left edge of the image, a decrease in resolution can be seen, which was also expected from the optical design. This correction procedure was carried out analogously for each of the objective lenses with different focal lengths. Subsequently, when real scenes were examined with the sensor, the derived software-based correction was applied to all acquired images.

To evaluate the spectroscopic channel, a simple scene was set up in which a wavelength calibration standard (“Spectralon,” Labsphere) was placed in front of a homogeneous background and illuminated with the light of a HgCd lamp. The inset in Fig. 7 shows this test scene captured with the imaging beam path of the sensor. Simultaneously, the area of the calibration standard that lies within the vertical and horizontal boundaries of the target structure is mapped by the spectroscopic channel onto the corresponding fiber input and used for the spectroscopic evaluation. The resulting spectrum is displayed in Fig. 7. The prominent lines of the HgCd source are clearly visible (e.g., Hg lines at 404.6 nm, 435.8 nm, and 546.2 nm, and Cd lines at 467.8 nm, 479.9 nm, and 508.6 nm). The measured spectrum extends well beyond 1000 nm (e.g., the 1013.9 nm Hg line is still recognizable), although the sensitivity in this spectral range decreases strongly due to the use of a Si line detector in the spectrometer. From the measured full width at half-maximum of the line peaks, a spectral resolution of 12 nm is determined, which agrees well with the manufacturer’s specifications for the spectrometer. Due to this spectral resolution, the two Cd lines at 467.8 nm and 479.9 nm appear partly overlapping and not completely separated. This test measurement is suitable for wavelength calibration and demonstrates an applicable spectral measurement range for the spectroscopic channel from about 400 nm to over 1050 nm. It should be noted that although the miniature spectrometer allows shorter wavelengths to be detected, the range below 400 nm is not transmitted by the front lens.
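The FWHM-based resolution estimate can be automated with standard peak detection; the following Python sketch (using SciPy; the array names are hypothetical placeholders for the recorded HgCd spectrum) illustrates one way to do it:

```python
import numpy as np
from scipy.signal import find_peaks, peak_widths

# Estimate the spectral resolution as the FWHM of isolated emission
# lines in the HgCd calibration spectrum. 'wavelength' and 'counts'
# are the spectrometer wavelength axis and the dark-corrected signal.
wavelength = np.load("hgcd_wavelength.npy")  # nm, ascending; hypothetical file
counts = np.load("hgcd_counts.npy")          # hypothetical file

peaks, _ = find_peaks(counts, prominence=0.1 * counts.max())
widths, _, _, _ = peak_widths(counts, peaks, rel_height=0.5)  # in samples
step = np.mean(np.diff(wavelength))  # nm per sample (assumes near-linear axis)
for p, w in zip(peaks, widths):
    print(f"line at {wavelength[p]:.1f} nm: FWHM ~ {w * step:.1f} nm")
```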


Fig. 7. Calibration of the spectroscopic channel. A calibration standard was illuminated by a HgCd lamp and the reflected light was mapped onto the fiber input of the spectrometer. The spectrum shows the prominent lines of the HgCd source. The inset shows the simultaneously captured image of the scene with the calibration standard in the center.



Fig. 8. (a)–(f) Photos of several differently colored, equally sized, and unsorted plastic caps taken with the imaging path of the sensor. In each of the images a cap with a different color is in the center. (g) Simultaneously recorded spectra for each of the different colored caps in the central area. The spectra differ significantly. Although the light red and the red cap in the center of images (e) and (f) appear very similar, the respective spectral measurements allow a clear distinction.


4. EXAMPLES OF APPLICATION-RELATED MEASUREMENTS

After preparation of the sensor, the imaging properties and spectroscopic characteristics are tested, still in the laboratory but in scenarios close to potential applications. Figures 8(a)–8(f) show photos of several differently colored, equally sized, and unsorted plastic caps taken with the imaging path of the sensor. The scene can be considered a model situation for a simple sorting process such as could be used in the packaging industry or for product recycling. In each of the images a cap with a different color is in the center.

Simultaneously with the image acquisition, a spectrum is recorded from the central area and, thus, for each of the different caps [see Fig. 8(g)]. The six spectra differ significantly. The last pair of photos provides an interesting example to demonstrate the advantage of the spectral measurement capability: in one image a light red cap is in the center, in the other a red one. The two cap colors are very difficult or almost impossible to distinguish from the image information alone. With the additional information available from the spectral measurement, a clear distinction can be made: the spectra of the two caps differ significantly in the spectral range from 500 nm to 600 nm and also in the NIR range above 850 nm.
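A simple way to turn such point spectra into a sorting decision is to compare the measured spectrum of the central cap against previously recorded reference spectra, e.g., using the spectral angle as a brightness-independent similarity measure. The Python sketch below is an illustrative approach of our own, not necessarily the evaluation used here:

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle (radians) between two reflectance spectra; small
    angles mean similar spectral shape, independent of brightness."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify(measured, references):
    """Return the name of the reference spectrum (dict: name -> spectrum,
    all sampled on the same wavelength grid) closest to the measurement."""
    return min(references, key=lambda name: spectral_angle(measured, references[name]))
```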

Figures 9(a)–9(d) show images of three objects that appear green in visual perception (a natural green bell pepper, a spinach leaf, and an artificial apple). Figures 9(a)–9(c) are RGB images with the objects at interchanged positions. The NIR image of Fig. 9(d) was taken by replacing the IR blocking filter with a long-pass filter (cutoff wavelength at 740 nm) that blocks the visible wavelengths; the monochrome image shown was calculated from the responses of the camera’s Bayer-pattern channels. The reflectance spectra of all three objects are shown below the photos in Fig. 9. All three objects show a “green peak” at approximately 550 nm, which is responsible for the green appearance in visual perception. The spectral maximum in the green region is higher for the artificial apple than for the two natural objects, which also correlates with the brighter appearance of the apple in the photos.


Fig. 9. (a)–(c) RGB images of three objects that appear green in visual perception, located at interchanged positions (a natural green bell pepper, a spinach leaf, and an artificial apple). (d) For comparison, a near-infrared image of the scene acquired with the setup by replacing the IR blocking filter with a long-pass filter blocking the visible light (cutoff wavelength at 740 nm). (e) The reflectance spectra of all three objects show a “green peak” at approximately 550 nm. Between about 680 nm and 710 nm, a significant spectral difference occurs between the artificial and natural objects due to the “red edge” effect.


A dramatic distinguishing feature between the natural and artificial objects is the increase in reflectance in the near-IR wavelength range. For the artificial apple, the reflectance in the near-IR range is only slightly above the maximum of the green peak, while for the natural objects the reflectance increases massively, by a factor of 7–10. This characteristic of natural plant objects is attributed to their chlorophyll content and the associated “red edge” effect, which causes the steep increase in reflectance between about 680 nm and 710 nm. After this comparison between artificial and natural plant objects, the next example shows a more application-oriented use aimed at fruit control in the food industry. The upper part of Fig. 10 shows photos of the same bananas taken on four consecutive days. The comparison of the four photos does not show a clearly discernible difference. In contrast, the comparison of the spectra recorded from the bananas on the four consecutive days shows significant differences (see lower part of Fig. 10). The reflectance spectrum from the first day, for example, shows a clear local minimum at about 680 nm, which can be attributed to the absorption of chlorophyll. This local minimum becomes weaker from day to day and is no longer visible on the fourth day. Temporal changes in the spectra are also evident in the long-wavelength region: above 800 nm, and more strongly above 900 nm, the reflectance decreases significantly from day to day. The spectral measurement thus provides a characterization capability that would not be accessible through the visual image information alone.
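For automated natural/artificial discrimination, the red-edge contrast can be condensed into a single NDVI-style index computed from the point spectrum. The sketch below is an illustrative metric of our own choosing (band limits are our assumptions), not the evaluation procedure used in this work:

```python
import numpy as np

def ndvi_like(wavelength, reflectance, red=(660, 680), nir=(780, 900)):
    """NDVI-style index (R_NIR - R_red) / (R_NIR + R_red) from a point
    spectrum, using band averages. Natural leaves with a strong red
    edge give values near 1; objects without one stay much lower."""
    wavelength = np.asarray(wavelength, float)
    reflectance = np.asarray(reflectance, float)

    def band_mean(lo, hi):
        sel = (wavelength >= lo) & (wavelength <= hi)
        return reflectance[sel].mean()

    r_red, r_nir = band_mean(*red), band_mean(*nir)
    return (r_nir - r_red) / (r_nir + r_red)
```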


Fig. 10. Four photos taken of the same bananas on four consecutive days show no clearly discernible difference. However, the spectra recorded at the same moment can be clearly distinguished (e.g., the local minimum at about 680 nm becomes weaker from day to day).


5. CONCLUSION

The simultaneous detection of spectral and spatially resolved information opens up an enormously wide range of applications. However, the development of instrumental solutions that assign a detailed, broadband spectrum to each highly resolved pixel of a snapshot image is very challenging and can only be implemented with great effort and complex approaches. These all-in-one solutions are often not suitable for applications under real conditions, especially when considering limitations in terms of installation space, complexity, robustness, and, last but not least, cost. Rather, for real applications a compromise must be found that covers exactly the specifications of the respective requirements.

In this contribution we have presented a concept that offers such a compromise. In application, imaging can be used, for example, to find an object in a scene, which is then analyzed spectroscopically. Dispensing with the spectroscopic analysis of each individual object point reduces the complexity of the optical setup and simplifies and accelerates data evaluation. The folded catadioptric beam path of the imaging optics is not designed for high image quality over the entire field but is optimized for the central area, thus enabling a reduction in complexity and installation space.

Funding

Scia Systems GmbH; Carl Zeiss Spectroscopy GmbH; Deutsche Forschungsgemeinschaft (497866273); Bundesministerium für Bildung und Forschung (13FH657IX6).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

REFERENCES

1. G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 010901 (2014). [CrossRef]  

2. B. Fei, “Hyperspectral imaging in medical applications,” in Data Handling in Science and Technology (Elsevier, 2020), Vol. 32, pp. 523–565.

3. Q. Li, X. He, Y. Wang, H. Liu, D. Xu, and F. Guo, “Review of spectral imaging technology in biomedical engineering: achievements and challenges,” J. Biomed. Opt. 18, 100901 (2013). [CrossRef]  

4. Y. Z. Feng and D. W. Sun, “Application of hyperspectral imaging in food safety inspection and control: a review,” Crit. Rev. Food Sci. Nutr. 52, 1039–1058 (2012). [CrossRef]  

5. Y. Liu, H. Pu, and D.-W. Sun, “Hyperspectral imaging technique for evaluating food quality and safety during various processes: a review of recent applications,” Trends Food Sci. Technol. 69, 25–35 (2017). [CrossRef]  

6. A. Gongal, S. Amatya, M. Karkee, Q. Zhang, and K. Lewis, “Sensors and systems for fruit detection and localization: a review,” Comput. Electron. Agric. 116, 8–19 (2015). [CrossRef]  

7. T. Adão, J. Hruška, L. Pádua, J. Bessa, E. Peres, R. Morais, and J. J. Sousa, “Hyperspectral imaging: a review on UAV-based sensors, data processing and applications for agriculture and forestry,” Remote Sens. 9, 1110 (2017). [CrossRef]  

8. M. Zhu, D. Huang, X. Hu, W. Tong, B. Han, J. Tian, and H. Luo, “Application of hyperspectral technology in detection of agricultural products and food: a review,” Food Sci. Nutr. 8, 5206–5214 (2020). [CrossRef]  

9. G. J. Edelman, E. Gaston, T. G. van Leeuwen, P. J. Cullen, and M. C. G. Aalders, “Hyperspectral imaging for non-contact analysis of forensic traces,” Forensic Sci. Int. 223, 28–39 (2012). [CrossRef]  

10. J. Kuula, I. Pölönen, H.-H. Puupponen, T. Selander, T. Reinikainen, T. Kalenius, and H. Saari, “Using VIS/NIR and IR spectral cameras for detecting and separating crime scene details,” Proc. SPIE 8359, 83590 (2012). [CrossRef]  

11. N. Gat, “Imaging spectroscopy using tunable filters: a review,” Proc. SPIE 4056, 50–64 (2000). [CrossRef]  

12. Y. Lu, C. Fredembach, M. Vetterli, and S. Susstrunk, “Designing color filter arrays for the joint capture of visible and near-infrared images,” in IEEE International Conference on Image Processing (ICIP) (Institute of Electrical and Electronics Engineers, 2009), pp. 3797–3800.

13. L. Kong, S. Sprigle, D. Yi, F. Wang, C. Wang, and F. Liu, “Developing handheld real time multispectral imager to clinically detect erythema in darkly pigmented skin,” Proc. SPIE 7557, 75570G (2010). [CrossRef]  

14. J. Zhang, S. Lin, C. Zhang, Y. Chen, L. Kong, and F. Chen, “An evaluation method of a micro-arrayed multispectral filter mosaic,” Proc. SPIE 8759, 875908 (2013). [CrossRef]  

15. V. Dworak, J. Selbeck, K. H. Dammer, M. Hoffmann, A. A. Zarezadeh, and C. Bobda, “Strategy for the development of a smart NDVI camera system for outdoor plant detection and agricultural embedded systems,” Sensors 13, 1523–1538 (2013). [CrossRef]  

16. Z. Chen, X. Wang, and R. Liang, “RGB-NIR multispectral camera,” Opt. Express 22, 4985–4994 (2014). [CrossRef]  

17. E. Laux, C. Genet, T. Skauli, and T. W. Ebbesen, “Plasmonic photon sorters for spectral and polarimetric imaging,” Nat. Photonics 2, 161–164 (2008). [CrossRef]  

18. S. Yokogawa, S. P. Burgos, and H. A. Atwater, “Plasmonic color filters for CMOS image sensor applications,” Nano Lett. 12, 4349–4354 (2012). [CrossRef]  

19. A. Miyamichi, A. Ono, K. Kagawa, K. Yasutomi, and S. Kawahito, “Plasmonic color filter array with high color purity for CMOS image sensors,” Sensors 19, 1750 (2019). [CrossRef]  

20. N. Danz, B. Höfer, E. Förster, T. Flügel-Paul, T. Harzendorf, P. Dannberg, R. Leitel, S. Kleinle, and R. Brunner, “Miniature integrated micro-spectrometer array for snap shot multispectral sensing,” Opt. Express 27, 5719–5728 (2019). [CrossRef]  

21. D. Baruch and D. Abookasis, “Multimodal optical setup based on spectrometer and cameras combination for biological tissue characterization with spatially modulated illumination,” J. Biomed. Opt. 22, 046007 (2017). [CrossRef]  

22. H. Zeng, M. Petek, M. T. Zorman, A. McWilliams, B. Palcic, and S. Lam, “Integrated endoscopy system for simultaneous imaging and spectroscopy for early lung cancer detection,” Opt. Lett. 29, 587–589 (2004). [CrossRef]  

23. H. Zeng, A. McWilliams, and S. Lam, “Optical spectroscopy and imaging for early lung cancer detection: a review,” Photodiagn. Photodyn. Ther. 1, 111–122 (2004). [CrossRef]

24. P. Thapa, V. Singh, S. Bhatt, S. Tayal, P. Mann, K. Maurya, and D. S. Mehta, “Development of multimodal micro-endoscopic system with oblique illumination for simultaneous fluorescence imaging and spectroscopy of oral cancer,” J. Biophotonics 15, e202100284 (2022). [CrossRef]  

25. T. Hegemann, F. Bürger, and J. Pauli, “Combined high-resolution imaging and spectroscopy system-a versatile and multi-modal metrology platform,” in International Conference on Photonics, Optics and Laser Technology (2017), pp. 215–222.

26. S. Johnson, M. Clemenson, and N. Glumac, “Simultaneous imaging and spectroscopy of detonation interaction in reactive and energetic materials,” Appl. Spectrosc. 71, 78–86 (2017). [CrossRef]  

27. D. O. Baskurt, Y. Bastanlar, and Y. Y. Cetin, “Catadioptric hyperspectral imaging, an unmixing approach,” IET Comput. Vis. 14, 493–504 (2020). [CrossRef]  

28. L. Feng, X. He, Y. Li, L. Wei, Y. Nie, J. Jing, and J. Zhou, “Compact shortwave infrared imaging spectrometer based on a catadioptric prism,” Sensors 22, 4611 (2022). [CrossRef]  

29. H. Gross, F. Bechinger, and B. Achtner, “Photographic lenses,” in Handbook of Optical Systems (Wiley, 2008), Vol. 4, pp. 260–262.

30. https://www.theimagingsource.com/de-de/product/optic/low-distortion/.

31. https://www.zeiss.com/spectroscopy/products/spectrometer-modules/mms.html.

32. https://www.theimagingsource.com/de-de/product/board/37u/dfm37ux178ml/.
