
Full-color light-field microscopy via single-pixel imaging

Open Access

Abstract

Light-field microscopy is a scanless volumetric imaging technique. A conventional color light-field microscope employs a micro-lens array at the image plane and samples the spatial, angular, and color information with a pixelated two-dimensional (2D) sensor (such as a CCD). However, the space bandwidth product of the pixelated 2D sensor is a fixed value determined by its parameters, leading to trade-offs between the spatial, angular, and color resolutions. In addition, the inherent chromatic aberration of the micro-lens array degrades the viewing quality. Here we propose full-color light-field microscopy via single-pixel imaging, which distributes the sampling tasks for the spatial, angular, and color information between the illumination and detection sides, rather than concentrating them on the detection side. The space bandwidth product of the light-field microscope is thereby increased, and the spatial resolution of the reconstructed light field can be improved. In addition, the proposed method reconstructs the full-color light field without a micro-lens array, so the chromatic aberration induced by the micro-lens array is avoided. Because the three sampling tasks can be distributed between the illumination and detection sides in different ways, we present two sampling schemes and compare their advantages and disadvantages via several experiments. Our work provides insight for developing a high-resolution full-color light-field microscope and may find applications in the biomedical and material sciences.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Color light-field microscopy is a computational optical imaging technique that achieves scanless, color three-dimensional (3D) imaging [1]. By introducing a micro-lens array at the image plane of a conventional microscope, it is possible to record the spatial, angular, and color information of a specimen with a pixelated sensor (such as a CCD) [2,3]. This raw spatial, angular, and color information can be post-processed to reconstruct a full-color 3D volume of the specimen, slice by slice, from a series of images rendered at different depths of focus.

However, the conventional color light-field microscope suffers from trade-offs between the spatial, angular, and color resolutions [4], because the spatial, angular, and color information are simultaneously sampled by the pixelated 2D sensor, whose space bandwidth product is a fixed value determined by its parameters [5]. Although these resolution trade-offs can be alleviated by using a camera array [6], a camera array is bulky and expensive, which limits its application range. Another inherent disadvantage is that each micro-lens is a single plano-convex lens, so it is difficult to obtain a full-color light field without chromatic aberration. Although the chromatic aberration can be overcome by using a metalens array [7,8], fabrication of metalens arrays is still expensive and challenging; more importantly, a metalens array cannot address the resolution trade-off problem of the color light-field microscope. In short, the space bandwidth product of a pixelated 2D sensor is a fixed value determined by its parameters. If the spatial, angular, and color information are all sampled on the detection side by a pixelated 2D sensor, the spatial resolution of the reconstructed image must be sacrificed to sample the angular and color information. However, if we sample information partially from the detection side and partially from the illumination side, the overall space bandwidth product of the color light-field microscope is increased. The trade-offs between the spatial, angular, and color resolutions can then be alleviated, and the spatial resolution of the reconstructed light-field image can be increased.

Single-pixel imaging is an emerging imaging technique that samples the spatial information of a scene from the illumination side [9–15]. It works by illuminating the specimen with a set of basis patterns (e.g., Hadamard or Fourier basis patterns) and recording the modulated light with a single-pixel detector. From the light signals recorded by the single-pixel detector, one can computationally reconstruct an image of the object. Using multiple single-pixel detectors on the detection side, one can simultaneously sample additional information without sacrificing the spatial resolution of the reconstructed image. Examples include two single-pixel detectors for simultaneous visible and infrared imaging [16], three single-pixel detectors for color imaging [17], four single-pixel detectors for 3D imaging [18], and a single-pixel detector array for light-field microscopy [19–21]. In addition, multiple single-pixel detectors can also be used to increase the frame rate of a single-pixel imaging system [22].
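As a toy illustration of the single-pixel principle, the following sketch recovers an image from scalar detector readings alone. It assumes a hypothetical 8 × 8 scene, ideal noiseless ±1 Hadamard patterns (in practice realized by differential measurements), and is not the paper's experimental code:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

N = 64                        # number of scene pixels (an 8 x 8 scene, flattened)
rng = np.random.default_rng(0)
scene = rng.random(N)         # unknown object

H = hadamard(N)               # each row is one illumination pattern (+1/-1)
readings = H @ scene          # one scalar per pattern: the 1D intensity sequence

# Hadamard rows are orthogonal (H @ H.T == N * I), so the image is
# recovered with a single matrix-vector product:
recovered = (H.T @ readings) / N
assert np.allclose(recovered, scene)
```

With an orthogonal basis, reconstruction is exact once all N coefficients are measured; compressive variants recover the image from fewer measurements at the cost of a more expensive solver.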

In this paper, we present full-color light-field microscopy via single-pixel imaging, which samples the spatial, angular, and color information partially from the illumination side and partially from the detection side. Compared with the conventional color light-field microscope, the achievable space bandwidth product is increased; we can therefore obtain more spatial, angular, and color information, and the spatial resolution of the reconstructed images can be improved. Furthermore, the proposed technique reconstructs the full-color light field without a micro-lens array, so the chromatic aberration induced by the micro-lens array is avoided. Note that distributing the three sampling tasks between the illumination and detection sides can be realized with multiple sampling schemes. Because the required spatial and angular resolutions are far higher than the color resolution, we assign the spatial sampling task to the illumination side and the angular sampling task to the detection side. Under this setting, the color information can be sampled either from the detection side or from the illumination side, giving two possible sampling schemes with different advantages and disadvantages. To guide the reader in choosing the appropriate scheme for a specific application, we compare the two sampling schemes through several experiments. The proposed technique, in which the illumination and detection sides share the sampling task, provides a way of developing a high-resolution, full-color light-field microscope.

2. The first sampling scheme

Let us begin with the first sampling scheme. In it, we sample the spatial information of a specimen from the illumination side with structured illuminations and sample the angular and color information from the detection side with a color camera.

2.1 Experimental setup and measurement principle

Figure 1 shows the layout of the proposed color light-field microscope. Light from a white light-emitting diode (LED) is directed onto a digital micromirror device (DMD: 0.69 inch, $1{,}024 \times 768$ pixels, pixel size: 13.6 µm) by a reflecting mirror. We then load a set of Fourier basis patterns on the DMD; we choose Fourier basis patterns for sampling the spatial information of the specimen because they are measurement-efficient [23], although Hadamard basis patterns are also a candidate. Using a tube lens (Thorlabs TTL200, focal length: 200 mm) and an objective lens, the Fourier basis patterns are projected onto the specimen in sequence. With the Fourier basis patterns, the spatial information of the specimen is encoded into a one-dimensional (1D) time-varying intensity sequence. This 1D intensity sequence is recorded by a color 2D sensor (Point Grey GS3-U3-60QS6C-C, 1 inch CCD, $2{,}736 \times 2{,}192$ pixels, pixel size: 4.54 µm), where each pixel of the color 2D sensor is used as a single-pixel detector. From the intensity sequence recorded by every single-pixel detector, we can recover an image of the specimen with the single-pixel reconstruction algorithm [11]. Moreover, owing to the use of a color 2D sensor, each pixel actually records three 1D intensity sequences, as shown in Figs. 2(a1)–2(a3), corresponding to the red, green, and blue channels, respectively. From these three sequences, we can recover three images of the specimen from every pixel of the color 2D sensor, as shown in Figs. 2(b1)–2(b3), and synthesize them into a color image of the specimen, as shown in Fig. 2(c).
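The Fourier-basis measurement can be sketched numerically. This is a minimal model, not the experimental code: it assumes an ideal 8 × 8 scene and a noiseless detector, acquires each Fourier coefficient with four-step phase-shifted fringe patterns, and recovers the image with an inverse FFT:

```python
import numpy as np

Npix = 8                                   # toy scene: 8 x 8 pixels
rng = np.random.default_rng(1)
scene = rng.random((Npix, Npix))           # unknown object

yy, xx = np.mgrid[0:Npix, 0:Npix]
a, b = 0.5, 0.5                            # fringe offset and contrast

F = np.zeros((Npix, Npix), dtype=complex)  # measured Fourier spectrum
for fy in range(Npix):
    for fx in range(Npix):
        theta = 2 * np.pi * (fx * xx + fy * yy) / Npix
        # project four phase-shifted fringes; record one scalar each
        D = [np.sum(scene * (a + b * np.cos(theta + phi)))
             for phi in (0, np.pi / 2, np.pi, 3 * np.pi / 2)]
        # four-step phase shifting isolates one DFT coefficient
        F[fy, fx] = ((D[0] - D[2]) - 1j * (D[3] - D[1])) / (2 * b)

recovered = np.fft.ifft2(F).real           # inverse FFT recovers the scene
assert np.allclose(recovered, scene)
```

In the proposed microscope the same reconstruction is run independently for every camera pixel (and every color channel) acting as its own single-pixel detector.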

Fig. 1. Schematic diagram of the experimental setup used for generating Fourier basis patterns and imaging transmitted specimens. (RFP: rear focal plane)

Fig. 2. (a1)–(a3) Three sets of 1D intensity sequences recorded by the center single-pixel detector, corresponding to red, green, and blue channels, respectively; (b1)–(b3) red, green, and blue perspective images recovered from the 1D intensity sequences; (c) color-perspective image synthesized by Figs. 2(b1)–2(b3).

Apart from recording the spatial and color information, the proposed system also records the angular information of the light rays, because the color 2D sensor records the light signals at the rear focal plane of objective lens 2 via a 4f relay system. Accordingly, the color 2D sensor records a bright spot image, as shown on the right side of Fig. 1, where each pixel of the sensor records the angular information of one bundle of light rays. In short, with the proposed system we can not only sample the spatial information of a specimen from the illumination side but also sample the color and angular information from the detection side. Based on the sampled information, we can recover a color-perspective image of the specimen from every pixel of the color 2D sensor.

2.2 Algorithm for reconstructing color light-field images

Based on the color-perspective images recovered from all the single-pixel detectors, we can computationally reconstruct the full-color 3D volume of a specimen slice by slice from a series of rendered images with different depths of focus [19,20]. The reconstruction algorithm is summarized as follows.

  • (1) Determine the radius $R$ and the center coordinate $({{x_{center}},{y_{center}}} )$ of the bright spot by using circle detection and fitting algorithms [24].
  • (2) Determine the pixel coordinates $({{x_i},{y_i}} )$ of every single-pixel detector, where $i$ is the index of the single-pixel detector.
  • (3) Calculate the offset $({\Delta {x_i},\Delta {y_i}} )$ of the ${i^{th}}$ single-pixel detector from the center coordinate $({{x_{center}},{y_{center}}} )$ of the bright spot by using the following equation:
    $$\left\{ {\begin{array}{c} {\Delta {x_i} = {x_i} - {x_{center}}}\\ {\Delta {y_i} = {y_i} - {y_{center}}} \end{array}} \right..$$
  • (4) Calculate the incident angle $({{\theta_{xi}},{\theta_{yi}}} )$ for every single-pixel detector by using the following equation:
    $$\left\{ {\begin{array}{c} {{\theta_{xi}} = \arctan \left( {\frac{{\Delta {x_i}\cdot \tan ({\arcsin ({{{NA} \mathord{\left/ {\vphantom {{NA} n}} \right.} n}} )} )}}{R}} \right)}\\ {{\theta_{yi}} = \arctan \left( {\frac{{\Delta {y_i}\cdot \tan ({\arcsin ({{{NA} \mathord{\left/ {\vphantom {{NA} n}} \right.} n}} )} )}}{R}} \right)} \end{array}} \right.,$$
    where $NA$ is the numerical aperture of the objective and $n$ is the refractive index of the medium between objective lens 1 and the specimen.
  • (5) According to the refocused depth $\Delta z$, the shift amount for every single-pixel detector can be calculated by using the following equation:
    $$\left\{ {\begin{array}{c} {\Delta {s_{xi}} = \frac{{M\cdot \Delta z\cdot \tan ({{\theta_{xi}}} )}}{{{s_{DMD}}}} = \frac{{M\cdot \Delta z\cdot \Delta {x_i}\cdot \tan ({\arcsin ({{{NA} \mathord{\left/ {\vphantom {{NA} n}} \right.} n}} )} )}}{{{s_{DMD}}\cdot R}}}\\ {\Delta {s_{yi}} = \frac{{M\cdot \Delta z\cdot \tan ({{\theta_{yi}}} )}}{{{s_{DMD}}}} = \frac{{M\cdot \Delta z\cdot \Delta {y_i}\cdot \tan ({\arcsin ({{{NA} \mathord{\left/ {\vphantom {{NA} n}} \right.} n}} )} )}}{{{s_{DMD}}\cdot R}}} \end{array}} \right. ,$$
    where M is the magnification of the microscope and ${s_{DMD}}$ is the effective pixel size of the Fourier basis patterns.
  • (6) Based on the shifts calculated in step (5), we shift all the perspective images one by one. For convenience, every perspective image is shifted in the Fourier domain via the phase-shifting factor ${e^{i2\pi ({\Delta {s_{xi}}\cdot {f_x} + \Delta {s_{yi}}\cdot {f_y}} )}}$, where $({{f_x},{f_y}} )$ are the frequency coordinates.
  • (7) All the shifted images are summed together to obtain the focused image of the specimen at $\Delta z$.
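Steps (6) and (7) amount to a sub-pixel shift of each perspective image via the Fourier shift theorem, followed by a sum. A minimal sketch (function and variable names are ours, not from the paper):

```python
import numpy as np

def shift_fourier(img, sx, sy):
    """Shift a 2D image by (sx, sy) pixels (sub-pixel allowed) in the
    Fourier domain, via the phase factor of the Fourier shift theorem."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    phase = np.exp(-2j * np.pi * (sx * fx + sy * fy))
    return np.fft.ifft2(np.fft.fft2(img) * phase).real

def refocus(perspectives, shifts):
    """Steps (6)-(7): shift every perspective image by its (sx, sy)
    from step (5), then sum the shifted images into one refocused image."""
    return sum(shift_fourier(img, sx, sy)
               for img, (sx, sy) in zip(perspectives, shifts))
```

For a color refocus, the same two functions would simply be applied to the red, green, and blue planes separately.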

Note that to obtain a color refocused image, we process the red, green, and blue images individually with steps (6) and (7), and then combine the red, green, and blue refocused images into a color refocused image. Figure 3 summarizes the complete procedure for reconstructing the full-color 3D volume of a specimen with the first sampling scheme.

Fig. 3. Reconstruction procedure of full-color light-field microscopy by using Fourier basis patterns and color camera.

2.3 Experimental results of transmitted specimen

We first validated the first sampling scheme by imaging a biological specimen. In the experiment, a 3 W white light-emitting diode (LED) illuminated the specimen. The size of the Fourier basis patterns was $384 \times 384$ pixels, with an effective pixel size of 27.2 µm. The focal length of the tube lens was 200 mm. On the illumination side, we employed a $10\times$, numerical aperture (NA) 0.25 objective lens. On the detection side, we used a $20\times$, NA 0.4 objective lens to collect the light emerging from the specimen. The focal lengths of lens 1 and lens 2 were 100 mm and 150 mm, respectively. To increase the measurement speed, we converted the Fourier basis patterns into binary patterns with a dithering algorithm [25] and loaded the binary patterns on the DMD, as shown on the left side of Fig. 1.

To realize color light-field microscopy, we built 73 circular single-pixel detectors from the color 2D sensor. The layout of the circular single-pixel detectors is shown on the right side of Fig. 1. The radius of every single-pixel detector is 50 pixels; the number and radius of the single-pixel detectors can be digitally adjusted for different applications. The size of the reconstructed image is $384 \times 384$ pixels. Figure 4 shows four color-perspective images recovered from four different single-pixel detectors. The perspective view changes with the position of the single-pixel detector, allowing us to disambiguate superimposed features.
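Each "single-pixel detector" here is simply a circular group of camera pixels whose values are binned into one scalar per illumination pattern. A sketch of how such detectors can be defined digitally (frame size and coordinates are illustrative, not the paper's):

```python
import numpy as np

def circular_mask(shape, center, radius):
    """Boolean mask selecting the camera pixels binned into one
    circular single-pixel detector; center = (col, row)."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    return (xx - center[0]) ** 2 + (yy - center[1]) ** 2 <= radius ** 2

def detector_reading(frame, mask):
    """One reading of the synthesized detector: sum of the masked pixels."""
    return float(frame[mask].sum())

# Example: a radius-50 detector at the center of a hypothetical 2048 x 2048 frame
mask = circular_mask((2048, 2048), center=(1024, 1024), radius=50)
```

Because the detectors are defined in software, their number, radius, and layout can be changed after acquisition without touching the hardware, which is what the text means by "digitally adjusted".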

Fig. 4. (a1)–(a4) Locations of the single-pixel detectors; (b1)–(b4) color-perspective images retrieved from the rightmost, topmost, leftmost, and bottommost single-pixel detectors, respectively.

Based on all the color-perspective images, we can refocus the specimen at different depths by using the digital refocusing algorithm described in subsection 2.2. Figure 5 shows three color images digitally refocused at different depths. To show the entire digital refocusing process from $- 100$ µm to $100$ µm, we also synthesized a video (see Visualization 1) from all the refocused images. These results show that the first scheme can reconstruct the full-color light field of a transmitted specimen. Furthermore, they confirm that the chromatic aberration induced by a micro-lens array is avoided, since the full-color light field is reconstructed without one.

Fig. 5. Digitally refocused images (Visualization 1). (a) $z ={-} 77$ µm; (b) $z ={-} 10$ µm; (c) $z = 37$ µm.

2.4 Experimental results of reflective specimen

By slightly modifying the proposed system, our scheme can also image reflective specimens. Figure 6 shows the experimental setup for a reflective specimen, where the illumination unit is the same as in Fig. 1 and the detection unit was modified for imaging in reflection. A Nikon $20 \times$, NA 0.4 objective lens was used for both illumination and detection. The focal lengths of the 4f relay system lenses were 150 mm. The color 2D sensor was the same as that used in Section 2.2. The size of the reconstructed image is $384 \times 384$ pixels.

Fig. 6. Schematic diagram of the experimental setup used for generating Fourier basis patterns and imaging reflective specimen. (RFP: rear focal plane; BS: beam splitter)

We tested the proposed scheme as a reflective full-color light-field microscope by imaging an integrated circuit chip. For convenience, we still built 73 single-pixel detectors from the color 2D sensor as before. The layout of the single-pixel detectors is the same as that shown in Fig. 1. The measurement procedure is the same as that used for the transmission experiment. Figure 7 shows four color-perspective images retrieved from different single-pixel detectors. These experimental results demonstrate that by using the first sampling scheme, we can reconstruct multiple color-perspective images for a reflective specimen.

Fig. 7. (a1)–(a4) Locations of the single-pixel detectors; (b1)–(b4) color-perspective images retrieved from the rightmost, topmost, leftmost, and bottommost single-pixel detectors, respectively.

Based on all the color-perspective images retrieved above, we can refocus the specimen at different depths by using the digital refocusing algorithm described in subsection 2.2. Figure 8 shows three color images digitally refocused at three different depths. To show the entire digital refocusing process from $- 50$ µm to $50$ µm, we also synthesized a video (see Visualization 2) from all the refocused images. These experimental results confirm that the first sampling scheme can also reconstruct the full-color light field of a reflective specimen.

Fig. 8. Digitally refocused images (Visualization 2). (a) $z ={-} 14$ µm; (b) $z = 15$ µm; (c) $z = 38$ µm.

3. The second sampling scheme

We also present another sampling scheme to realize full-color light-field imaging. In contrast to the first sampling scheme, we sample both the spatial and color information from the illumination side with color-coded structured illumination and sample the angular information from the detection side with a monochrome camera.

3.1 Experimental setup and measurement principle

Figure 9 shows the experimental system, where we use a projector to generate the color-coded Fourier basis patterns. The color-coded Fourier basis patterns are generated by multiplying the Fourier basis patterns with a color mask [26], as shown in Fig. 10. Meanwhile, we employ a monochrome 2D sensor to record the light signals emerging from the specimen, as shown in Fig. 9; each pixel of the monochrome 2D sensor is used as a single-pixel detector. In particular, the monochrome 2D sensor records the light signals at the rear focal plane of the objective lens via a 4f relay system. In this way, we can sample the spatial and color information of the specimen from the illumination side and the angular information of the light rays from the detection side. Every pixel of the monochrome 2D sensor records a 1D intensity sequence, as shown in Fig. 11(a), from which we can reconstruct a mosaic-like perspective image of the specimen, as shown in Figs. 11(b) and 11(c). After applying a de-mosaicing algorithm to the mosaic-like image [27], we recover a color-perspective image of the specimen, as shown in Fig. 11(d). Repeating this process for all the pixels of the monochrome 2D sensor, we can retrieve multiple color-perspective images, from which a full-color 3D volume of the specimen can be reconstructed with the digital refocusing algorithm. The imaging procedure is summarized in Fig. 12.
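The de-mosaicing step can be illustrated with a minimal bilinear interpolator. This sketch assumes an RGGB Bayer-style color mask (the paper's actual mask layout and de-mosaicing algorithm [27] may differ); each color plane is interpolated from the pixels where that color was sampled:

```python
import numpy as np

def box3(img):
    """3x3 box sum with zero padding (pure NumPy, no SciPy needed)."""
    p = np.pad(img, 1)
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def demosaic_bilinear(mosaic):
    """Bilinear de-mosaicing for an assumed RGGB layout: each color plane
    is recovered by normalized 3x3 averaging of its sampled pixels."""
    h, w = mosaic.shape
    yy, xx = np.mgrid[0:h, 0:w]
    masks = {
        0: (yy % 2 == 0) & (xx % 2 == 0),   # R sampled at even rows/cols
        1: (yy % 2) != (xx % 2),            # G sampled on the two diagonals
        2: (yy % 2 == 1) & (xx % 2 == 1),   # B sampled at odd rows/cols
    }
    out = np.zeros((h, w, 3))
    for c, m in masks.items():
        m = m.astype(float)
        # normalized convolution: average only over the sampled pixels
        out[..., c] = box3(mosaic * m) / np.maximum(box3(m), 1e-12)
    return out
```

A production pipeline would use edge-aware interpolation, but normalized averaging already conveys why color coding halves the effective spatial sampling of each channel.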

Fig. 9. Schematic diagram of the experimental setup used for generating color-coded Fourier basis patterns and imaging transmitted specimens. (RFP: rear focal plane)

Fig. 10. Principle of generating color-coded Fourier basis pattern [25].

Fig. 11. (a) 1D intensity sequence recorded by the center single-pixel detector; (b) mosaic-like image recovered from the 1D intensity sequence; (c) partially enlarged view of Fig. 11(b); (d) color image recovered from Fig. 11(b).

Fig. 12. Reconstruction procedure of full-color light-field microscopy by using color-coded Fourier basis patterns and monochrome camera.

3.2 Experimental results of transmitted specimen

We tested the performance of the proposed scheme by imaging a slice of an ant. In the experiment, we used a digital projector (Anhua F10, 0.47 inch DMD, $1{,}920 \times 1{,}080$ pixels, pixel size: 7.6 µm) to generate the color-coded Fourier basis patterns. The size of the color-coded Fourier basis patterns is also $384 \times 384$ pixels, with an effective pixel size of 15.2 µm. To record the light signals emerging from the specimen, we used a monochrome 2D sensor (Basler pia2400-17gm, 2/3 inch CCD, $2{,}458 \times 2{,}056$ pixels, pixel size: 3.45 µm) on the detection side. The other devices are the same as those used in Section 2.2.

To demonstrate the multi-perspective color imaging capability, we built 73 circular single-pixel detectors from the camera, with the same layout as in Fig. 1 and a radius of 50 pixels each. Figure 13 shows four color-perspective images recovered from different single-pixel detectors. These results demonstrate that the second scheme can also reconstruct multiple color-perspective images. Compared with the perspective images of the first sampling scheme (Fig. 4), the spatial resolution is much lower, because the color and spatial information are simultaneously sampled from the illumination side, and recording the color information sacrifices spatial resolution. However, the reconstruction time of the second sampling scheme is one third that of the first, because only one perspective image needs to be retrieved from every single-pixel detector.

Fig. 13. (a1)–(a4) Locations of the single-pixel detectors; (b1)–(b4) perspective images retrieved from the rightmost, topmost, leftmost, and bottommost single-pixel detectors, respectively.

Based on all the color-perspective images retrieved above, we refocused the specimen at different depths by using the digital refocusing algorithm described in subsection 2.2. Figure 14 shows three images digitally refocused at different depths. To show the entire digital refocusing process from $- 100$ µm to $100$ µm, we also synthesized a video (see Visualization 3) from all the refocused images. These results show that the second scheme can also reconstruct the full-color light field of a transmitted specimen. Compared with the refocused images of the first sampling scheme (Fig. 5), the lateral resolution is lower: the resolution of a refocused image is determined by that of the perspective images, and the perspective images of the second scheme have lower resolution than those of the first.

Fig. 14. Digitally refocused images (Visualization 3). (a) $z ={-} 53$ µm; (b) $z = 22$ µm; (c) $z = 59$ µm.

3.3 Experimental results of reflective specimen

Similar to the first scheme, the second scheme can also be modified for imaging a reflective specimen. Figure 15 shows the modified experimental setup; the layout is similar to that in Fig. 6, except that a digital projector generates the color-coded Fourier basis patterns and a monochrome camera records the light intensity emerging from the specimen. The projector and the monochrome camera were the same as those used in Section 3.2.

Fig. 15. Schematic diagram of the experimental setup used for generating color-coded Fourier basis patterns and imaging reflective specimen. (RFP: rear focal plane; BS: beam splitter)

To demonstrate the performance of the proposed system, we imaged an integrated circuit chip. For comparison, we again built 73 single-pixel detectors from the monochrome 2D sensor, with the same layout as in Fig. 1. Figure 16 shows four color-perspective images of the specimen recovered from different single-pixel detectors. These images show that the second sampling scheme can also reconstruct multiple color-perspective images of a reflective specimen. Compared with the perspective images of the first sampling scheme (Fig. 7), the spatial lateral resolution is lower, because the spatial and color information are simultaneously sampled from the illumination side, and recording the color information sacrifices spatial resolution.

Fig. 16. (a1)–(a4) Locations of the single-pixel detectors; (b1)–(b4) perspective images retrieved from the rightmost, topmost, leftmost, and bottommost single-pixel detectors, respectively.

Based on all the color-perspective images, we can refocus the specimen at different depths by using the digital refocusing algorithm described in subsection 2.2. Figure 17 shows three images digitally refocused at different depths. To show the entire digital refocusing process from $- 50$ µm to $50$ µm, we also synthesized a video (see Visualization 4) from all the refocused images. These results indicate that the second scheme can also reconstruct the full-color light field of a reflective specimen. As before, the lateral resolution of the refocused images is lower than with the first sampling scheme (Fig. 8), because it is determined by the resolution of the perspective images, which is lower in the second scheme.

Fig. 17. Digitally refocused images (Visualization 4). (a) $z ={-} 22$ µm; (b) $z ={-} 1$ µm; (c) $z = 15$ µm.

4. Discussion

4.1 Comparison of the spatial lateral resolutions between the two proposed sampling schemes

Note that in the first sampling scheme we use a DMD to generate the Fourier basis patterns, whereas in the second we use a projector to generate the color-coded Fourier basis patterns. Because the pixel size of the DMD differs from that of the projector, the effective pixel sizes of the Fourier basis patterns differ as well, and a quantitative experimental comparison is not possible. We therefore compare the spatial lateral resolution of the images reconstructed by the two proposed sampling schemes theoretically. When analyzing the lateral resolution of the reconstructed image, both geometric-optics and wave-optics effects must be taken into account.

In the geometric-optics analysis, we assume that the two proposed imaging systems are free of aberrations. Under this assumption, the resolution is determined by the effective pixel size of the Fourier basis patterns, because the spatial information is sampled by these patterns. According to the Nyquist sampling theorem, the geometric resolution limit, defined as the smallest separation between two points on an object that can be distinguished, is given by

$${\rho _{geo}} = 2\frac{{{p_{Fp}}}}{M},$$
where ${p_{Fp}}$ is the effective pixel size of the Fourier basis patterns and $M$ is the magnification of the projection objective.

In the wave-optics analysis, the resolution depends on the numerical aperture (NA) of the objective and the wavelength of the light source. According to the Rayleigh criterion, the shortest distance between two object points that can be resolved is

$${\rho _{dif}} = 0.61\frac{\lambda }{{NA}},$$
where $\lambda$ is the wavelength of the light source.

When considering both the geometric and wave optics effects, the lateral resolution of the perspective image is given by

$${\rho _{pers\_img}} = \max \{{{\rho_{geo}},\textrm{ }{\rho_{dif}}} \}.$$
As the refocused images are obtained by shifting and summing all the perspective images, the lateral resolution of the refocused images is determined by that of the perspective images. Although the resolution improves somewhat with increasing refocusing depth, owing to the redundancy among the perspective images, this gain is never larger than a factor of 2 [28,29]. Therefore, the spatial lateral resolution of the refocused image lies in the interval
$${\rho _{refoc}} \in \left[ {\frac{{{\rho_{pers\_img}}}}{{2}},\textrm{ }{\rho_{pers\_img}}} \right].$$
For a quantitative comparison, assume the effective pixel size of the Fourier basis patterns is 27.2 µm, the numerical aperture of the projection objective is 0.4, its magnification is 20, and the wavelength of the light source is 0.633 µm. For the first sampling scheme, Eq. (4) gives a geometric-optics resolution of 2.72 µm and Eq. (5) gives a wave-optics resolution of 0.97 µm. Considering both effects, Eq. (6) gives a lateral resolution of the reconstructed perspective image of 2.72 µm. Finally, according to Eq. (7), the spatial lateral resolution of the refocused image lies in the interval [1.36 µm, 2.72 µm].

For the second sampling scheme, Eq. (4) again gives a geometric-optics resolution of 2.72 µm. However, because the color and spatial information are simultaneously sampled from the illumination side with the color-coded Fourier basis patterns, the geometric resolution limit worsens by a factor of 2, to 5.44 µm. With the same projection objective and wavelength as before, the wave-optics resolution is still 0.97 µm. Consequently, considering both geometric- and wave-optics effects, the spatial lateral resolution of the reconstructed perspective image is 5.44 µm, and according to Eq. (7) the spatial lateral resolution of the refocused image lies in the interval [2.72 µm, 5.44 µm]. From this analysis we conclude that, under the same experimental conditions, the first sampling scheme yields a higher spatial lateral resolution than the second.
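The numbers above follow directly from Eqs. (4)–(6); a short numerical check:

```python
# Numerical check of the resolution estimates in subsection 4.1
p_Fp = 27.2          # effective pixel size of the Fourier basis patterns, um
M = 20               # magnification of the projection objective
NA = 0.4             # numerical aperture
wavelength = 0.633   # um

rho_geo = 2 * p_Fp / M            # Eq. (4): geometric-optics limit
rho_dif = 0.61 * wavelength / NA  # Eq. (5): Rayleigh diffraction limit

rho_pers_1 = max(rho_geo, rho_dif)      # first scheme, Eq. (6)
rho_pers_2 = max(2 * rho_geo, rho_dif)  # second scheme: color coding doubles
                                        # the geometric limit

print(f"{rho_geo:.2f} {rho_dif:.2f} {rho_pers_1:.2f} {rho_pers_2:.2f}")
# -> 2.72 0.97 2.72 5.44
```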

4.2 Comparison between the two proposed sampling schemes and the color light-field microscope based on the micro-lens array

We compare the proposed sampling schemes with the color light-field microscope based on a micro-lens array in terms of space bandwidth product, chromatic aberration, and imaging time.

Space bandwidth product: Spatial resolution is often used to evaluate the performance of an imaging system. For the color light-field microscope, however, the spatial, angular, and color resolutions are mutually coupled and constrained. It is therefore inadvisable to compare the spatial resolution alone between the color light-field microscope based on a micro-lens array and the two proposed systems. Here we instead compare the space bandwidth product, because it is often used to represent the overall information capacity of an imaging system [30]. The main difference between the light-field microscope based on a micro-lens array and the two proposed systems is the sampling scheme. Hence, we assume that the objectives used in the three light-field microscopes are identical, and that the space bandwidth product of the objective is larger than that of the sampling devices. Under these assumptions, the space bandwidth product of the color light-field microscope based on a micro-lens array is determined by the pixelated 2D sensor alone, whereas the space bandwidth product of each of the two proposed systems is determined by both the pixelated 2D sensor and the DMD. The overall space bandwidth products of the two proposed systems are therefore larger than that of the color light-field microscope based on a micro-lens array. Consequently, using the two proposed sampling schemes, we can obtain more spatial, angular, and color information, and the spatial, angular, and color resolutions can be improved compared with the color light-field microscope based on a micro-lens array.

Chromatic aberration: It has been noted in [7,8] that the micro-lens array introduces chromatic aberration. Since the two proposed sampling schemes do not use a micro-lens array, this chromatic aberration is avoided. In addition, all the lenses used in the proposed systems are achromatic. Hence, the chromatic aberration of the perspective images and the refocused images will be very small.

Imaging time: The imaging time includes the recording time and the reconstruction time. The light-field microscope based on a micro-lens array requires recording only one image, whereas the two sampling schemes proposed here suffer from long recording times because of the use of single-pixel imaging. Using a high-speed DMD, an array of bucket detectors, and multi-channel data acquisition cards may help reduce the recording time of the proposed systems.

The reconstruction time of the two proposed sampling schemes is much longer than that of the color light-field microscope based on a micro-lens array, because each perspective image is reconstructed with the single-pixel imaging method. In particular, the first sampling scheme requires reconstructing three perspective images from every single-pixel detector, whereas the second sampling scheme requires reconstructing only one. The first sampling scheme therefore needs more reconstruction time than the second. Using graphics processing units (GPUs) may help reduce the reconstruction time.

5. Conclusion

In summary, we report two schemes for full-color, achromatic light-field microscopic imaging. Experimental results demonstrate that both schemes can reconstruct a full-color light field without a micro-lens array, so the chromatic aberration induced by the micro-lens array is avoided. The experimental results also show that the first scheme reconstructs a higher-resolution light field than the second, as it records the color information without sacrificing spatial resolution. Thus, for applications where high-resolution full-color light-field microscopic imaging is the priority, the first scheme is the preferred choice. The second scheme, in turn, is more efficient than the first, as it requires recovering only a single perspective image from every single-pixel detector; for applications where efficiency is the priority, the second scheme is preferred. Both schemes are validated by imaging transmissive and reflective specimens, and may therefore find applications in the biomedical and material sciences.

Funding

National Natural Science Foundation of China (61875074, 61605126, 61605063); Guangdong Provincial Applied Science and Technology Research and Development Program (2019A1515011151); National College Students Innovation and Entrepreneurship Training Program (201910559048).

Disclosures

The authors declare no conflicts of interest.

References

1. G. Wu, B. Masia, S. Jarabo, Y. Zhang, L. Wang, Q. Dai, T. Chai, and Y. Liu, “Light Field Image Processing: An Overview,” IEEE J. Sel. Top. Signal Process. 11(7), 926–954 (2017). [CrossRef]  

2. M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, “Light field microscopy,” ACM T. Graphic. 25(3), 924–934 (2006). [CrossRef]  

3. M. Levoy, Z. Zhang, and I. McDowall, “Recording and controlling the 4D light field in a microscope using microlens arrays,” J. Microsc. 235(2), 144–162 (2009). [CrossRef]  

4. M. Broxton, L. Grosenick, S. Yang, N. Cohen, A. Andalman, K. Deisseroth, and M. Levoy, “Wave optics theory and 3-D deconvolution for the light field microscope,” Opt. Express 21(21), 25418–25439 (2013). [CrossRef]  

5. J. W. Goodman, Introduction to Fourier Optics (Roberts, 2005).

6. X. Lin, J. Wu, G. Zheng, and Q. Dai, “Camera array based light field microscopy,” Biomed. Opt. Express 6(9), 3179–3189 (2015). [CrossRef]  

7. R. Lin, V. Su, S. Wang, M. Chen, T. Chung, Y. Chen, H. Kuo, J. Chen, J. Chen, Y. Huang, J. Wang, C. Chu, P. Wu, T. Li, Z. Wang, S. Zhu, and D. Tsai, “Achromatic metalens array for full-colour light-field imaging,” Nat. Nanotechnol. 14(3), 227–231 (2019). [CrossRef]  

8. Z. Fan, H. Qiu, H. Zhang, X. Pang, L. Zhou, L. Liu, H. Ren, Q. Wang, and J. Dong, “A broadband achromatic metalens array for integral imaging in the visible,” Light: Sci. Appl. 8(1), 67 (2019). [CrossRef]  

9. M. F. Duarte, M. A. Davenport, D. Takhar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, “Single-pixel imaging via compressive sampling,” IEEE Signal Proc. Mag. 25(2), 83–91 (2008). [CrossRef]  

10. W. L. Chan, K. Charan, D. Takhar, K. F. Kelly, R. G. Baraniuk, and D. M. Mittleman, “A single-pixel terahertz imaging system based on compressed sensing,” Appl. Phys. Lett. 93(12), 121105 (2008). [CrossRef]  

11. Z. Zhang, X. Ma, and J. Zhong, “Single-pixel imaging by means of Fourier spectrum acquisition,” Nat. Commun. 6(1), 6255 (2015). [CrossRef]  

12. M. Sun, M. Edgar, G. Gibson, B. Sun, N. Radwell, R. Lamb, and M. Padgett, “Single-pixel three-dimensional imaging with time-based depth resolution,” Nat. Commun. 7(1), 12010 (2016). [CrossRef]  

13. D. Phillips, M. Sun, J. Taylor, M. Edgar, S. Barnett, and G. Gibson, “Adaptive foveated single-pixel imaging with dynamic supersampling,” Sci. Adv. 3(4), e1601782 (2017). [CrossRef]  

14. M. P. Edgar, G. M. Gibson, and M. J. Padgett, “Principles and prospects for single-pixel imaging,” Nat. Photonics 13(1), 13–20 (2019). [CrossRef]  

15. M. Sun and J. Zhang, “Single-pixel imaging and its application in three-dimensional reconstruction: a brief review,” Sensors 19(3), 732 (2019). [CrossRef]  

16. N. Radwell, K. J. Mitchell, G. M. Gibson, M. P. Edgar, R. Bowman, and M. J. Padgett, “Single-pixel infrared and visible microscope,” Optica 1(5), 285–289 (2014). [CrossRef]  

17. S. S. Welsh, M. P. Edgar, R. Bowman, P. Jonathan, B. Sun, and M. J. Padgett, “Fast full-color computational imaging with single-pixel detectors,” Opt. Express 21(20), 23068–23074 (2013). [CrossRef]  

18. B. Sun, M. P. Edgar, R. Bowman, L. E. Vittert, S. Welsh, A. Bowman, and M. J. Padgett, “3D computational imaging with single-pixel detectors,” Science 340(6134), 844–847 (2013). [CrossRef]  

19. M. Yao, J. Cheng, Z. Huang, Z. Zhang, S. Li, J. Peng, and J. Zhong, “Reflection light-field microscope with a digitally tunable aperture by single-pixel imaging,” Opt. Express 27(23), 33040–33050 (2019). [CrossRef]  

20. J. Peng, M. Yao, J. Cheng, Z. Zhang, S. Li, G. Zheng, and J. Zhong, “Micro-tomography by single-pixel imaging,” Opt. Express 26(24), 31094–31105 (2018). [CrossRef]  

21. R. Usami, T. Nobukawa, M. Miura, N. Ishii, E. Watanabe, and T. Muroi, “Dense parallax image acquisition method using single-pixel imaging for integral photography,” Opt. Lett. 45(1), 25–28 (2020). [CrossRef]  

22. M. Sun, W. Chen, T. Liu, and L. Li, “Image retrieval in spatial and temporal domains with a quadrant detector,” IEEE Photonics J. 9(5), 3901206 (2017). [CrossRef]  

23. Z. Zhang, X. Wang, G. Zheng, and J. Zhong, “Hadamard single-pixel imaging versus Fourier single-pixel imaging,” Opt. Express 25(16), 19619–19639 (2017). [CrossRef]  

24. R. Gonzalez and R. Woods, Digital Image Processing (Addison-Wesley, Massachusetts, 1992).

25. Z. Zhang, X. Wang, G. Zheng, and J. Zhong, “Fast Fourier single-pixel imaging via binary illumination,” Sci. Rep. 7(1), 12029 (2017). [CrossRef]  

26. Z. Zhang, S. Liu, J. Peng, M. Yao, G. Zheng, and J. Zhong, “Simultaneous spatial, spectral, and 3D compressive imaging via efficient Fourier single-pixel measurements,” Optica 5(3), 315–319 (2018). [CrossRef]  

27. H. S. Malvar, L. W. He, and R. Cutler, “High-quality linear interpolation for demosaicing of Bayer-patterned color images,” in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP, 2004), pp. 485–488.

28. H. Navarro, E. Sánchez-Ortiga, G. Saavedra, A. Llavador, A. Dorado, M. Martínez-Corral, and B. Javidi, “Non-homogeneity of lateral resolution in integral imaging,” J. Disp. Technol. 9(1), 37–43 (2013). [CrossRef]  

29. M. Martínez-Corral and B. Javidi, “Fundamentals of 3D imaging and displays: a tutorial on integral imaging, light-field, and plenoptic systems,” Adv. Opt. Photonics 10(3), 512–566 (2018). [CrossRef]  

30. H. Wang, Z. Göröcs, W. Luo, Y. Zhang, Y. Rivenson, L. Bentolila, and A. Ozcan, “Computational out-of-focus imaging increases the space–bandwidth product in lens-based coherent microscopy,” Optica 3(12), 1422–1429 (2016). [CrossRef]  

Supplementary Material (4)

Visualization 1: Light-field refocusing of a transmitted specimen by using a white LED and structured patterns
Visualization 2: Light-field refocusing of a reflective specimen by using structured patterns and a color 2D sensor
Visualization 3: Light-field refocusing of a transmitted specimen by using color-coded structured patterns and a monochrome camera
Visualization 4: Light-field refocusing of a reflective specimen by using color-coded structured patterns and a monochrome camera



Figures (17)

Fig. 1.
Fig. 1. Schematic diagram of the experimental setup used for generating Fourier basis patterns and imaging transmitted specimens. (RFP: rear focal plane)
Fig. 2.
Fig. 2. (a1)–(a3) Three sets of 1D intensity sequences recorded by the center single-pixel detector, corresponding to red, green, and blue channels, respectively; (b1)–(b3) red, green, and blue perspective images recovered from the 1D intensity sequences; (c) color-perspective image synthesized by Figs. 2(b1)–2(b3).
Fig. 3.
Fig. 3. Reconstruction procedure of full-color light-field microscopy by using Fourier basis patterns and color camera.
Fig. 4.
Fig. 4. (a1)–(a4) Locations of the single-pixel detectors; (b1)–(b4) color-perspective images retrieved from the rightmost, topmost, leftmost, and bottommost single-pixel detectors, respectively.
Fig. 5.
Fig. 5. Digitally refocused images (Visualization 1). (a) $z = -77$ µm; (b) $z = -10$ µm; (c) $z = 37$ µm.
Fig. 6.
Fig. 6. Schematic diagram of the experimental setup used for generating Fourier basis patterns and imaging reflective specimen. (RFP: rear focal plane; BS: beam splitter)
Fig. 7.
Fig. 7. (a1)–(a4) Locations of the single-pixel detectors; (b1)–(b4) color-perspective images retrieved from the rightmost, topmost, leftmost, and bottommost single-pixel detectors, respectively.
Fig. 8.
Fig. 8. Digitally refocused images (Visualization 2). (a) $z = -14$ µm; (b) $z = 15$ µm; (c) $z = 38$ µm.
Fig. 9.
Fig. 9. Schematic diagram of the experimental setup used for generating color-coded Fourier basis patterns and imaging transmitted specimens. (RFP: rear focal plane)
Fig. 10.
Fig. 10. Principle of generating color-coded Fourier basis pattern [25].
Fig. 11.
Fig. 11. (a) 1D intensity sequence recorded by the center single-pixel detector; (b) mosaic-like image recovered from the 1D intensity sequence; (c) partially enlarged view of Fig. 11(b); (d) color image recovered from Fig. 11(b).
Fig. 12.
Fig. 12. Reconstruction procedure of full-color light-field microscopy by using color-coded Fourier basis patterns and monochrome camera.
Fig. 13.
Fig. 13. (a1)–(a4) Locations of the single-pixel detectors; (b1)–(b4) perspective images retrieved from the rightmost, topmost, leftmost, and bottommost single-pixel detectors, respectively.
Fig. 14.
Fig. 14. Digitally refocused images (Visualization 3). (a) $z = -53$ µm; (b) $z = 22$ µm; (c) $z = 59$ µm.
Fig. 15.
Fig. 15. Schematic diagram of the experimental setup used for generating color-coded Fourier basis patterns and imaging reflective specimens. (RFP: rear focal plane; BS: beam splitter)
Fig. 16.
Fig. 16. (a1)–(a4) Locations of the single-pixel detectors; (b1)–(b4) perspective images retrieved from the rightmost, topmost, leftmost, and bottommost single-pixel detectors, respectively.
Fig. 17.
Fig. 17. Digitally refocused images (Visualization 4). (a) $z = -22$ µm; (b) $z = -1$ µm; (c) $z = 15$ µm.

Equations (7)


$$\left\{\begin{aligned} \Delta x_i &= x_i - x_{center} \\ \Delta y_i &= y_i - y_{center} \end{aligned}\right. \tag{1}$$

$$\left\{\begin{aligned} \theta_{x_i} &= \arctan\!\left(\frac{\Delta x_i \tan(\arcsin(NA/n))}{R}\right) \\ \theta_{y_i} &= \arctan\!\left(\frac{\Delta y_i \tan(\arcsin(NA/n))}{R}\right) \end{aligned}\right. \tag{2}$$

$$\left\{\begin{aligned} \Delta s_{x_i} &= \frac{M\,\Delta z \tan(\theta_{x_i})}{s_{DMD}} = \frac{M\,\Delta z\,\Delta x_i \tan(\arcsin(NA/n))}{s_{DMD}\,R} \\ \Delta s_{y_i} &= \frac{M\,\Delta z \tan(\theta_{y_i})}{s_{DMD}} = \frac{M\,\Delta z\,\Delta y_i \tan(\arcsin(NA/n))}{s_{DMD}\,R} \end{aligned}\right. \tag{3}$$

$$\rho_{geo} = \frac{2 p_F}{M_p} \tag{4}$$

$$\rho_{dif} = \frac{0.61\,\lambda}{NA} \tag{5}$$

$$\rho_{pers\_img} = \max\{\rho_{geo},\ \rho_{dif}\} \tag{6}$$

$$\rho_{refoc} \in \left[\frac{\rho_{pers\_img}}{2},\ \rho_{pers\_img}\right] \tag{7}$$
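Eqs. (1)–(3) translate directly into a digital shift-and-add refocusing step: each perspective image is shifted according to its detector offset and the chosen depth, then the shifted images are averaged. The NumPy sketch below illustrates this under stated assumptions — the default values of `R`, `n`, and the detector offsets are illustrative only, and sub-pixel shifts are rounded to integers via `np.roll`, which is a simplification of a real implementation.

```python
import numpy as np

def perspective_shift(dx, dy, dz, NA=0.4, n=1.0, M=20.0, s_dmd=27.2, R=50.0):
    """Eqs. (1)-(3): shift (in DMD pixels) of the perspective image seen by a
    single-pixel detector offset (dx, dy) from the center, for refocusing
    depth dz. R (detector-plane radius) and n (refractive index) are
    illustrative placeholder values, not parameters from the paper."""
    t = np.tan(np.arcsin(NA / n))            # tan(arcsin(NA/n)) term
    theta_x = np.arctan(dx * t / R)          # Eq. (2)
    theta_y = np.arctan(dy * t / R)
    sx = M * dz * np.tan(theta_x) / s_dmd    # Eq. (3)
    sy = M * dz * np.tan(theta_y) / s_dmd
    return sx, sy

def refocus(perspectives, offsets, dz):
    """Shift-and-add refocusing (integer-pixel approximation)."""
    acc = np.zeros_like(perspectives[0], dtype=float)
    for img, (dx, dy) in zip(perspectives, offsets):
        sx, sy = perspective_shift(dx, dy, dz)
        acc += np.roll(img, (int(round(sy)), int(round(sx))), axis=(0, 1))
    return acc / len(perspectives)
```

At `dz = 0` every shift vanishes and the result is simply the mean of the perspective images, i.e., the in-focus plane; sweeping `dz` produces the refocused stack shown in the visualizations.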