
Single-shot sequential projection phase retrieval and 3D localization from chromatic aberration

Open Access

Abstract

A phase retrieval method based on sequential projection and chromatic aberration is reported. The system consists of a red, green, and blue (RGB) LED, an objective, and a color camera. Owing to the chromatic aberration of the objective, the three color images obtained by the color camera correspond to three equivalent propagation planes. The equivalent relative distances among these planes can be obtained by defining and iteratively maximizing a convergence index. A sequential projection strategy is then used for phase retrieval in each image plane. Based on the recovered phase information and the angular spectrum propagation principle, digital refocusing and 3D localization can be achieved for each subregion of the sample. Finally, the feasibility of our method is demonstrated by simulations and experiments.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

An optical wave, consisting of an amplitude and a phase, is a powerful tool for microscopy [1]. Given that most biological samples are transparent, only the phase changes after light has transmitted through them. On this basis, the phase carries more information about a sample’s structure and composition than the amplitude. However, the phase oscillates far too fast to be detected directly and can only be inferred from intensity measurements. Phase imaging, the recovery of phase information from intensity measurements, has been an endeavor in microscopy since the early 1900s [2–9] and can be classified into two categories. The first category involves acquiring a phase-contrast image of an object; an example is differential interference contrast [4,10], which provides only limited additional sample information. The other category is quantitative phase imaging, which quantifies the phase of an object; it includes holographic methods based on interference [11–13], Shack-Hartmann wave-front sensing [14], the transport-of-intensity equation [15–17], and coherent diffraction imaging. However, these methods suffer from one or more restrictions, such as sensitivity to disturbance, complex system implementation, and high demands on light-source coherence.

Owing to recent developments in computer science, many iterations can be performed effectively in a short time. Thus, methods based on iterative projection and reflection among multiple measurements have been developed. For example, the ptychographic iterative engine (PIE) [18–22], which uses position diversity, and Fourier ptychographic microscopy (FPM) [23–27], which uses angular diversity, have been implemented and extended. However, the implementation and operation of PIE and FPM remain complex. Similar to position and angle diversity, defocus-distance diversity can be utilized for phase retrieval [28]. The conventional way to realize defocus diversity is to shift the sample or camera along the optical axis and capture an image at each position. Nevertheless, high-precision mechanical movement introduces problems such as system complexity, cost, and long acquisition time. In fact, defocus diversity can also be achieved using the aberration characteristics of objectives [29]: images at different defocus distances are obtained through chromatic aberration instead of precise mechanical adjustment. Although this strategy greatly reduces the complexity of implementing and operating a system, the distances among the RGB planes are generally on the order of microns or even smaller and are thus difficult to obtain. Furthermore, the subregions of a sample usually do not lie in the same plane perpendicular to the optical axis, so subregions at different positions may have different defocus distances. Therefore, refocusing each subregion to its respective position is more reasonable than refocusing the entire sample to a single plane, as conventional refocusing methods do.

In this paper, we present a single-shot phase retrieval and 3D localization strategy based on sequential projection. The system can be easily constructed with an RGB LED light source, a telecentric objective, and a color camera. We demonstrate the accuracy of the proposed method by simulations and experiments. This paper is organized as follows. The sequential projection phase retrieval principle is presented in Section 2.1. How the three color images correspond to different propagation distances through chromatic aberration is discussed in Section 2.2. The color-channel cross-talk elimination and 3D localization methods are presented in Sections 2.3 and 2.4, respectively. Section 3 presents simulations that prove the effectiveness of the proposed method. We experimentally demonstrate phase retrieval and 3D localization in Section 4 using a USAF resolution chart and a paramecium specimen as samples. Finally, conclusions are summarized in Section 5.

2. Principle and method

2.1 Sequential projection

The sequential projection algorithm, a kind of optimization strategy, is utilized to recover phase information from intensity images [30–32]. It is implemented by imposing several supports or constraints alternately. As shown in Fig. 1(a), the simplest case involves two constraints, A and B; this case is named alternating projection, and the intersection of the constraints indicates the optimal solution. Starting with an initial guess ${x_0}$ and projecting the intermediate solution between A and B alternately, the optimal solution ${x^\ast }$ can be found. In Fig. 1(a), A and B are convex sets in ${R^n}$, and ${P_A}$ and ${P_B}$ denote the projections onto A and B, respectively. The whole procedure can be expressed mathematically as follows:

$${y_k} = {P_B}({x_k}), {x_{k + 1}} = {P_A}({y_k}),\textrm{ k = 0,1,2,}\ldots.$$
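Eq. (1) can be made concrete with a toy numerical example. The following Python sketch (our illustration, not from the paper) alternates projections between a line and a disk in ${R^2}$, two convex sets whose intersection the iterates approach:

```python
import numpy as np

def project_onto_line(p):
    # Orthogonal projection onto the line y = x (unit direction (1, 1)/sqrt(2)).
    d = np.array([1.0, 1.0]) / np.sqrt(2.0)
    return np.dot(p, d) * d

def project_onto_disk(p, center=np.array([1.0, 1.0]), radius=0.5):
    # Projection onto a closed disk: points outside are pulled back to the boundary.
    v = p - center
    n = np.linalg.norm(v)
    return p if n <= radius else center + v * (radius / n)

x = np.array([3.0, 0.0])        # initial guess x_0
for _ in range(50):             # Eq. (1): y_k = P_B(x_k), x_{k+1} = P_A(y_k)
    y = project_onto_disk(x)    # P_B
    x = project_onto_line(y)    # P_A
print(x)  # lands in the intersection of the line and the disk
```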

Fig. 1. Geometric representation of alternating projection. (a) Alternating projection between two convex sets; (b) a situation involving a nonconvex set.

In the case shown in Fig. 1(a), finding the optimal solution is guaranteed because both A and B are convex constraints. However, as shown in Fig. 1(b), if non-convex constraints are involved, the alternating projection between two constraints may lead to a local optimum instead of the global one.

An effective way to improve the probability of escaping the local optimum in Fig. 1(b) is to introduce additional constraints and project among them alternately. When there are more than two constraint sets, the algorithm is called “sequential projection” or “cyclic projection”. As shown in Fig. 2(b), sequential projection among constraints A, B, and C can escape the local optimum shown in Fig. 2(a) and reach the global optimum.

Fig. 2. Geometric comparison between (a) alternating projection and (b) sequential projection.

The phase retrieval problem refers to recovering the phase information of a microscopic sample from intensity measurements. Intensity images recorded at different propagation distances can serve as the different constraints illustrated in Fig. 1 and Fig. 2. The complex amplitude of the light field at one position can be written as:

$$O(\overrightarrow r ) = A \cdot {e^{(j \cdot \overrightarrow k \cdot \overrightarrow r )}}.$$

where A and ${e^{(j \cdot \overrightarrow k \cdot \overrightarrow r )}}$ refer to the amplitude and phase, respectively; $\overrightarrow k$ is the wave vector with magnitude $k = \frac{{2\pi }}{\lambda }$; and $\overrightarrow r$ indicates the three-dimensional position. To obtain three intensity constraints, we propagate the light field to three planes and record the corresponding intensity images. For example, we record the images at positions ${z_1}$, ${z_2}$, and ${z_3}$ during the propagation of $O(z)$. The raw intensity data can be expressed by Eq. (3):

$${I_i} = {\left| {A_i \cdot {e^{(j \cdot \overrightarrow {{k_{{z_i}}}} \cdot {z_i})}}} \right|^2}, \quad i = 1,2,3.$$
where ${I_1}$, ${I_2}$, and ${I_3}$ represent the constraint sets A, B, and C in Fig. 2(b), respectively. The sequential projection for phase retrieval can then be expressed as follows (a code sketch is given after the list):
  • Step 1: initialize the object guess $O(x,y) = \sqrt {{I_1}(x,y)} \cdot {e^{(j \cdot \varphi )}}$ at plane ${z_1}$;
  • Step 2: propagate $O(x,y)$ to plane ${z_2}$, corresponding to constraint B, to obtain ${O_2}(x,y) = {A_2} \cdot {e^{(j \cdot {\varphi _2})}}$; replace the amplitude with the measurement $\sqrt {{I_2}}$ at ${z_2}$ while keeping the phase unchanged, giving ${O_2}(x,y) = \sqrt {{I_2}} \cdot {e^{(j \cdot {\varphi _2})}}$;
  • Step 3: propagate ${O_2}(x,y)$ to plane ${z_3}$, constraint C in Fig. 2(b); replace the amplitude with $\sqrt {{I_3}}$, keeping the phase unchanged, to obtain ${O_3}(x,y) = \sqrt {{I_3}} \cdot {e^{(j \cdot {\varphi _3})}}$;
  • Step 4: propagate ${O_3}(x,y)$ back to plane ${z_1}$, constraint A; replace the amplitude with $\sqrt {{I_1}}$, keeping the phase unchanged, to obtain ${O_1}(x,y) = \sqrt {{I_1}} \cdot {e^{(j \cdot {\varphi _1})}}$;
  • Step 5: repeat Steps 2–4 until convergence.
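As referenced above, the following Python sketch outlines Steps 1–5, assuming monochromatic plane-wave illumination; the angular-spectrum propagator, the function names, and the evanescent-wave cutoff are our choices, not code from the paper:

```python
import numpy as np

def angular_spectrum(field, wavelength, pixel_size, z):
    # Propagate a complex field over distance z (may be negative) using the
    # angular spectrum method; all lengths are in meters.
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z)
    H[arg < 0] = 0.0                       # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

def sequential_projection(I, z, wavelength, pixel_size, n_iter=30):
    # I: the three measured intensities at planes z[0], z[1], z[2].
    field = np.sqrt(I[0]).astype(complex)  # Step 1: flat initial phase
    cur = 0
    for _ in range(n_iter):
        for nxt in (1, 2, 0):              # Steps 2-4: z1 -> z2 -> z3 -> z1
            field = angular_spectrum(field, wavelength, pixel_size, z[nxt] - z[cur])
            # Amplitude replacement: keep the phase, impose the measurement.
            field = np.sqrt(I[nxt]) * np.exp(1j * np.angle(field))
            cur = nxt
    return field                           # complex field at plane z[0]
```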

2.2 Three defocus planes from chromatic aberration

Following the sequential projection phase retrieval strategy of Section 2.1, intensity images at three different planes are needed to recover the phase information of a microscopic sample. As mentioned previously, instead of precise mechanical adjustment, which would greatly increase the cost and complexity of the system, chromatic aberration can be used to realize three intensity images corresponding to different propagation distances, together with an RGB light source and a color camera. The optical schematic is shown in Fig. 3. An RGB LED (CMN; 470, 517, and 626 nm) is used as the light source, emitting the three wavelengths simultaneously. A telecentric objective with a certain amount of chromatic aberration (OPTO TC23004, magnification 2x, NA≈0.1, FOV 4.25 mm×3.55 mm, working distance 56 mm) is used as the imaging lens. The multi-color intensity images of the object are captured with an industrial color camera (FLIR BFS-U3-31S4C-C). The three RGB images captured by the sensor located at ${z_0}$ represent three defocused images, equivalent to single-wavelength images recorded at ${z_1}$, ${z_2}$, and ${z_3}$. This prototype reduces data acquisition to a single shot. Furthermore, because no mechanical control is involved, it avoids mechanical movement errors and reduces the cost of the system.

Fig. 3. Schematic of the phase retrieval system based on chromatic aberration.

However, measuring the relative distances among these equivalent planes is a critical problem. The conventional method measures these distances by shifting the camera three times along the optical axis and recording the corresponding images; however, the distances are on the order of microns or even submicron, which places very high demands on the accuracy of the mechanical movement.

Based on the sequential projection phase retrieval strategy, an iterative projection approach can recover the distances among the different equivalent defocus planes. We define a convergence index to represent the accumulated error between the calculated and recorded intensity distributions in each iteration [33]. The convergence index can be written as

$$ConvergenceIndex = \sum\limits_i {\frac{{mean(\sqrt {{I_{mi}}} )}}{{\sum\nolimits_{x,y} {abs(\sqrt {{I_{li}}} - \sqrt {{I_{mi}}} )} }}}.$$
where ${I_{mi}}$ and ${I_{li}}$ are the calculated and recorded intensities of each color-channel image, respectively, and x and y are the two-dimensional image coordinates. The equivalent distances among the defocused color images can be obtained by calculating the convergence indices for several equidistant candidate distances. The smaller the difference between ${I_{mi}}$ and ${I_{li}}$, the higher the corresponding convergence index, and the more likely the candidate distance is the correct one.
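A direct transcription of Eq. (4) might look as follows (a sketch; the argument names mirror the subscripts m and l above, and the per-channel intensities are passed as lists of arrays):

```python
import numpy as np

def convergence_index(calculated, recorded):
    # Eq. (4): sum over color channels of mean(sqrt(I_mi)) divided by the
    # accumulated absolute amplitude error between calculated and recorded data.
    ci = 0.0
    for I_m, I_l in zip(calculated, recorded):
        ci += np.mean(np.sqrt(I_m)) / np.sum(np.abs(np.sqrt(I_l) - np.sqrt(I_m)))
    return ci
```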

2.3 Color cross-talk elimination

Obtaining accurate raw data from the RGB channels of a color camera is a precondition of our algorithm. The most common color industrial camera combines a Bayer filter with a complementary metal oxide semiconductor (CMOS) sensor. Nevertheless, the color-separation performance of Bayer filters is imperfect, leading to color cross-talk, or color leakage, among the three color channels (i.e., red, green, and blue) [34,35]. Accordingly, eliminating the color cross-talk among the RGB channel images read directly from the color camera is a necessary procedure in the proposed method.

The process of RGB illumination and imaging can be expressed mathematically with the following matrix equation:

$$\left[ \begin{array}{l} {M_R}\\ {M_G}\\ {M_B} \end{array} \right] = \textrm{W} \cdot \left[ \begin{array}{l} {I_r}\\ {I_g}\\ {I_b} \end{array} \right] = \left[ \begin{array}{ccc} {R_r} &{R_g} &{R_b}\\ {G_r} &{G_g} &{G_b}\\ {B_r} &{B_g} &{B_b} \end{array} \right] \cdot \left[ \begin{array}{l} {I_r}\\ {I_g}\\ {I_b} \end{array} \right],$$
where the matrix $\textrm{W}$ represents the color cross-talk between the color channels; ${M_R}$, ${M_G}$, and ${M_B}$ are the three channel images captured by the color camera, with the subscripts R, G, and B indicating the recorded red, green, and blue channels, respectively. On the right side of Eq. (5), ${I_r}$, ${I_g}$, and ${I_b}$ are the illumination intensities of the red, green, and blue colors, respectively. The cross-talk matrix of a given color camera can be obtained experimentally by controlling the RGB illumination parameters while acquiring the color images. The calculation is written as Eq. (6), where the superscript ‘−1’ denotes the matrix inverse.
$$\textrm{W} = \left[ \begin{array}{ccc} {R_r} &{R_g} &{R_b}\\ {G_r} &{G_g} &{G_b}\\ {B_r} &{B_g} &{B_b} \end{array} \right] = \left[ \begin{array}{l} {M_R}\\ {M_G}\\ {M_B} \end{array} \right] \cdot {\left[ \begin{array}{l} {I_r}\\ {I_g}\\ {I_b} \end{array} \right]^{ - 1}}.$$

The matrix $\textrm{W}$ of a camera is a fixed set of parameters; hence, the color cross-talk need not be recalibrated in subsequent experiments. For the color camera we used (FLIR BFS-U3-31S4C-C), the matrix $\textrm{W}$ at the RGB LED wavelengths (470, 517, and 626 nm) is shown in Table 1.
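As an illustration of applying Eq. (6) in reverse, the sketch below unmixes the captured channels with the inverse of a calibrated $\textrm{W}$; the numerical matrix here is a hypothetical placeholder, not the calibrated values of Table 1:

```python
import numpy as np

# Hypothetical cross-talk matrix (placeholder values, roughly diagonal-dominant).
W = np.array([[0.90, 0.05, 0.02],
              [0.08, 0.85, 0.10],
              [0.02, 0.10, 0.88]])
W_inv = np.linalg.inv(W)

def unmix(M_R, M_G, M_B):
    # Stack the three captured channel images and unmix each pixel:
    # [I_r, I_g, I_b]^T = W^{-1} [M_R, M_G, M_B]^T.
    M = np.stack([M_R, M_G, M_B], axis=-1)   # shape (H, W, 3)
    I = M @ W_inv.T                          # apply W^{-1} along the last axis
    return I[..., 0], I[..., 1], I[..., 2]
```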


Table 1. Color cross-talk matrix W of the FLIR camera at the three wavelengths of the LED used.

2.4 Digital refocusing and 3D localization

As shown in Fig. 3, the sensor plane does not coincide with any of the in-focus color image planes, because a full focus is almost impossible to achieve. Defocus affects the sharpness of images and reduces the imaging quality. In fact, given the amplitude and phase information of the sample at a certain defocus position, we can refocus digitally by following the angular spectrum propagation principle. Even when the whole sample is not distributed in one plane (for example, when a tilted USAF resolution chart is measured, only a small subregion is in focus while the others are defocused to varying degrees), we can still achieve digital refocusing and subregion 3D localization by propagating the recovered complex function according to the angular spectrum propagation principle.

In addition to the refocusing strategy, the digital refocusing and 3D localization process needs a criterion to decide whether the image obtained after propagation over a given distance is in focus. In photography, focus is typically judged by image clarity: a focused image is sharper than a defocused one. Similarly, in our digital refocusing process, the sharpness of the image, represented by a gradient function, is used as the focal-plane criterion. As shown in Eq. (7), the gradient-based evaluation function F is defined as the sum of the squares of $G(x,y)$ over all pixels, where $G(x,y)$ is the gradient matrix obtained by convolving the intensity image with the Laplace kernel. Therefore, F in Eq. (7) reaches its maximum for the in-focus image.

$$F = \sum\limits_x {\sum\limits_y {{G^2}(x,y)} }.$$
The gradient function $G(x,y)$ can be written as:
$$G(x,y) = f(x,y) \otimes L.$$
where L is the Laplace operator, as shown in Eq. (9).
$$L = \left[ \begin{array}{rrr} 0 &1 &0\\ 1 &- 4 &1\\ 0 &1 &0 \end{array} \right].$$
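Eqs. (7)–(9) translate directly into a few lines of code; a minimal sketch using SciPy's convolution (the boundary handling is our choice):

```python
import numpy as np
from scipy.ndimage import convolve

# Laplace kernel of Eq. (9).
L = np.array([[0,  1, 0],
              [1, -4, 1],
              [0,  1, 0]], dtype=float)

def sharpness(intensity):
    # G = f * L (Eq. 8); F = sum of G^2 over all pixels (Eq. 7).
    G = convolve(intensity, L, mode='nearest')
    return np.sum(G**2)
```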

3. Simulation experiment

Before applying the method to actual experimental data, we first evaluate its effectiveness by simulation. Consistent with the experimental parameters, the illumination wavelengths are set to 470, 517, and 626 nm, and the pixel size of the color camera is 3.45 um. The LED illumination is treated as parallel light.

As shown in Fig. 4, we use a picture of a cameraman and a picture of rice as the amplitude and phase ground truth of the object, respectively. Both ground-truth images are $256 \times 256$ pixels. The equivalent defocus distances of the color channels are set as ${z_i} = [ - 70,10,60]um\;(i = 1,2,3)$. The simulated raw RGB data are shown in Fig. 4(c), and the curves of the convergence index versus defocus distance are shown in Fig. 4(d). The simulations are then conducted with the following procedure (see the sketch below); the block diagram of the algorithm is shown in Fig. 5.
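Such raw data can be synthesized with the `angular_spectrum` helper sketched in Section 2.1; in the sketch below the textures are synthetic stand-ins for the cameraman and rice images, and a single wavelength is used for simplicity (the method itself assumes identical sample response across wavelengths):

```python
import numpy as np

# Stand-in ground truth (the paper uses the 'cameraman' and 'rice' pictures).
yy, xx = np.mgrid[0:256, 0:256] / 256.0
amp = 0.5 + 0.5 * np.cos(6 * np.pi * xx)                        # amplitude pattern
phs = np.pi * np.exp(-((xx - 0.5)**2 + (yy - 0.5)**2) / 0.05)   # smooth phase bump
obj = amp * np.exp(1j * phs)

pixel = 3.45e-6                 # camera pixel size
z = (-70e-6, 10e-6, 60e-6)      # equivalent defocus distances
wl = 517e-9                     # single wavelength for simplicity

# Simulated raw data: intensity of the object propagated to each plane.
I = [np.abs(angular_spectrum(obj, wl, pixel, zi))**2 for zi in z]
```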

Fig. 4. Simulation results. (a) Simulated amplitude ground truth. (b) Simulated phase ground truth. (c) Intensity images at ${z_1} ={-} 70um$, ${z_2} = 10um$, and ${z_3} = 60um$, respectively. (d) Curves of the convergence indices versus defocus distance.

Fig. 5. Flow diagram of the sequential projection phase retrieval, digital refocusing, and 3D localization method.

First, the relative distances between the three equivalent defocus planes are obtained using the proposed sequential-projection-based algorithm and the predefined convergence index. All three recorded color images are defocused, so we may pick any channel as the one whose phase distribution is to be recovered. Without loss of generality, we choose the plane corresponding to the blue channel as the reference. The relative distances between the other color defocus planes and the blue channel, ${\textrm{d}_{bg}}$ and ${\textrm{d}_{br}}$, are then obtained by the method stated in Sections 2.1 and 2.2. Given the limited defocus range in a real system, the search range of the optimization can be restricted; we set $|{{\textrm{d}_{bg}}} |\in [60,100]um$ and $|{{\textrm{d}_{br}}} |\in [110,150]um$ to reduce the computational cost. A simulated annealing algorithm is then used to search for the maximum convergence index corresponding to each true distance value (a sketch of the search follows).
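The search itself can be sketched as follows; for brevity, an exhaustive grid search stands in for the simulated annealing used here, and `evaluate` is a hypothetical helper that runs a few alternating-projection iterations at a candidate distance and returns the convergence index of Eq. (4):

```python
import numpy as np

def search_distance(candidates, evaluate):
    # Score every candidate distance with the convergence index and keep the
    # maximum; the paper's simulated annealing would explore this same landscape.
    scores = np.array([evaluate(d) for d in candidates])
    return candidates[int(np.argmax(scores))]

# e.g., d_bg = search_distance(np.arange(60e-6, 100e-6, 2e-6), evaluate)
```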

Second, we initialize the complex function at the green channel image plane as $O(x,y) = \sqrt {{I_1}(x,y)} \cdot {e^{(j \cdot \varphi )}}$. The initial amplitude guess can be taken from the captured raw data, and the initial phase can be a matrix of all ones.

Third, we calculate the convergence indices corresponding to candidate values of ${\textrm{d}_{bg}}$ and ${\textrm{d}_{br}}$ using alternating projection, with the number of iterations set to 30. Evaluating the candidate distances yields the convergence-index curves shown in Fig. 4(d). Following the criterion that the maximum value indicates the true value, the distances of the green and red channels relative to the blue channel image plane are $|{{\textrm{d}_{bg}}} |= 80um$ and $|{{\textrm{d}_{br}}} |= 130um$, consistent with the values originally set in the simulation. The resulting positions are ${z_i} = [ - 80,0,50]um\;(i = 1,2,3)$, i.e., the true positions up to a common 10-um offset introduced by taking the blue plane as the reference.

Fourth, in accordance with the relative distances among the three-color equivalent defocus image planes, the sequential projection approach is implemented to recover the phase distribution at the green channel image plane. The recovered complex function can be expressed as ${O_{updated}}(x,y) = \sqrt {{I_{updated}}(x,y)} \cdot {e^{(j \cdot {\varphi _{updated}})}}$. The amplitude and phase parts are shown in Fig. 6(a) and Fig. 6(b), respectively; they agree closely with the ground truth shown in Fig. 4(a) and Fig. 4(b).

Fig. 6. Recovered amplitude and phase of the object by sequential projection phase retrieval.

Finally, we propagate the recovered complex function over different distances and find the in-focus position using the criterion defined in Eq. (7). Denoting the defocus distance by $\Delta {Z_{defocus}}$, we set an empirical range of $\Delta {Z_{defocus}} \in [ - 50,50]um$. We then use angular spectrum propagation to obtain the complex function at each position and calculate the gradient-based criterion of Eq. (7), yielding the normalized curves of the sum of squared gradients versus defocus distance shown in Fig. 7. We choose two different subregions, each $64 \times 64$ pixels, and determine their defocus distances by maximizing F defined in Eq. (7). As shown in Fig. 7(b2) and Fig. 7(c2), the peaks of both curves correspond to the 10-um position, consistent with the true value.
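This sweep combines the earlier sketches; a minimal version, assuming the `angular_spectrum` and `sharpness` helpers sketched in Sections 2.1 and 2.4:

```python
import numpy as np

def refocus(field, wavelength, pixel_size, z_scan):
    # Propagate the recovered field to each trial distance and score it with
    # the sharpness metric of Eq. (7); the peak marks the in-focus position.
    scores = [sharpness(np.abs(angular_spectrum(field, wavelength, pixel_size, dz))**2)
              for dz in z_scan]
    return z_scan[int(np.argmax(scores))], scores

# z_scan = np.linspace(-50e-6, 50e-6, 101)  # the empirical +/-50 um range
```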

Fig. 7. Digital refocusing results. (a) Recovered amplitude of the object. (b1) Enlarged view of the yellow boxed subregion in (a). (b2) Curve of the gradient Laplace function corresponding to (b1). (c1) Enlarged view of the red boxed subregion in (a). (c2) Curve of the gradient Laplace function corresponding to (c1).

4. Results and discussion

To evaluate the effectiveness of our method experimentally, we use a USAF resolution chart and a biological paramecium sample.

4.1 USAF resolution chart

An amplitude USAF resolution chart is used as a sample to test the phase retrieval and 3D localization ability of the proposed method. We illuminate the sample with RGB light simultaneously and record the color image; the three raw channel intensity images are shown in Figs. 8(a), 8(b), and 8(c), respectively. With the previously obtained color cross-talk matrix, we obtain the pure red, green, and blue images and then carry out phase retrieval and digital refocusing.

Fig. 8. Raw color intensity images: (a) red channel, (b) green channel, and (c) blue channel.

First, we align the USAF resolution chart perpendicular to the optical axis and demonstrate the effectiveness of the digital refocusing approach. The chromatic aberration characteristic of the telecentric objective used in our prototype is shown in Fig. 9(a); it is obtained by calculating the sharpness of the different color images while adjusting the camera relative to the objective. The order of the color-channel focal planes, from near to far from the objective, is green, red, blue, which differs from the order in the simulation. The chromatic aberration of an objective is caused by the dependence of the refractive index of the lens material on wavelength, known as dispersion. In general, the dispersion curve of most materials is monotonic in wavelength. Nevertheless, almost all microscope objectives are corrected for chromatic aberration at the factory; this correction does not eliminate chromatic aberration completely but usually only reduces its magnitude at two wavelengths. For the telecentric objective used in our system, the chromatic aberration between red light and green light is reduced. Therefore, the order of the focal planes of the color channels in our results is “green, red, blue” instead of “green, blue, red”. Figures 9(b) and 9(c) show that the relative distance between the equivalent green and red channel image planes is ${\textrm{d}_{gr}} = 28um$ and that between the equivalent green and blue channel image planes is ${\textrm{d}_{gb}} = 386um$. These values are obtained by the digital refocusing approach stated in Sections 2.1 and 2.4. Furthermore, we tilt the amplitude-only sample to introduce a sloped phase distribution. The recovered phase of the tilted USAF chart is shown in Fig. 9(d), in which the sloped phase distribution introduced by the tilted placement can be clearly seen.

Fig. 9. Experimental results for the USAF chart. (a) Schematic diagram of the chromatic aberration. (b), (c) Convergence-index curves versus the distances of the red and blue channel planes from the green channel plane. (d) Recovered phase distribution of the tilted USAF chart.

4.2 Biological sample

Compared with a resolution chart, whose phase characteristics are not evident and whose features lie almost in one plane, a biological sample better demonstrates the feasibility of the proposed method for phase recovery and 3D localization. The original color image and the three demodulated monochromatic images are shown in Figs. 10(a)-10(d), respectively. Figures 10(e) and 10(f) show the amplitude and phase distributions of the biological sample. The cell nucleus, which is difficult to identify in the raw intensity images, can be clearly seen in the enlargement of Fig. 10(f). The transmittance of the sample is not identical at the different wavelengths, which makes the color image look more like a blue single-channel image.

Fig. 10. Results for the paramecium sample. (a) Captured color image; (b)-(d) RGB monochromatic images; (e) recovered amplitude; (f) recovered phase distribution.

We choose several typical information-rich subregions and perform digital refocusing. Figure 11 shows four subregions, three of which contain cells; their defocus distances are 18, 15, and 7 um. Even though no cell exists in the last subregion, the information it contains is still sufficient for digital refocusing. Therefore, 3D localization can be realized for all subregions of the biological sample. Figure 11(b) shows an example of the digital refocusing process: a curve of the convergence index defined in Eq. (4) versus the propagation distance, whose peak indicates the true defocus value. Furthermore, from the complex function at a given plane, images at any position can be obtained by the digital refocusing process. Figure 11(c) shows an example of intensity images at different propagation distances, which further evidences the effectiveness of digital refocusing.

Fig. 11. Digital refocusing results. (a) Subregions at different defocus distances; (b) refocusing curve; (c) images at different positions obtained through propagation.

5. Conclusion

Phase imaging and refocusing are significant challenges in microscopy. Phase information is usually lost during digital image recording, and in most cases the subregions of a microscopic sample are located at different defocus planes, so simultaneously focusing the entire sample with conventional methods is impossible. In this paper, we propose and demonstrate a sequential projection phase retrieval and 3D localization method utilizing the chromatic aberration of the objective, and we verify its feasibility by simulations and experiments. Defocus-position diversity is introduced by a chromatic objective instead of precise mechanical movement. The relative distances among the color-channel images are then calculated iteratively, and the obtained distances are used to recover the phase distribution with a sequential projection algorithm. Digital refocusing and 3D localization are further achieved from the recovered phase information on the basis of the angular spectrum propagation principle.

An amplitude USAF resolution chart and a biological sample are used in the experiments to verify the feasibility of our method. The sequential projection algorithm used in this paper is of the Gerchberg-Saxton (GS) type; other algorithms, such as the difference map (DM) or relaxed averaged alternating reflections (RAAR), can be utilized in subsequent work [30] to further increase the probability of escaping local optima. A reflection-mode version of the proposed method is in progress. Finally, although the method is proven to be efficient, it rests on the important assumption that the sample responds identically to the different wavelengths; when this assumption does not hold, the quality of phase retrieval and digital refocusing may decrease.

Funding

National Natural Science Foundation of China (61805011, 51735002); National Postdoctoral Program for Innovative Talents (BX201700035); China Postdoctoral Science Foundation (2017M620018).

Disclosures

The authors declare that there are no conflicts of interest related to this article.

References

1. M. Pluta, Advanced Light Microscopy (Elsevier, Amsterdam, 1988).

2. W. Choi, C. Fang-Yen, K. Badizadegan, S. Oh, N. Lue, R. R. Dasari, and M. S. Feld, “Tomographic phase microscopy,” Nat. Methods 4(9), 717–719 (2007). [CrossRef]  

3. A. Barty, K. A. Nugent, D. Paganin, and A. Roberts, “Quantitative optical phase microscopy,” Opt. Lett. 23(11), 817–819 (1998). [CrossRef]

4. C. J. Mann, L. Yu, C. M. Lo, and M. K. Kim, “High-resolution quantitative phase-contrast microscopy by digital holography,” Opt. Express 13(22), 8693–8698 (2005). [CrossRef]  

5. F. Zernike, “Phase contrast, a new method for the microscopic observation of transparent objects part II,” Physica 9(10), 974–986 (1942). [CrossRef]  

6. P. Marquet, B. Rappaz, P. J. Magistretti, E. Cuche, Y. Emery, T. Colomb, and C. Depeursinge, “Digital holographic microscopy: a noninvasive contrast imaging technique allowing quantitative visualization of living cells with subwavelength axial accuracy,” Opt. Lett. 30(5), 468–470 (2005). [CrossRef]  

7. J. Millerd, N. Brock, J. Hayes, M. North-Morris, B. Kimbrough, and J. Wyant, “Pixelated Phasemask Dynamic Interferometers,” Fringe 2005. Springer, Berlin, Heidelberg, 640–647(2006).

8. D. Paganin and K. A. Nugent, “Noninterferometric phase imaging with partially coherent light,” Phys. Rev. Lett. 80(12), 2586–2589 (1998). [CrossRef]  

9. Y. Tan, W. Wang, C. Xu, and S. Zhang, “Laser confocal feedback tomography and nano-step height measurement,” Sci. Rep. 3(1), 2971 (2013). [CrossRef]  

10. M. R. Arnison, K. G. Larkin, C. J. Sheppard, N. I. Smith, and C. J. Cogswell, “Linear phase imaging using differential interference contrast microscopy,” J. Microsc. 214(1), 7–12 (2004). [CrossRef]  

11. I. Yamaguchi and T. Zhang, “Phase-shifting digital holography,” Opt. Lett. 22(16), 1268–1270 (1997). [CrossRef]  

12. U. Schnars, C. Falldorf, J. Watson, and W. Jüptner, “Digital holography,” Digital Holography and Wavefront Sensing. Springer, Berlin, Heidelberg, 39–68 (2015).

13. S. Seo, T. W. Su, D. K. Tseng, A. Erlinger, and A. Ozcan, “Lensfree holographic imaging for on-chip cytometry and diagnostics,” Lab Chip 9(6), 777–787 (2009). [CrossRef]  

14. B. C. Platt and R. Shack, “History and principles of Shack-Hartmann wavefront sensing,” J. Refract. Surg. 17(5), S573–S577 (2001). [CrossRef]  

15. M. R. Teague, “Deterministic phase retrieval: a Green’s function solution,” J. Opt. Soc. Am. 73(11), 1434–1441 (1983). [CrossRef]  

16. L. Waller, L. Tian, and G. Barbastathis, “Transport of intensity phase-amplitude imaging with higher order intensity derivatives,” Opt. Express 18(12), 12552–12561 (2010). [CrossRef]  

17. C. Zuo, Q. Chen, and A. Asundi, “Boundary-artifact-free phase retrieval with the transport of intensity equation: fast solution with use of discrete cosine transform,” Opt. Express 22(8), 9220–9244 (2014). [CrossRef]  

18. J. M. Rodenburg and H. M. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85(20), 4795–4797 (2004). [CrossRef]  

19. G. Li, W. Yang, H. Wang, and G. Situ, “Image transmission through scattering media using ptychographic iterative engine,” Appl. Sci. 9(5), 849 (2019). [CrossRef]  

20. R. Horstmeyer, R. Y. Chen, X. Ou, B. Ames, J. A. Tropp, and C. Yang, “Solving ptychography with a convex relaxation,” New J. Phys. 17(5), 053044 (2015). [CrossRef]  

21. A. M. Maiden, M. J. Humphry, F. Zhang, and J. M. Rodenburg, “Superresolution imaging via ptychography,” J. Opt. Soc. Am. A 28(4), 604–612 (2011). [CrossRef]  

22. J. M. Rodenburg, “Ptychography and related diffractive imaging method,” Adv. Imaging Electron Phys. 150, 87–184 (2008). [CrossRef]  

23. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier ptychographic microscopy,” Nat. Photonics 7(9), 739–745 (2013). [CrossRef]  

24. L. Tian, X. Li, K. Ramchandran, and L. Waller, “Multiplexed coded illumination for Fourier Ptychography with an LED array microscope,” Biomed. Opt. Express 5(7), 2376–2389 (2014). [CrossRef]  

25. C. Zuo, J. Sun, and Q. Chen, “Adaptive step-size strategy for noise-robust Fourier ptychographic microscopy,” Opt. Express 24(18), 20724–20744 (2016). [CrossRef]  

26. L. Bian, J. Suo, G. Zheng, K. Guo, F. Chen, and Q. Dai, “Fourier ptychographic reconstruction using Wirtinger flow optimization,” Opt. Express 23(4), 4856–4866 (2015). [CrossRef]  

27. A. Pan, K. Wen, and B. Yao, “Linear space-variant optical cryptosystem via Fourier ptychography,” Opt. Lett. 44(8), 2032–2035 (2019). [CrossRef]  

28. L. J. Allen and M. P. Oxley, “Phase retrieval from series of images obtained by defocus variation,” Opt. Commun. 199(1-4), 65–75 (2001). [CrossRef]  

29. L. Waller, S. S. Kou, C. J. Sheppard, and G. Barbastathis, “Phase from chromatic aberrations,” Opt. Express 18(22), 22817–22825 (2010). [CrossRef]  

30. S. Marchesini, “Invited article: A unified evaluation of iterative projection algorithms for phase retrieval,” Rev. Sci. Instrum. 78(1), 011301 (2007). [CrossRef]  

31. P. Netrapalli, P. Jain, and S. Sanghavi, “Phase retrieval using alternating minimization,” IEEE Trans. Signal Process. 63(18), 4814–4826 (2015). [CrossRef]  

32. J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. 21(15), 2758–2769 (1982). [CrossRef]  

33. Z. Bian, S. Dong, and G. Zheng, “Adaptive system correction for robust Fourier ptychographic imaging,” Opt. Express 21(26), 32400–32410 (2013). [CrossRef]  

34. W. Lee, D. Jung, S. Ryu, and C. Joo, “Single-exposure quantitative phase imaging in color-coded LED microscopy,” Opt. Express 25(7), 8398 (2017). [CrossRef]  

35. Z. F. Phillips, M. Chen, and L. Waller, “Single-shot quantitative phase microscopy with color-multiplexed differential phase contrast (cDPC),” PLoS One 12(2), e0171228 (2017). [CrossRef]  


