Exploiting a holographic polarization microscope for rapid autofocusing and 3D tracking

Open Access

Abstract

We report a fast autofocusing and accurate 3D tracking scheme for digital holography (DH) that intrinsically exploits a polarization microscope setup with two off-axis illumination beams having different polarizations. This configuration forms twin object images that are recorded in a single digital hologram by an angular and polarization multiplexing technique. We show that the separation of the two images on the recording plane follows a linear relationship with the defocus distance and indicates the defocus direction. Thus, over the entire field of view (FOV), the best focus distance of each object can be retrieved directly by identifying the respective separation distance with a cross-correlation algorithm, while 3D tracking is performed simultaneously by calculating the transverse coordinates of the two images. Moreover, we estimate this linear relationship by numerical propagation of a single hologram in which the focus distance of one of the objects in the FOV is known. We prove the proposed approach for accurate 3D tracking in completely different experimental cases, i.e., recovering the swimming path of a marine alga (Tetraselmis) in water and fast refocusing of ovarian cancer cells under micro-vibration stimulation. The reported experimental results validate the strategy's effectiveness in dynamic measurement and 3D tracking without multiple diffraction calculations or any precise knowledge of the setup. To the best of our knowledge, this is the first time a holographic polarization multiplexing setup has been exploited intrinsically for 3D tracking and/or fast and accurate refocusing. Thanks to these results, almost any polarization DH setup can guarantee accurate focusing along the optical axis in addition to polarization analysis of the sample, thus overcoming the limitation of poor axial resolution.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Digital holography (DH) is a label-free, non-contact, non-invasive, high-resolution imaging technique that retrieves the phase and amplitude distributions of the object wave. It has gained credit as one of the most potent techniques for microscopic analysis in different configurations, including the polarization configuration, which uses two beams with crossed polarizations simultaneously in a multiplexing modality to investigate the birefringence properties of optical fibers [1], sperm cells [2], and biological tissue [3], to quantify the spatially resolved Jones matrix of liquid crystal droplets [4,5] and biological specimens [6], to perform Mueller matrix imaging [7], and to improve the discrimination capability of object recognition [8]. Moreover, among the most attractive features of DH is its intrinsic flexibility for autofocusing and 3D tracking in dynamic scenes, mainly because DH can numerically propagate the object wave to different planes along the optical axis using reconstruction algorithms [9]. Thus, the in-focus image of the object is retrieved by numerical refocusing [10] while the transverse coordinates of the object are evaluated through image segmentation [11,12]. Interestingly, 3D tracking can even be performed directly in a volume using both the intensity and the phase provided by the holographic data, without finding the in-focus image [13,14].

In recent years, different strategies have been proposed to address autofocusing and 3D tracking in DH. Most often, the problem is handled through sequential numerical reconstructions within an estimated distance range, after which refocusing metrics compute the sharpness of each reconstructed image. Spatial-domain criteria are diverse, e.g., self-entropy of the magnitude and phase [15], integrated amplitude modulus [16,17], grey-level variance [18], squared gradient and Laplacian of Gaussian [19,20], and the magnitude differential [21]. Frequency-domain criteria have also proven effective, such as summing the weighted band-pass-filtered power spectra [22]. Sparsity-based methods are widely investigated as well, including the Tamura coefficient [23], Gini's index [24], edge sparsity [25], and others [26–28]. Although effective, these approaches suffer from several limitations, including a large computational demand and long processing times on account of the multi-distance numerical reconstructions, especially when a finer step is required after the coarse search to improve accuracy. Moreover, they have relatively low localization accuracy due to the axial elongation caused by the axial resolution being worse than the lateral one. These restrictions make them cumbersome for dynamic measurements, in which hundreds of holograms with variable focus distances are recorded. Other strategies have been studied to handle autofocusing and 3D tracking in DH with improved speed and accuracy. In recent years, machine learning has proven a promising way to develop fast autofocusing methods for DH. T. Pitkäaho et al. employed the AlexNet architecture to estimate the focus position among reconstructions at different distances [29]. Z. Ren et al. treated autofocusing as a classification problem tackled with a convolutional neural network (CNN), where the hologram is the network input and the focus distance is the predicted class [30]; they later recast autofocusing as a regression problem, where the focus distance is a continuous response to each hologram [31]. J. Lee et al. proposed reducing the learning time and extracting features without information loss by resizing the spectrum of a hologram before feeding it to a CNN [32]. T. Shimobaba et al. proposed a CNN-based regression for depth prediction in DH, where the focus distance is predicted as a continuous value directly from holograms [33]. Although machine learning approaches can estimate the correct focus distance directly, without multiple diffraction calculations, collecting labels is usually a heavy workload and the training procedure is time-consuming.

Schemes based on multiple illumination beams have been widely adopted in DH to handle the poor axial localization accuracy caused by axial elongation. F. Saglimbeni et al. improved the axial localization accuracy by overlapping the three numerical reconstructions obtained with tilted red, green, and blue illumination beams, thereby shrinking the elongation volume [34]. J. Öhman et al. averted the poor localization accuracy by recording the side-scattered light of the object with two identical cameras placed perpendicular to each other in the plane normal to the optical axis [35]. P. Gao et al. used two off-axis illuminations [36] and opposite-view illuminations [37] to determine the correct focus distance by minimizing the variation between the reconstructed images. These methods enhance the localization accuracy; however, all still require multi-distance numerical reconstructions. A. Ozcan et al. employed two illumination angles at two different wavelengths in lens-less DH for 3D tracking [38]. P. Ferraro et al. proposed a twin-beam illumination configuration in off-axis DH for 3D tracking and phase imaging [39]. In these two methods, the position of the object is determined by the intersection of two lines defined by the positions of the two light sources and the corresponding two object images. They can thus recover the positions of objects by detecting the positions of the object images, so the localization accuracy is no longer limited by the axial resolution. However, the lines that determine the position of the object differ for every object and must be updated every time.

References [35,36] indicate that, when the object is illuminated by two off-axis beams, its axial position is encoded in the positions of its two images and can be recovered from them. Inspired by this, we realize that in DH with a two-off-axis-beam configuration, the separation distance of the two images of the object follows a linear relationship with the focus distance of the object. Therefore, the focus distance of the object can be determined by identifying the separation distance of the two object images. This resembles the concept of phase-detection autofocusing in professional photography [40], where the light bundle forming the object image is split into two bundles that form two object images, and the image position is determined by comparing the separation distance of the latter two images. Here, we report a novel fast and accurate autofocusing and 3D tracking scheme for each object in the FOV that simply exploits the intrinsic properties of a multiplexed polarization DH configuration. In fact, in a polarization-sensitive system, when the object is illuminated with two orthogonally polarized off-axis plane beams, the separation distance of the two object images follows a linear relationship with the focus distance, and the sign of the defocus distance is indicated by the separation direction of the two object images. Based on this linear relation, we can rapidly determine the focus distance by identifying the separation distance through cross-correlation analysis, without multi-distance numerical reconstructions, so that autofocusing and 3D tracking are implemented rapidly. Moreover, we estimate this linear relationship by numerical propagation of a single hologram in which the focus distance of one of the objects in the FOV is known. We have performed three different experiments demonstrating accurate 3D tracking with the proposed approach, considering completely diverse samples and experimental situations, i.e., sedimenting microspheres, swimming algae (Tetraselmis), and in-vitro ovarian cancer cells subjected to micro-vibration. The reported method requires no multi-distance reconstruction and no additional light source. It may provide a novel solution for dynamic observation thanks to its rapidness, simplicity, and accuracy whenever a DH polarization setup is adopted.

2. Rapid autofocusing and 3D tracking scheme

The optical configuration of the proposed method is illustrated in Fig. 1. As shown in Fig. 1(a), the light source is a 532 nm single-longitudinal-mode solid-state laser (Changchun New Industries Optoelectronics Tech. Co., Ltd, China) with an output power of 150 mW. A continuously variable neutral density filter (NDF) produced by Daheng Optics (Part Number: GCO-0703M), with an optical density range of 0 ∼ 2.0, is placed after the laser to attenuate the light power to around 50 mW. The output beam propagates through a half-wave plate (HWP1) and is then split into two polarized beams by a polarizing beam splitter (PBS). The two beams are denoted the illumination beam (I) and the reference beam (R); their polarization directions are perpendicular to the xoz plane and the yoz plane, respectively. In the illumination path, I is first expanded by the beam expander (BE1), then enters a cube beam splitter (BS1) and is split into two beams (I1, I2). These two beams are reflected by mirrors M2 and M3 and illuminate the object at similar tilt angles; note that I1 propagates in the xoz plane with horizontal polarization, while I2 propagates in the yoz plane with vertical polarization. After I1 and I2 pass through the object, the microscope objective (MO, Olympus UPLFLN, 20X, NA=0.50), and the tube lens (TL), the two object waves (O1, O2) are generated with orthogonal polarizations, as shown in Fig. 1(b1). O1 and O2 are then reflected by the cube beam splitter (BS2) and reach the camera plane. In the reference path, R is expanded by the beam expander (BE2), reflected by the mirror (M1), passes through BS2, and reaches the camera plane. The half-wave plate HWP1 in front of the PBS adjusts the intensity ratio among O1, O2, and R. Another half-wave plate (HWP2) in the reference arm sets the polarization direction of R at 45° with respect to the U and V axes, as shown in Fig. 1(b1). Finally, the interference among O1, O2, and R gives rise to a multiplexed hologram that is recorded by the camera (1024×1024 pixels, 5.5 µm pixel pitch, PointGrey, Canada). An example hologram, with a polystyrene microsphere as the sample, is given in Fig. 1(b2). As O1 and O2 have orthogonal polarizations, they do not interfere (O1*O2=O1O2*=0), and the hologram intensity contains R*O1, RO1*, R*O2, and RO2*. There are therefore two fringe patterns with different orientations in the hologram, as shown in the zoom of Fig. 1(b2). The frequency spectrum of the hologram is presented in Fig. 1(b3), in which the four terms correspond to the real and virtual images of O1 and O2, respectively. A numerical sketch of how the two object waves can be demultiplexed from such a hologram is given below.
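
To make the demultiplexing step concrete, the following is a minimal illustrative sketch (not the authors' code) of extracting one object wave from the recorded hologram by Fourier-domain filtering: a binary mask selects one +1 order, and re-centering that order removes the off-axis carrier tilt. The mask center and radius are assumptions that must be read off the spectrum of Fig. 1(b3).

```python
import numpy as np

def extract_object_wave(hologram, center, radius):
    """Isolate one +1-order term with a binary mask in the Fourier domain,
    then shift it to the spectrum center to remove the off-axis tilt."""
    ny, nx = hologram.shape
    H = np.fft.fftshift(np.fft.fft2(hologram))
    v, u = np.ogrid[:ny, :nx]
    mask = (v - center[0]) ** 2 + (u - center[1]) ** 2 <= radius ** 2
    # Re-centering the selected order is equivalent to subtracting the carrier
    # fringe introduced by the off-axis reference wave
    H = np.roll(H * mask, (ny // 2 - center[0], nx // 2 - center[1]), axis=(0, 1))
    return np.fft.ifft2(np.fft.ifftshift(H))  # complex object wave at the camera

# Hypothetical usage: (cy1, cx1), (cy2, cx2), r are read from Fig. 1(b3)
# O1 = extract_object_wave(hologram, (cy1, cx1), r)
# O2 = extract_object_wave(hologram, (cy2, cx2), r)
```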

Fig. 1. DH with orthogonal-polarization two-beam off-axis illumination. (a) Schematic of the digital holographic microscopy system with two off-axis illumination beams having different polarizations: NDF, neutral density filter; HWP, half-wave plate; PBS, polarizing beam splitter; BE, beam expander; BS, beam splitter; M, mirror; MO, microscope objective; TL, tube lens; R, reference beam; I, illumination beam; O, object beam. (b1) Polarization directions of the two object waves and the reference wave. (b2) A single multiplexing hologram with a dimension of 1024 × 1024 pixels. (b3) The frequency spectrum of (b2). The frequency spectra of the object waves O1 and O2 are indicated by the white polygons and the red polygons, respectively.

Our setup adopts an afocal configuration for microscopic imaging, as shown in Fig. 2. Here, the back focal plane of the MO coincides with the front focal plane of the tube lens TL, and the camera is located at the back focal plane of the TL. We consider a reference coordinate system OXYZ describing the object space, where the plane XOY lies on the front focal plane of the MO and the origin O is at the focus of the MO. The coordinate system O1UVW represents the image space, where the plane UO1V lies on the camera plane and the origin O1 is at the center of the camera. Suppose there is a point object in the object space with coordinates p(Xp, Yp, Zp), and I1 and I2 illuminate the object at angles α and β in the XOZ and YOZ planes, respectively. Then O1 and O2 are produced and projected onto the hologram plane at angles α1 and β1 in the UO1W and VO1W planes. We denote the in-focus image of the point object as P(UP, VP, WP) and its vertical projection onto the hologram plane as P´(UP, VP, 0). Moreover, the images formed by O1 and O2 are denoted P1(UP1, VP1, 0) and P2(UP2, VP2, 0), respectively.

Fig. 2. The imaging process of the proposed setup. (a) The sketch of the 4f microscopic imaging configuration. The back focal plane of the MO coincides with the front focal plane of the TL, and the camera is located at the back focal plane of the TL. (b–d) The three possible situations of the position of the in-focus image of the object. The in-focus image of the object is in the hologram plane (b), behind the hologram plane (c), and before the hologram plane (d).

As shown in Figs. 2(b)–2(d), the position of P relative to the hologram plane has three possible situations: WP = 0 (P is in the hologram plane), WP > 0 (P is behind the hologram plane), and WP < 0 (P is before the hologram plane). In the latter two situations, P, P1, and P2 are separated. From Figs. 2(c) and 2(d), the plane PP1P´ is perpendicular to the UO1V plane and simultaneously parallel to the UO1W plane, meaning that the longitudinal position of P is encoded in the lateral positions of P1 and P2. The length of the line segment P1P2 can be expressed as

$${d_{\textrm{P1P2}}} = \sqrt {{{({{U_{\textrm{P1}}} - {U_{\textrm{P2}}}} )}^2} + {{({{V_{\textrm{P1}}} - {V_{\textrm{P2}}}} )}^2}}$$

Since ΔPP1P´ and ΔPP2P´ are right triangles, the lengths of P1P´ and P2P´ can be expressed through the longitudinal position of P as follows

$$\left\{ {\begin{array}{c} {|{{U_{\textrm{P1}}} - {U_{\textrm{P2}}}} |= |{{W_\textrm{P}}} |\times \tan {\alpha _1}}\\ {|{{V_{\textrm{P1}}} - {V_{\textrm{P2}}}} |= |{{W_\textrm{P}}} |\times \tan {\beta _1}} \end{array}} \right.$$
where |·| denotes the absolute value. Therefore, combining Eqs. (1) and (2), the distance between P and the hologram plane can be written as
$$|{{W_\textrm{P}}} |= \frac{{{d_{\textrm{P1P2}}}}}{{\sqrt {{{({\tan {\alpha _1}} )}^2} + {{({\tan {\beta _1}} )}^2}} }}$$
As shown in Figs. 2(c) and 2(d), UP1 > UP2 when WP > 0, and UP1 < UP2 when WP < 0; that is, WP has the same sign as (UP1 − UP2). Therefore, the longitudinal position of P can be written as
$$\begin{aligned} {W_\textrm{P}} &= \frac{{{\mathop{\rm sgn}} ({U_{\textrm{P1}}} - {U_{\textrm{P2}}})}}{{\sqrt {{{({\tan {\alpha _1}} )}^2} + {{({\tan {\beta _1}} )}^2}} }} \times {d_{\textrm{P1P2}}}\\ &= \frac{{M \times {\mathop{\rm sgn}} ({U_{\textrm{P1}}} - {U_{\textrm{P2}}})}}{{\sqrt {{{({\tan \alpha } )}^2} + {{({\tan \beta } )}^2}} }} \times {d_{\textrm{P1P2}}} = K \times {d_{\textrm{P1P2}}} \end{aligned}$$
where K = M × sgn(UP1 − UP2)/[(tanα)^2 + (tanβ)^2]^(1/2) and sgn(·) denotes the sign function. In the first case (WP = 0), P, P1, and P2 overlap on the hologram plane; this is simply a special case of Figs. 2(c) and 2(d), and the longitudinal position of P is still given by Eq. (4).

Based on Eq. (4), the axial position WP of the in-focus image of the object is rapidly recovered by identifying the separation distance of the two defocused images of the object, allowing fast autofocusing. Once the in-focus image of the object is reconstructed from the hologram, its lateral positions UP and VP can be retrieved using an image segmentation algorithm. The magnification of the setup, determined by the focal lengths of the MO and the TL, is M = fTL/fMO ≈ 20.10. According to the laws of lens imaging:

$$\left\{ \begin{array}{c} X = U/M\\ Y = V/M\\ Z = W/{M^2} \end{array} \right.$$
Then the position of the object can be obtained as:
$$\left\{ \begin{array}{c} {X_\textrm{P}} = {U_\textrm{P}}/M\\ {Y_\textrm{P}} = {V_\textrm{P}}/M\\ {Z_\textrm{P}} = {W_\textrm{P}}/{M^2} \end{array} \right.$$
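
As a worked sketch of Eqs. (1), (4), and (6), the conversion from twin-image coordinates to the 3D object position takes only a few lines. The object-space angles alpha and beta (in radians) and the magnification M are assumed known here for illustration; in practice the calibrated slope K described below replaces the analytic expression.

```python
import numpy as np

def in_focus_image_position(U1, V1, U2, V2, alpha, beta, M):
    """Eq. (4): signed axial position W_P of the in-focus image from the
    twin-image coordinates P1=(U1, V1) and P2=(U2, V2)."""
    d = np.hypot(U1 - U2, V1 - V2)                                # Eq. (1)
    K = M * np.sign(U1 - U2) / np.hypot(np.tan(alpha), np.tan(beta))
    return K * d

def image_to_object(U, V, W, M):
    """Eq. (6): lateral coordinates scale by 1/M, the axial one by 1/M**2."""
    return U / M, V / M, W / M ** 2
```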

As described above, the focus distance of the object can be determined rapidly by identifying the separation distance of the two defocused images in the hologram plane, allowing fast autofocusing, while the lateral positions of the in-focus image are retrieved by image segmentation. In practice, the two defocused images are obtained by numerically reconstructing the amplitude or phase distributions. The separation distance dP1P2 of the two images is then determined by cross-correlation analysis and, based on Eq. (4), the axial position WP is recovered immediately. Next, UP and VP are retrieved by calculating the centroid coordinates of the in-focus image reconstructed from the hologram with an image segmentation algorithm. Finally, applying Eq. (6) yields the position of the object, enabling fast 3D tracking.

The vital step of the proposed scheme is to determine the linear relation expressed in Eq. (4). Here, we estimate this relation from a single hologram in which the focus distance of one of the objects in the FOV is known; such a hologram can be captured by moving the object to the focal plane of the microscope objective. The hologram is first reconstructed at a series of distances. Then, the separation distance dP1P2 of the two defocused images at every reconstruction distance is calculated by correlation analysis, and K is obtained by fitting the linear relation between the separation distances dP1P2 and the reconstruction distances WP. Once the illumination angles and the magnification of the setup are fixed, this calibrated value of K can be used for all experiments. Subsequently, accurate autofocusing and dynamic tracking can be implemented without any knowledge of the illumination angles and free from the accuracy limit imposed by the poor axial resolution.

According to Eq. (4), the minimum detectable defocus and the measurable range are set by the illumination angles: large angles give a small minimum but a short range, while small angles give a large minimum but a wide range. Moreover, large illumination angles risk degrading the imaging quality because more low-frequency information is lost, although they also increase the separation of the interference terms in the spatial frequency domain. There is therefore a compromise in setting the illumination angles to balance sensitivity, range, imaging quality, and the separation of the interference terms. Since the two illumination beams travel through the same path, they share the same optimal angle. A minimal numerical sketch of this pipeline is given below.
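
The sketch below (illustrative Python, not the authors' MATLAB implementation) shows the two numerical building blocks of the pipeline: angular-spectrum propagation for reconstructing the defocused images, and an FFT-based cross-correlation for measuring their separation. The sign convention of the measured shift must be matched to the geometry of Fig. 2; the trailing comments indicate how a calibrated K turns one measured separation into the focus distance.

```python
import numpy as np

def angular_spectrum(field, wavelength, pixel, z):
    """Propagate a complex field over distance z by the angular spectrum
    method (cf. [9]); evanescent components are suppressed."""
    ny, nx = field.shape
    fy = np.fft.fftfreq(ny, d=pixel)[:, None]
    fx = np.fft.fftfreq(nx, d=pixel)[None, :]
    kz2 = (1.0 / wavelength) ** 2 - fx ** 2 - fy ** 2
    kernel = np.exp(2j * np.pi * z * np.sqrt(np.maximum(kz2, 0.0)))
    return np.fft.ifft2(np.fft.fft2(field) * kernel)

def image_shift(a, b):
    """Integer-pixel displacement between images a and b from the peak of
    their FFT-based cross-correlation (the coarse step of Ref. [41])."""
    xc = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b)))
    peak = np.unravel_index(np.argmax(np.abs(xc)), xc.shape)
    # Map wrapped peak indices to signed shifts in [-n/2, n/2)
    return tuple((p + n // 2) % n - n // 2 for p, n in zip(peak, xc.shape))

# With a calibrated slope K (Section 3), a single hologram suffices:
# dy, dx = image_shift(np.abs(A1), np.abs(A2))   # A1, A2: hologram-plane images
# d = np.sign(dx) * np.hypot(dx, dy) * pixel     # signed separation, Eq. (1)
# W = K * d                                      # focus distance, Eq. (4)
```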

3. Experiments

First, we measure the relationship between the focus distance and the separation distance of the two defocused images by recording a single in-focus hologram of a 1951 USAF resolution target with the experimental setup shown in Fig. 1. The detailed process is sketched in Fig. 3. First, the object waves O1 and O2 are extracted from the multiplexing hologram using the frequency filtering method, as shown in Fig. 3(a). After removing the tilt aberration due to the off-axis recording, the two object waves are numerically propagated over a series of distances, yielding a sequence of amplitude images reconstructed at different defocus distances, as shown in Fig. 3(b). The twin images separate when defocus occurs and superimpose when the images are in focus. The separation distances of the two defocused images are then determined by cross-correlation analysis, as shown in the third row of Fig. 3(b). The cross-correlation analysis is implemented in two steps. The first step computes the correlation coefficient image from the cross-correlation of the two defocused images by means of a fast Fourier transform [41]. In the second step, the position of the peak of the correlation coefficient image is located; the shift of this position with respect to the center of the FOV defines the separation distance of the two defocused images. Finally, the slope K of Eq. (4) is obtained by linear fitting of the separation-distance/focus-distance pairs, as shown in Fig. 3(c). These discrete points closely follow the fitted line of slope K. A calibration sketch following these steps is given below.
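
The following is a minimal calibration sketch, reusing the angular_spectrum() and image_shift() helpers defined in the sketch of Section 2; it is an illustration under the stated assumptions, not the authors' code. O1 and O2 are the demultiplexed object waves of the in-focus hologram, 'pixel' is the camera pitch, and 'distances' is the scan of reconstruction distances.

```python
import numpy as np

def calibrate_K(O1, O2, distances, wavelength, pixel):
    """Fit the slope K of Eq. (4) from one hologram with known focus."""
    separations = []
    for z in distances:
        a1 = np.abs(angular_spectrum(O1, wavelength, pixel, z))
        a2 = np.abs(angular_spectrum(O2, wavelength, pixel, z))
        dy, dx = image_shift(a1, a2)
        separations.append(np.sign(dx) * np.hypot(dx, dy) * pixel)
    # Linear fit of focus distance versus signed separation, as in Fig. 3(c)
    K = np.polyfit(separations, distances, 1)[0]
    return K
```

For subpixel accuracy, the coarse image_shift() step can be replaced by skimage.registration.phase_cross_correlation(a1, a2, upsample_factor=100), which implements the subpixel registration algorithm of Ref. [41].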

Fig. 3. The measured relationship between the focus distance of the object and the separation distance of its two defocused images, obtained from a single hologram. (a) Extracting the two object waves from the multiplexing hologram. Two binary masks separate the two object waves by the frequency filtering method, and the tilt aberration introduced by the off-axis reference wave is removed by shifting the numerical aperture of the system to the center of the frequency spectrum image. (b) Finding the pairs of separation distance and defocus distance by numerical propagation and cross-correlation analysis. (c) Linear relationship between the focus distance of the object and the separation distance of the two defocused images.

To quantify the localization accuracy of the proposed method, we moved the 1951 USAF resolution target along the optical axis using a stepping motor with a positioning accuracy of about 1 µm. The resolution target was moved over the range of −100 µm ∼ 100 µm at a step of 5 µm, repeated 7 times. The axial positions of the resolution target were then recovered from the holograms by the proposed method and compared with those provided by the stepping motor. The localization error is defined by subtracting the axial position estimated by the proposed method from the axial position provided by the stepping motor. The results are shown in Fig. 4. As can be seen, most of the localization errors (253/287 ≈ 88.15%) fall within the range of −1.5 µm ∼ 1.5 µm, well below the depth of field of the system, 2λ/NA² ≈ 4.25 µm. The results demonstrate that the localization accuracy of the proposed method goes beyond the limit set by the poor axial resolution.

Fig. 4. Localization error distributions for moving a 1951 USAF resolution target along the optical axis over the range of −100 µm ∼ 100 µm at a step of 5 µm, repeated 7 times. The localization error is defined by subtracting the axial position estimated by the proposed method from the axial position provided by the stepping motor.

Finally, the proposed method is verified by tracking the sedimentation of polystyrene microspheres with a diameter of 20 µm. The microspheres are placed in a 35 mm confocal dish and diluted with water, and a pipette is used to stir the solution so that the microspheres first float up and then sediment. During sedimentation, a series of holograms is captured continuously; three holograms recorded at 0, 6.6, and 9.0 seconds are shown in Figs. 5(a1), 5(b1), and 5(c1). Figures 5(a2), 5(a3), 5(b2), 5(b3), 5(c2), and 5(c3) present the corresponding amplitude images of O1 and O2 reconstructed in the hologram plane. These images are defocused, and the separation distance of the twin images changes during sedimentation. A sub-region containing the tracked microsphere is first cropped from the defocused amplitude image of O1. The distance d is then determined by cross-correlation between the cropped sub-region and the defocused amplitude image of O2, as shown in Figs. 5(a4), 5(b4), and 5(c4). Because the two images have unequal sizes, the fast normalized cross-correlation method [42] is used to obtain the correlation coefficient image instead of the fast-Fourier-transform approach, at the cost of more computation time. In this case, the peak of the correlation coefficient image indicates the position of the sub-region of the defocused amplitude image of O2 that matches the sub-region cropped from the defocused amplitude image of O1, so the shift of this position with respect to the position of the cropped sub-region defines the separation distance. The focus distances W are then obtained from the linear relationship shown in Fig. 3(c). Using MATLAB R2019b on a 2.40 GHz laptop, refocusing the tracked microsphere took 0.725 s, with the cropped sub-region and the defocused amplitude image of O2 sized 256×256 and 1024×1024 pixels, respectively. Further, the in-focus amplitude images A and phase images φ of O1 and O2 are reconstructed by numerical diffraction propagation over the focus distance, as shown in Figs. 5(a5)–5(a8), 5(b5)–5(b8), and 5(c5)–5(c8). The lateral position of the microsphere is then retrieved from these images by image segmentation, and the axial position is obtained from the focus distance and the magnification of the imaging system. As can be seen in Figs. 5(b4) and 5(c4), there are two peaks in the correlation coefficient image, indicating that two microspheres in the defocused amplitude image of O2 show high similarity to the tracked microsphere. In general, multiple peaks may appear in the correlation coefficient image when the sample is highly homogeneous, such as microspheres, and several such samples lie in the FOV; in this case, the location of the maximum peak should be detected to calculate the distance d. However, multiple such samples with the same defocus distance would produce several maximum peaks of comparable height, making it hard to determine the distance d. A template-matching sketch is given below.
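
The following sketch uses OpenCV's matchTemplate as one readily available implementation of normalized cross-correlation; it stands in for the fast NCC of Ref. [42] and is not the authors' code.

```python
import cv2
import numpy as np

def match_subregion(image, template):
    """Locate a cropped sub-region (template, e.g., 256x256 from the image of
    O1) inside the full defocused amplitude image of O2 (e.g., 1024x1024) by
    normalized cross-correlation (cf. [42])."""
    ncc = cv2.matchTemplate(image.astype(np.float32),
                            template.astype(np.float32),
                            cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(ncc)   # (x, y) of the best match
    return max_loc, ncc

# The separation vector is the best-match position minus the position at
# which the sub-region was originally cropped from the image of O1.
```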

Fig. 5. 3D tracking of multiple sedimenting microspheres. (a1, b1, and c1) The holograms recorded at 0, 6.6, and 9.0 seconds. (a2, a3), (b2, b3) and (c2, c3) The amplitude images of O1 and O2 reconstructed from (a1), (b1) and (c1) in the hologram plane. (a4), (b4), and (c4) The correlation coefficient images calculated from (a3) and the cropped sub-region of (a2), (b3) and the cropped sub-region of (b2), (c3) and the cropped sub-region of (c2). (a5∼a8), (b5∼b8) and (c5∼c8) The reconstructed in-focus amplitude images and phase images of O1 and O2 from (a1), (b1) and (c1).

Further, the proposed method is applied to track free-swimming Tetraselmis subcordiformis. The cells swim freely in a 35 mm confocal dish, and we record a sequence of multiplexing holograms of them. In the FOV, some cells remain static while others move, presenting various states. Three moving Tetraselmis subcordiformis cells are tracked, and their 3D paths are retrieved using the proposed method. The multiplexing holograms of the three tracked cells are shown in Figs. 6(a), 6(d), and 6(g), where the two defocused images of the cells are indicated by white dashed rectangles. The corresponding in-focus amplitude and phase images are shown in Figs. 6(b), 6(e), and 6(h), respectively. Finding the focus distance for a tracked alga cell takes 0.371 s, with the cropped sub-region and the defocused amplitude image of O2 sized 128×128 and 1024×1024 pixels, respectively. The movement trajectories of the three cells are shown in Figs. 6(c), 6(f), and 6(i). As can be seen, the proposed method estimates the positions of the Tetraselmis subcordiformis cells accurately.

Fig. 6. 3D tracking of Tetraselmis subcordiformis. (a, d, and g) The holograms recorded at 0, 2, and 5 minutes. (b), (e) and (h) The in-focus amplitude images and phase images of Cell A, Cell B and Cell C obtained from (a), (d) and (g). (c), (f) and (i) The 3D paths of Cell A, Cell B and Cell C.

4. Dynamic cell morphology analysis under micro-vibration excitation

In order to verify the intrinsic advantage of the proposed method for dynamic analysis with instant focusing, we monitored the morphologic changes of living ovarian cancer cells under micro-vibration excitation by adding a micro-vibration excitation module (shown in Fig. 7) to the experimental configuration of Fig. 1. As depicted in Fig. 7, the ovarian cancer cells are cultured in a 35 mm confocal dish, which is installed on a piezo through a homemade metal plate. The piezo is connected to a piezo controller, which is in turn linked to a signal generator. The piezo moves the confocal dish up and down along the optical axis to produce the micro-vibration excitation, while the signal generator controls the amplitude and frequency of the micro-vibration.

Fig. 7. Micro-vibration excitation module. The ovarian cancer cells cultured in a 35 mm confocal dish are installed on a piezo through a homemade metal plate. The piezo moves the confocal dish up and down along the optical axis to produce a micro-vibration excitation under the control of the piezo controller and the signal generator.

In this measurement, holograms are recorded for 10 minutes at 88 fps. The signal generator outputs a sinusoidal signal with an amplitude of 10 V and a frequency of 5 Hz; this signal is amplified by the piezo controller and drives the piezo up and down over a range of −5 ∼ 5 µm. The ovarian cancer cells are therefore exposed to a micro-vibration with a peak-to-peak amplitude of 10 µm and a frequency of 5 Hz. In this way, the cells move along a linear path that can be tracked instantly by the proposed method, helping to find the image plane rapidly. The defocused phase distributions of O1 and O2 in the hologram plane are shown in Figs. 8(b) and 8(c), respectively. Figures 8(a) and 8(d) are the regions indicated by the red squares in Figs. 8(b) and 8(c), showing slight blurring. The focus distance is obtained using the proposed method, and the in-focus phase distributions of O1 and O2 in the image plane are presented in Figs. 8(f) and 8(g). Figures 8(e) and 8(h) are the regions of Figs. 8(f) and 8(g) indicated by the yellow squares. Compared with Figs. 8(a) and 8(d), the cell edges in Figs. 8(e) and 8(h) are enhanced, proving the effectiveness of the proposed method. Figure 8(i) shows the longitudinal positions of the ovarian cancer cells during vibration, in close agreement with the generator signal; this agreement can be quantified with the fit sketched below. Refocusing the ovarian cells takes 0.317 s, with the defocused amplitude images of O1 and O2 both sized 1024×1024 pixels.
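
One way to quantify the agreement between the recovered axial trace and the 5 Hz drive is a least-squares sinusoid fit at the known frequency. The sketch below is an illustration under the assumption of uniformly sampled, outlier-free positions; the 88 fps frame rate comfortably satisfies the Nyquist criterion for a 5 Hz vibration.

```python
import numpy as np

def fit_drive(t, w, freq=5.0):
    """Least-squares fit of the recovered axial trace W(t) to a sinusoid at
    the known drive frequency; returns amplitude and phase for comparison
    with the signal generator."""
    X = np.column_stack([np.sin(2 * np.pi * freq * t),
                         np.cos(2 * np.pi * freq * t),
                         np.ones_like(t)])          # sine, cosine, offset
    a, b, _ = np.linalg.lstsq(X, w, rcond=None)[0]
    return np.hypot(a, b), np.arctan2(b, a)         # amplitude, phase (rad)
```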

Fig. 8. Quantitative phase imaging of ovarian cancer cells under micro-vibration excitation. (b and c) The defocused phase distributions of O1 and O2 in the hologram plane. (a and d) The regions indicated by the red squares in (b and c). (f and g) The in-focus phase distributions of O1 and O2. (e and h) The regions indicated by the yellow squares in (f and g). (i) The longitudinal positions of the ovarian cancer cells during the vibration process.

Figure 9 shows the quantitative phase images of ovarian cancer cells before and after vibration excitation. The phase image before vibration (0 min) is shown in Fig. 9(c); Figs. 9(a) and 9(e) are zooms of cell A and cell B indicated by the squares in Fig. 9(c). The image after vibrating for 10 min is shown in Fig. 9(d), with Figs. 9(b) and 9(f) the zooms of cell A and cell B indicated by the squares in Fig. 9(d). Changes in the morphology of the ovarian cancer cells are evident when comparing Figs. 9(a) and 9(e) with Figs. 9(b) and 9(f). The change can also be quantified by calculating the phase volumes of the cells. Here, the phase volume of an ovarian cancer cell is obtained by averaging the phase volumes calculated from the phase maps of O1 and O2 to alleviate the influence of noise. The phase volumes of cells A and B are 1211089.15 rad·µm² and 875252.29 rad·µm² at 0 min, changing to 1269766.44 rad·µm² and 882781.52 rad·µm² at 10 min. A sketch of the phase-volume computation is given below.
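
A minimal sketch of the phase-volume computation, under the assumed formulation that the phase map is unwrapped and background-subtracted and that each camera pixel covers (pixel pitch/M)² in the sample plane; the exact cell masking used by the authors is not specified here.

```python
import numpy as np

def phase_volume(phase_map, pixel, M):
    """Integrate a phase map (rad) over the object-space area it covers;
    the result is in rad*um^2 when 'pixel' is the camera pitch in um."""
    return float(phase_map.sum()) * (pixel / M) ** 2

# As in the text, the reported volume averages the O1 and O2 estimates
# (pixel pitch 5.5 um and M = 20.10 are taken from the setup description):
# V = 0.5 * (phase_volume(phi1, 5.5, 20.10) + phase_volume(phi2, 5.5, 20.10))
```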

Fig. 9. Qualitative description of the morphology change of ovarian cancer cells introduced by micro-vibration excitation. (c) The phase image before vibration at 0 min. (a and e) The zooms of cell A and cell B indicated by squares in (c). (d) The phase image after vibrating for 10 min. (b and f) The zooms of cell A and cell B indicated by squares in (d).

5. Conclusion

We show that a polarization multiplexing DH microscope intrinsically allows fast autofocusing and dynamic tracking. By illuminating the object with two plane beams polarized along the horizontal and vertical directions, we demonstrated rapid and accurate autofocusing and dynamic tracking, as can occur in any polarization-sensitive DH system. Using the polarization multiplexing technique, the two defocused images of the object are recorded simultaneously in a single multiplexed hologram with one laser source and without spectral aliasing. Owing to the two off-axis plane-wave illuminations, the focus distance of the object and the separation distance of its two defocused images follow a fixed linear relationship; the focus distance can therefore be inferred instantly and accurately by identifying the separation distance of the two defocused images. Moreover, thanks to the two orthogonal plane illuminations, the sign of the defocus distance can be recognized from the transverse coordinates of the two defocused images. We adopted the proposed approach to quantitatively assess the morphological changes of ovarian cancer cells under micro-vibration stimulation, to track the sedimentation of microspheres, and to trace swimming Tetraselmis. The experimental results validate the rapidness and effectiveness of the proposed approach and demonstrate that it accurately recovers the defocus distance and 3D coordinates of the object without sequential hologram reconstruction, model training, or precise alignment. This makes the DH microscope a reliable tool for monitoring the dynamic processes of cells.

Funding

National Natural Science Foundation of China (61775010); Natural Science Foundation of Beijing Municipality (7192104).

Acknowledgments

The authors thank the Freshwater Algae Culture Collection at the Institute of Hydrobiology for providing the Tetraselmis specimens. The authors also thank Dr. Xiaoping Li from the Department of Obstetrics and Gynecology, Peking University People's Hospital, for the great help in the preparation of the ovarian cancer cells.

Disclosures

The authors declare that there are no conflicts of interest related to this article.

References

1. T. Colomb, F. Dürr, E. Cuche, P. Marquet, H. G. Limberger, R. P. Salathé, and C. Depeursinge, “Polarization microscopy by use of digital holography: Application to optical-fiber birefringence measurements,” Appl. Opt. 44(21), 4461–4469 (2005). [CrossRef]  

2. M. A. Ferrara, A. De Angelis, G. Coppola, A. C. De Luca, and G. Coppola, “Polarization digital holography for birefringence characterization of sperm cells,” in IET Conference Publications (2018), 2018 (CP748).

3. J. Wang, L. Dong, H. Chen, and S. Huang, “Birefringence measurement of biological tissue based on polarization-sensitive digital holographic microscopy,” Appl. Phys. B: Lasers Opt. 124(12), 240 (2018). [CrossRef]  

4. M. M. Sreelal, R. V. Vinu, and R. K. Singh, “Jones matrix microscopy from a single-shot intensity measurement,” Opt. Lett. 42(24), 5194 (2017). [CrossRef]  

5. Y. Kim, J. Jeong, J. Jang, M. W. Kim, and Y. Park, “Polarization holographic microscopy for extracting spatio-temporally resolved Jones matrix,” Opt. Express 20(9), 9948 (2012). [CrossRef]  

6. T. D. Yang, K. Park, Y. G. Kang, K. J. Lee, B.-M. Kim, and Y. Choi, “Single-shot digital holographic microscopy for quantifying a spatially-resolved Jones matrix of biological specimens,” Opt. Express 24(25), 29302 (2016). [CrossRef]  

7. T. Kobata and T. Nomura, “Digital holographic three-dimensional Mueller matrix imaging,” Appl. Opt. 54(17), 5591 (2015). [CrossRef]  

8. T. Nomura and B. Javidi, “Object recognition by use of polarimetric phase-shifting digital holography,” Opt. Lett. 32(15), 2146 (2007). [CrossRef]  

9. L. Yu and M. K. Kim, “Wavelength-scanning digital interference holography for tomographic three-dimensional imaging by use of the angular spectrum method,” Opt. Lett. 30(16), 2092–2094 (2005). [CrossRef]  

10. P. Langehanenberg, B. Kemper, and G. von Bally, “Autofocus algorithms for digital-holographic microscopy,” in European Conference on Biomedical Optics (2007), p. 6633.

11. P. Memmolo, L. Miccio, M. Paturzo, G. Di Caprio, G. Coppola, P. A. Netti, and P. Ferraro, “Recent advances in holographic 3D particle tracking,” Adv. Opt. Photonics 7(4), 713–755 (2015). [CrossRef]  

12. X. Yu, J. Hong, C. Liu, M. Cross, D. T. Haynie, and M. K. Kim, “Four-dimensional motility tracking of biological cells by digital holographic microscopy,” J. Biomed. Opt. 19(4), 045001 (2014). [CrossRef]  

13. M. DaneshPanah and B. Javidi, “Tracking biological microorganisms in sequence of 3D holographic microscopy images,” Opt. Express 15(17), 10761–10766 (2007). [CrossRef]  

14. I. Moon, M. Daneshpanah, B. Javidi, and A. Stern, “Automated three-dimensional identification and tracking of micro/nanobiological organisms by computational holographic microscopy,” Proc. IEEE 97(6), 990–1010 (2009). [CrossRef]  

15. J. Gillespie and R. A. King, “The use of self-entropy as a focus measure in digital holography,” Pattern Recognit. Lett. 9(1), 19–25 (1989). [CrossRef]  

16. F. Dubois, C. Schockaert, N. Callens, and C. Yourassowsky, “Focus plane detection criteria in digital holography microscopy by amplitude analysis,” Opt. Express 14(13), 5895–5908 (2006). [CrossRef]  

17. M. Antkowiak, N. Callens, C. Yourassowsky, and F. Dubois, “Extended focused imaging of a microparticle field with digital holographic microscopy,” Opt. Lett. 33(14), 1626–1628 (2008). [CrossRef]  

18. L. Ma, H. Wang, Y. Li, and H. Jin, “Numerical reconstruction of digital holograms for three-dimensional shape measurement,” J. Opt. A: Pure Appl. Opt. 6(4), 396–400 (2004). [CrossRef]  

19. S. Lee, J. Y. Lee, W. Yang, and D. Y. Kim, “Autofocusing and edge detection schemes in cell volume measurements with quantitative phase microscopy,” Opt. Express 17(8), 6476–6486 (2009). [CrossRef]  

20. T. Yatagai and M. L. Tachiki, “Simultaneous depth determination of multiple objects by focus analysis in digital holography,” in Optics InfoBase Conference Papers (2008).

21. M. Lyu, C. Yuan, D. Li, and G. Situ, “Fast autofocusing in digital holography using the magnitude differential,” Appl. Opt. 56(13), F152–F157 (2017). [CrossRef]  

22. P. Langehanenberg, G. von Bally, and B. Kemper, “Autofocusing in digital holographic microscopy,” 3D Res 2(1), 4 (2011). [CrossRef]  

23. P. Memmolo, C. Distante, M. Paturzo, and A. Finizio, “Automatic focusing in digital holography and its application to stretched holograms,” Opt. Lett. 36(10), 1945–1947 (2011). [CrossRef]  

24. P. Memmolo, M. Paturzo, B. Javidi, P. A. Netti, and P. Ferraro, “Refocusing criterion via sparsity measurements in digital holography,” Opt. Lett. 39(16), 4719 (2014). [CrossRef]  

25. Y. Zhang, H. Wang, Y. Wu, M. Tamamitsu, and A. Ozcan, “Edge sparsity criterion for robust holographic autofocusing,” Opt. Lett. 42(19), 3824–3827 (2017). [CrossRef]  

26. W. Yingchun, W. Xuecheng, Y. Jing, W. Zhihua, G. Xiang, Z. Binwu, C. Linghong, Q. Kunzan, G. Gréhan, and C. Kefa, “Wavelet-based depth-of-field extension, accurate autofocusing, and particle pairing for digital inline particle holography,” Appl. Opt. 53(4), 556–564 (2014). [CrossRef]  

27. M. Liebling and M. Unser, “Autofocus for digital Fresnel holograms by use of a Fresnelet-sparsity criterion,” J. Opt. Soc. Am. A 21(12), 2424–2430 (2004). [CrossRef]  

28. X. Fan, J. J. Healy, Y. Guanshen, and B. M. Hennelly, “Sparsity metrics for autofocus in digital holographic microscopy,” in Optics, Photonics and Digital Technologies for Imaging Applications IV (2016), 9896, p. 989619.

29. T. Pitkäaho, A. Manninen, and T. J. Naughton, “Focus classification in digital holographic microscopy using deep convolutional neural networks,” in Optics InfoBase Conference Papers (2017), Part F61-E.

30. Z. Ren, Z. Xu, and E. Y. Lam, “Autofocusing in digital holography using deep learning,” in Three-Dimensional and Multidimensional Microscopy: Image Acquisition and Processing XXV (2018), 10499, p. 104991–V.

31. Z. Ren, Z. Xu, and E. Y. Lam, “Learning-based nonparametric autofocusing for digital holography,” Optica 5(4), 337 (2018). [CrossRef]  

32. J. Lee, W. Jeong, K. Son, W. Jeon, and H. Yang, “Autofocusing using deep learning in off-axis digital holography,” in Digital Holography and Three-Dimensional Imaging (2018), p. DTh1C–4.

33. T. Shimobaba, T. Kakue, and T. Ito, “Convolutional neural network-based regression for depth prediction in digital holography,” in 2018 IEEE 27th International Symposium on Industrial Electronics (ISIE) (2018), pp. 1323–1326.

34. F. Saglimbeni, S. Bianchi, A. Lepore, and R. Di Leonardo, “Three-axis digital holographic microscopy for high speed volumetric imaging,” Opt. Express 22(11), 13710 (2014). [CrossRef]  

35. J. Öhman, P. Gren, and M. Sjödahl, “Polarization-resolved dual-view holographic system for 3D inspection of scattering particles,” Appl. Opt. 58(34), G31 (2019). [CrossRef]  

36. P. Gao, B. Yao, J. Min, R. Guo, B. Ma, J. Zheng, M. Lei, S. Yan, D. Dan, and T. Ye, “Autofocusing of digital holographic microscopy based on off-axis illuminations,” Opt. Lett. 37(17), 3630–3632 (2012). [CrossRef]  

37. J. Zheng, P. Gao, and X. Shao, “Opposite-view digital holographic microscopy with autofocusing capability,” Sci. Rep. 7(1), 1–9 (2017). [CrossRef]  

38. T.-W. Su, L. Xue, and A. Ozcan, “High-throughput lensfree 3D tracking of human sperms reveals rare statistics of helical trajectories,” Proc. Natl. Acad. Sci. 109(40), 16018–16022 (2012). [CrossRef]  

39. P. Memmolo, A. Finizio, M. Paturzo, L. Miccio, and P. Ferraro, “Twin-beams digital holography for 3D tracking and quantitative phase-contrast microscopy in microfluidics,” Opt. Express 19(25), 25833–25842 (2011). [CrossRef]  

40. J. Burge and W. S. Geisler, “Optimal defocus estimates from individual images for autofocusing a digital camera,” in Digital Photography VIII (2012), 8299, p. 82990E.

41. M. Guizar-Sicairos, S. T. Thurman, and J. R. Fienup, “Efficient subpixel image registration algorithms,” Opt. Lett. 33(2), 156 (2008). [CrossRef]  

42. J.-C. Yoo and T. H. Han, “Fast Normalized Cross-Correlation,” Circuits Syst. Signal Process. 28, 819–843 (2009). [CrossRef]  
