
Lensless phase microscopy and diffraction tomography with multi-angle and multi-wavelength illuminations using a LED matrix


Abstract

We demonstrate lensless quantitative phase microscopy and diffraction tomography based on a compact on-chip platform, using only a CMOS image sensor and a programmable color LED matrix. Based on multi-wavelength phase retrieval and multi-angle-illumination diffraction tomography, this platform offers high-quality, depth-resolved images with a lateral resolution of 3.72μm and an axial resolution of 5μm across a wide field-of-view of 24mm2. We experimentally demonstrate the success of our method by imaging cheek cells, micro-beads, and fertilized eggs of Parascaris equorum. Such a high-throughput, miniaturized imaging device can provide a cost-effective tool for telemedicine applications and point-of-care diagnostics in resource-limited environments.

© 2015 Optical Society of America

1. Introduction

Quantitative phase imaging (QPI) has become an invaluable tool for biomedical research thanks to its unique capability to image living specimens without the need for specific staining or labelling [1]. With the introduction of interferometry and digital holography methods to microscopy, it became possible to measure the phase shift introduced by a specimen quantitatively [2, 3]. Nevertheless, conventional interferometric methods typically rely on highly coherent laser illumination, which plagues them with speckle noise that prevents the formation of high-quality images. Furthermore, the measured phase represents only the projection of the object along the axial direction, making detailed volumetric information inside the sample inaccessible. Finally, the imaging optics and the bulky, expensive, and highly vibration-sensitive interferometric device become the major obstacles to realizing the whole system in a miniaturized and cost-effective format.

During recent years, great efforts have been made by a large number of researchers to overcome the above-mentioned limitations, and the developments can be categorized as follows. (1) Developing new quantitative phase measurement approaches that use partially coherent or even white-light illumination: recently, several common-path interferometric QPI methods have been reported for various biomedical applications, which allow for self-interference under white-light illumination. Such approaches include spatial light interference microscopy (SLIM) [4], white-light diffraction phase microscopy (wDPM) [5], and white-light Fourier phase microscopy (wFPM) [6]. These white-light QPI methods significantly alleviate the coherent noise problem and, meanwhile, their common-path geometries are inherently insensitive to the mechanical vibrations and air fluctuations that typically affect any interferometric system. On a different note, non-interferometric phase retrieval techniques have been extensively explored in recent years, since they also relax the stringent beam coherence requirements of conventional interferometry and holography. Such methods comprise iterative phase retrieval algorithms [7-9] and deterministic approaches based on solving the transport of intensity equation (TIE) [10-13]. Without the need for a separate reference beam, these non-interferometric single-beam approaches require only object-field intensity measurements at different axial planes [9, 11] or at different wavelengths [8, 12, 14] for quantitative phase reconstruction. It has been demonstrated that these non-interferometric phase retrieval methods can be applied to partially coherent fields [9, 15, 16], enabling accurate, high-quality quantitative phase imaging while preventing image degradation due to speckle noise. (2) Combining QPI with tomography techniques to gain additional resolving power in the axial direction: the common idea is to record multiple complex amplitude images diffracted by the object under varying illumination angles (using either specimen rotation [17] or beam scanning [18-20]) or by simply scanning the focus through the sample [21], and then to reconstruct the 3D structure with filtered back-projection/inverse Radon transform adapted from X-ray computed tomography [17, 18], or within the framework of diffraction tomography theory, which explicitly takes the effect of diffraction into account [19-22]. (3) Simplifying and shrinking the whole imaging system based on lensless and/or on-chip architectures [23-25]: such compact, low-cost, robust, portable devices with decent imaging performance may provide a solution for telemedicine, or for reducing health care costs for point-of-care diagnostics in resource-limited environments.

In this paper, we present lensless quantitative phase microscopy and diffraction tomography based on a compact on-chip platform, using only a CMOS image sensor and a programmable color LED matrix. By illuminating the sample at the multiple wavelengths emitted by the color LED sources, the quantitative phase of the specimen can be retrieved through an efficient phase retrieval algorithm that combines the TIE with an iterative phase retrieval approach. With the quantitative phase retrieved, the diffracted object field can be digitally refocused through back propagation without the use of any lenses. Furthermore, by illuminating the sample sequentially with different LEDs across the full array, the complex fields of the same object from different illumination angles can be mapped into 3D Fourier space, and the diffraction tomogram of the sample is then reconstructed using a 3D inverse Fourier transform. We wish to emphasize that the advantages of using a color LED matrix for lensless phase microscopy and tomography are considerable. First, it fully eliminates the complicated and time-consuming mechanical operations that are usually required in conventional TIE phase retrieval and tomographic microscopy. By changing the color of the LED illumination (red, green, and blue), we can capture diffraction patterns at the multiple propagation distances required for TIE phase retrieval without moving the camera or the specimen. Meanwhile, by turning on different LEDs in the matrix sequentially, we can change the relative angle of illumination with respect to the specimen without rotating the sample or using any complex beam scanning devices. Second, compared with coherent laser-based imaging, the partially coherent multi-angle LED illumination at different wavelengths effectively suppresses the speckle noise and distortions caused by parasitic reflections in the optical setup, resulting in high-quality, depth-resolved phase reconstructions. Finally, the simple and lensless optical configuration makes the platform a compact, cost-effective, and portable architecture for diagnostic imaging applications at the point of care.

It should also be emphasized here that we are not the first to apply an LED matrix as the light source for optical microscopy. The recently developed Fourier ptychographic microscopy replaces the optical condenser of a conventional microscope with an LED array [26-28]. The LED array provides angularly varying illumination to bypass the resolution limit defined by the objective lens, through a ptychography-like algorithm that combines synthetic aperture with phase retrieval in the spatial frequency domain. Though both high transverse resolution and a very large field of view (FOV) can be achieved simultaneously, to our knowledge, Fourier ptychographic microscopy still requires normal imaging optics and cannot be applied to a lensless geometry; it therefore has to be built on a traditional microscope equipped with objectives. In this work, by contrast, we present for the first time the use of a color LED matrix to achieve lensless quantitative phase imaging and diffraction tomography, which leads to a compact, standalone platform that achieves high-quality, depth-resolved images with a lateral resolution of 3.72 μm and an axial resolution of 5 μm, across a wide FOV of 24 mm2. The performance of our method is quantified and demonstrated using micro-beads of different sizes, as well as by imaging cheek cells and fertilized eggs of Parascaris equorum.

2. System setup

Figure 1 shows a schematic diagram and a photograph of our microscope. It does not contain any moving parts, and its two essential components are a CMOS sensor (Micron MT9P031, pixel size: 2.2μm, 5M pixels) and a commercially available 8 × 8 color LED matrix. Each LED can provide approximately spatially coherent, quasi-monochromatic illumination with a narrow bandwidth [623nm (red), 522nm (green), and 467nm (blue), 20nm bandwidth, ~150μm size]. In our platform, the 64 LEDs cover a range of illumination angles of about ±45° along both the x and y axes and are controlled by an Arduino micro-controller (MCU). The LED matrix is placed 3.25cm away from the sample plane, while the samples are placed very close to the sensor chip (typically less than 2mm). The small object-to-sensor distance not only minimizes the blurring caused by the limited spatial coherence of the illumination, which is required for high-resolution phase imaging, but also guarantees a more accurate axial intensity derivative estimation, which is crucial to the success of TIE-based phase retrieval, as will be detailed below.


Fig. 1 Lensfree microscopy and tomography platform. (a) Schematic explaining the principle of lensless imaging. Typical values: L = 32.5mm, z = 300μm~1.6mm. (b) Photograph of the microscope. The system consists of a CMOS imaging sensor and an LED matrix controlled by an MCU, where each LED can provide RGB narrow-band illumination. The whole device is powered through the USB connection.


3. Phase retrieval and digital refocusing with multi-wavelength illuminations

In lensless imaging, only the out-of-focus diffraction pattern of an object can be recorded. An image of the object must then be reconstructed numerically; this can be performed by field propagation methods if the phase of the diffraction pattern can be retrieved. In the paraxial regime, wave propagation is mathematically described by the Fresnel diffraction integral

$$U_z(x,y)=\frac{\exp(jkz)}{j\lambda z}\iint U_0(x_0,y_0)\exp\left\{\frac{j\pi}{\lambda z}\left[(x-x_0)^2+(y-y_0)^2\right]\right\}\,\mathrm{d}x_0\,\mathrm{d}y_0, \tag{1}$$
while the intensity propagation obeys the TIE [10], which can be derived from the square modulus of the Fresnel diffraction integral
$$k\frac{\partial I(x,y)}{\partial z}=-\nabla\cdot\left[I(x,y)\nabla\phi(x,y)\right], \tag{2}$$
where k is the wave number 2π/λ and ∇ is the transverse gradient operator. A close inspection of the diffraction integral [Eq. (1)] reveals that the wavelength λ and the propagation distance z always appear as a product (aside from the trivial global phase factor), which means that a change in wavelength has the same effect on field propagation as a change in propagation distance [8, 12, 14]. For example, for a non-dispersive specimen (a valid assumption for most biological samples) at an object-to-sensor distance z, if the illumination wavelength is changed from λ to λ + Δλ, the recorded diffraction pattern corresponds to an effective, wavelength-dependent propagation distance z_eff = z(λ + Δλ)/λ. Compared with conventional multi-depth methods, the multi-wavelength strategy eliminates the need for any active elements or mechanical motion during image capture, thus greatly simplifying the system design and improving the image acquisition speed. Once three diffraction patterns at different illumination wavelengths (red, green, blue) are captured, the effective defocus distances of the red and blue channel images are converted according to z_{r,b} = zλ_{r,b}/λ_g, using the green channel image as the central reference image (z = z_g). Then the longitudinal intensity derivative ∂I/∂z can be estimated through the following (unequally spaced) finite difference formula:
$$\frac{\partial I}{\partial z}=\frac{(\Delta z_b)^2\,(I_r-I_g)-(\Delta z_r)^2\,(I_b-I_g)}{\Delta z_r\,\Delta z_b\,(\Delta z_b-\Delta z_r)}, \tag{3}$$
where Δz_{r,b} = z_{r,b} − z are the effective propagation distances from the red/blue channel images to the green channel image, respectively. The finite difference yields an acceptably accurate estimate of the intensity derivative only if the effective propagation distances between the three channels, Δz_{r,b}, are small, which in turn requires a small separation between the object and the image sensor. With the estimated intensity derivative, combined with the central green channel image I_g, the phase distribution at the image plane can be retrieved by solving the TIE using the fast discrete cosine transform (DCT) under homogeneous Neumann boundary conditions [29]. First, an auxiliary function ψ satisfying ∇ψ = I∇ϕ is introduced to convert the TIE into the following two Poisson equations:
$$-k\frac{\partial I}{\partial z}=\nabla^2\psi, \tag{4}$$
and
$$\nabla\cdot\left(I^{-1}\nabla\psi\right)=\nabla^2\phi. \tag{5}$$

By solving the first Poisson equation [Eq. (4)], we obtain the solution for ψ, from which the phase gradient follows (since I⁻¹∇ψ = ∇ϕ). The second Poisson equation [Eq. (5)] is used for phase integration. By solving the two Poisson equations using the fast DCT [29], the phase ϕ can be uniquely determined apart from an arbitrary additive constant (which is trivial for QPI). In this work, the TIE is solved under simplified homogeneous Neumann boundary conditions (assuming zero phase change at the image boundary), which is reasonable since our imaging FOV is quite large and most samples are sparsely distributed within the FOV; thus the disturbance from the very few samples located at the edge of the image can be safely neglected. Note that for a more general and rigorous solution of the TIE, inhomogeneous Neumann boundary conditions should be used, and one additional hard-edged aperture at the object plane is required to generate the inhomogeneous boundary signals [29-31].
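To make the reconstruction pipeline concrete, the following is a minimal numerical sketch of the multi-wavelength TIE solver described above [Eqs. (3)-(5)], written in Python/NumPy. It is not the authors' code: the function names, grid conventions, and the DCT-based discretization of the Poisson solver are our own illustrative choices.

```python
import numpy as np
from scipy.fft import dctn, idctn

def solve_poisson_dct(rhs, dx):
    """Solve nabla^2 f = rhs under homogeneous Neumann boundary conditions
    using the 2D discrete cosine transform (DCT-II)."""
    ny, nx = rhs.shape
    # DCT-domain eigenvalues of the 5-point Laplacian on a Neumann grid
    cy = 2 * np.cos(np.pi * np.arange(ny) / ny) - 2
    cx = 2 * np.cos(np.pi * np.arange(nx) / nx) - 2
    denom = (cy[:, None] + cx[None, :]) / dx**2
    denom[0, 0] = 1.0                      # avoid division by zero at DC
    F = dctn(rhs, norm='ortho') / denom
    F[0, 0] = 0.0                          # solution defined up to a constant
    return idctn(F, norm='ortho')

def tie_phase(I_r, I_g, I_b, z, lam_r, lam_g, lam_b, dx):
    """Multi-wavelength TIE phase retrieval at the sensor plane,
    following Eqs. (3)-(5); the green channel is the reference."""
    k = 2 * np.pi / lam_g
    dz_r = z * lam_r / lam_g - z           # effective defocus of red channel
    dz_b = z * lam_b / lam_g - z           # effective defocus of blue channel
    # unequally spaced finite difference, Eq. (3)
    dIdz = (dz_b**2 * (I_r - I_g) - dz_r**2 * (I_b - I_g)) \
           / (dz_r * dz_b * (dz_b - dz_r))
    # Eq. (4): -k dI/dz = nabla^2 psi
    psi = solve_poisson_dct(-k * dIdz, dx)
    gy, gx = np.gradient(psi, dx)          # grad(psi) = I grad(phi)
    gy, gx = gy / I_g, gx / I_g            # phase gradient
    # Eq. (5): nabla . (I^-1 grad(psi)) = nabla^2 phi
    div = np.gradient(gx, dx, axis=1) + np.gradient(gy, dx, axis=0)
    return solve_poisson_dct(div, dx)
```

A call such as tie_phase(I_r, I_g, I_b, z=1.6e-3, lam_r=623e-9, lam_g=522e-9, lam_b=467e-9, dx=2.2e-6/5) would then return the phase at the sensor plane, up to an additive constant.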

Generally, the phase retrieved by the TIE is considered quantitative. However, it has been shown that the TIE method has inherent limitations as a tool for phase retrieval and sometimes fails to provide an accurate solution that coincides with the exact phase [32-35]. The phase errors originate from the nonlinear components related to the finite difference approximation [Eq. (3)] [32, 33], and from the phase discrepancy associated with the TIE solvers [34-36]. To compensate for these inaccuracies, the deterministic TIE-reconstructed phase is used as an initial input for a Gerchberg-Saxton-type iterative phase refinement [7, 8]. Conventionally, the Gerchberg-Saxton iterative phase retrieval algorithm starts from a randomly distributed phase and then propagates the complex field back and forth between different planes, replacing the intensity with the measured values until the whole process converges. Due to the significant difference between the initial values (a randomly distributed phase) and the true phase to be solved, the iterative process often exhibits slow convergence (requiring hundreds of iterations) and stagnation [37]. Instead of using a random guess, in our work we use the TIE-reconstructed phase as the initial input for the Gerchberg-Saxton-type iterative phase retrieval algorithm. The TIE reconstruction provides a very good initial approximation of the phase, so the convergence rate is significantly improved (only 5 RGB iterations are used in this work) and the stagnation problems associated with the Gerchberg-Saxton-type iterative algorithm can be effectively avoided. Finally, the complex amplitude signal, formed from the green channel intensity image and the phase after iterative refinement, is propagated back to the object plane to bring the image into focus. The whole phase reconstruction process, as well as the reconstruction results for an ovary of Lilium, is illustrated in Fig. 2. Figure 2(a) shows the raw diffraction patterns at the different illumination wavelengths (red, green, blue). Though the object-to-sensor distance is relatively small (1.6mm), the out-of-focus blurring can be easily observed in the magnified regions of the raw images [Fig. 2(a)]. After the TIE phase retrieval and five iterations of phase refinement, the reconstructed complex field was propagated back to the object plane, resulting in a sharp in-focus image with all out-of-focus blurring effectively removed, as shown in Fig. 2(e). This ability to digitally adjust the 'z' parameter significantly extends the depth-of-field (DOF) of our platform compared to imaging systems that use conventional microscope objectives.
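The refinement stage can likewise be sketched in a few lines. Below, the TIE phase seeds a Gerchberg-Saxton-type loop that cycles the amplitude constraints of the three color channels; we use an angular spectrum propagator for generality (the paper's Fresnel integral is its paraxial limit), and all names are our own.

```python
import numpy as np

def angular_spectrum(u, dz, lam, dx):
    """Propagate a complex field u by dz with the angular spectrum method."""
    ny, nx = u.shape
    fx = np.fft.fftfreq(nx, dx)
    fy = np.fft.fftfreq(ny, dx)
    f2 = fx[None, :]**2 + fy[:, None]**2
    kz = 2 * np.pi * np.sqrt(np.maximum(1 / lam**2 - f2, 0.0))
    H = np.exp(1j * kz * dz) * (f2 <= 1 / lam**2)   # drop evanescent waves
    return np.fft.ifft2(np.fft.fft2(u) * H)

def refine_phase(phi0, I_r, I_g, I_b, dz_r, dz_b, lam_g, dx, n_iter=5):
    """Gerchberg-Saxton-type refinement seeded with the TIE phase phi0.
    The RGB channels are treated as effective defocus planes dz_r, dz_b, 0
    at the reference (green) wavelength."""
    u = np.sqrt(I_g) * np.exp(1j * phi0)            # initial field estimate
    for _ in range(n_iter):
        for I_meas, dz in ((I_r, dz_r), (I_b, dz_b), (I_g, 0.0)):
            v = angular_spectrum(u, dz, lam_g, dx)  # go to measurement plane
            v = np.sqrt(I_meas) * np.exp(1j * np.angle(v))  # keep phase only
            u = angular_spectrum(v, -dz, lam_g, dx)         # return to reference
    return u
```

The refined field returned at the green-channel plane can then be back-propagated by angular_spectrum(u, -z, lam_g, dx) to obtain the in-focus object image, mirroring the digital refocusing step described above.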


Fig. 2 Schematic diagram of the multi-wavelength phase retrieval and image reconstruction. See text for details. The sample is illuminated at different illumination wavelengths (R,G,B) (a), and the green channel image, combined with the estimated axial intensity derivative, is used for the TIE phase reconstruction (b). The TIE-reconstructed phase is then refined by a Gerchberg-Saxton-type iterative phase retrieval algorithm (c). Finally, the retrieved complex field is propagated back to the object plane (d) to obtain a sharp in-focus image, as shown in (e). Scale bar 400μm.


To further investigate the imaging resolution of our platform, Fig. 3 illustrates the reconstruction result of a 2μm silica bead in water. For standard lens-based imaging systems, greater lateral resolution is achievable by increasing the numerical aperture (NA) of the imaging optics. In our lensless geometry, however, owing to the small object-to-sensor distance, the collection NA at the detection plane approaches 1. Thus, the spatial resolution of our reconstructions is ultimately limited by the pixel size of the sensor (2.2μm). To reduce the mosaic effect due to the insufficient resolvability of the imaging sensor, the raw diffraction patterns are upsampled by a factor of five using cubic spline interpolation before the phase retrieval and reconstruction procedure [Fig. 3(b)]. We wish to emphasize that for such small-sized specimens, the advantages of interpolating the diffraction patterns before image reconstruction are considerable. First, interpolation significantly alleviates the pixelation effect in the original diffraction patterns as well as in the final reconstruction, allowing a more accurate and smoother object support to be defined. Second, interpolating the diffraction patterns before phase reconstruction further allows the imaging resolution to be improved by the phase retrieval algorithm. Though interpolation does not initially introduce any additional information content into the raw image, after the TIE reconstruction and the iterative phase refinement some higher spatial frequency components gradually appear, which allows the resolution improvement in the final reconstruction. As shown in Figs. 3(c) and 3(d), the FWHM value of the transverse section of the reconstructed bead is 3.72μm, suggesting an improved resolution compared with the maximum resolution offered by the sensor pixel size (4.4μm, twice the detector pitch). Furthermore, the small object-to-sensor distance (unit magnification) also leads to a significantly large imaging FOV. To demonstrate the wide-field imaging capability of our system, we imaged a whole unstained cheek cell sample. The sample was obtained from a cheek smear, diluted into a water droplet, and then spread on a cover glass. The entire FOV (~ 24mm2) as captured by the CMOS sensor is shown in Fig. 4, with the reconstructed intensity and phase images of three different regions within the entire imaging area depicted in the surrounding insets. In these reconstructed images, the cell morphology is clear and the cell boundaries can clearly be seen and separated from the background. In particular, the phase images provide much clearer information about the actual structure of the specimen, with the optically thick nucleus and wrinkled cell membrane shown with high contrast. These results demonstrate that our high-throughput platform can yield intensity and quantitative phase images with a lateral resolution below 4μm over a large FOV of 24mm2. It should be noted that the lateral resolution and FOV can be further improved by simply using larger image sensors with a smaller pixel size.
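As a hedged illustration of the upsampling step (assuming I_raw holds one recorded channel; the variable name is ours), the factor-of-five cubic interpolation can be done with SciPy:

```python
from scipy.ndimage import zoom  # cubic spline interpolation (order=3)

# Upsample a raw channel 5x before phase retrieval; this reduces pixelation
# and lets the iterative refinement recover frequencies beyond the raw grid.
I_up = zoom(I_raw, 5, order=3)
```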


Fig. 3 Resolution analysis and reconstruction result of a 2μm silica bead: (a) recorded diffraction pattern (red channel), (b) upsampled version of (a), (c) reconstructed in-focus intensity image, (d) axial and transverse profiles (dotted black and dashed red lines) of the volumetric reconstruction. The FWHM value of the lateral line profile is 3.72μm, while the axial FWHM is approximately 51μm. (e) 3D representation of the reconstructed volumetric intensity distribution.



Fig. 4 Quantitative phase reconstruction of cheek cells. The central image shows the captured raw diffraction pattern over the entire FOV (~ 24mm2). The surrounding sub-images show the reconstructed intensity and phase images of three different regions (corresponding to the green, red, and blue boxed areas) within the entire imaging area.


4. Diffraction tomography with multi-angle illuminations

Though complete knowledge of the complex optical field permits computing the intensity in parallel planes by numerical backward propagation, the reconstructed 3D distribution of the object wave-front suffers from an extended axial focal region: the reconstructed objects do not appear well localized but are elongated in the z-direction. As shown in Fig. 3(e), the volumetric intensity reconstruction of the 2μm bead exhibits an elongated tail along the z direction. The FWHM value of the axial line profile through the sphere's center is about 50μm, indicating a significantly lower depth resolution compared with the lateral resolution [Fig. 3(d)]. In order to retrieve depth-resolved images, we can make full use of the 8 × 8 LED array for sample illumination. Each LED element illuminates the sample from an oblique incident angle, and the corresponding RGB diffraction images are acquired to retrieve the whole complex field using the composite phase retrieval algorithm described in Section 3. For weakly scattering objects for which the first Born or Rytov approximation holds, the Fourier diffraction theorem [22] states that the 2D Fourier transform of the forward-scattered complex field retrieved from one illumination angle covers a spherical cap, with an orientation set by the illumination direction, in 3D Fourier space: the so-called Ewald sphere [19, 20]. If only one illumination angle is used, the retrieved complex field occupies only a very limited support with negligible thickness in the 3D spatial frequency space, which explains the poor longitudinal imaging capability shown in Fig. 3(e). However, by using various illumination directions, a finite volume of 3D reciprocal space can be filled by spherical caps of various orientations, and true 3D imaging of weakly scattering specimens can thus be achieved. A total of 64 (× 3 channels) diffraction patterns are collected as the sample (a 2μm bead at z = 0.8mm) is successively illuminated by each of the 64 RGB LEDs in the array; nine typical diffraction patterns are illustrated in Fig. 5(a). With the change in illumination angle, the diffraction rings at the detector plane are stretched into elliptical shapes, as expected, since the detection plane is not normal to the beam propagation direction. With our current configuration, we can image with illumination angles of up to ±45° along both the x and y axes; the angles can be precisely determined by calculating the inverse tangent of the ratio Δd/z, where Δd denotes the lateral shift of an object's hologram with respect to its central position. It should be noted that the diffraction tomography algorithm can be implemented with either the Born or the Rytov approximation. For the Born approximation to be valid, a necessary condition is that the total phase shift introduced by the object be small (typically less than π/2), which in turn requires the size of the object to be small and the refractive index contrast to be low [19, 38]. The Rytov approximation is less restrictive than the Born approximation: it is valid as long as the phase gradient in the sample is small, independent of the sample size and the total phase shift [38, 39]. Since the phase delay induced by a typical biological cell in the medium can easily exceed π/2 (e.g., the cheek cells shown in Fig. 4), we implement the diffraction tomography algorithm based on the Rytov approximation.
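To illustrate the mapping step, the sketch below computes the complex Rytov phase from one angle's retrieved field and accumulates the corresponding Ewald cap into a 3D spectrum. This is our own schematic implementation: the constant factors follow one common statement of the Fourier diffraction theorem [22, 38] and may differ from the authors' convention, phase unwrapping is omitted, and nearest-voxel gridding stands in for more careful interpolation.

```python
import numpy as np

def map_to_ewald(u_s, u_i, lam, dx, s_hat, F, W, dk, n0=1.33):
    """Map one illumination angle's field onto its Ewald cap in the 3D
    spectrum (Fourier diffraction theorem, Rytov approximation).
    u_s / u_i: total and incident fields at the sample plane;
    s_hat: illumination direction unit vector (sx, sy, sz);
    F, W: complex accumulator and hit-count arrays (N x N x N);
    dk: voxel spacing of the 3D frequency grid. All names are illustrative."""
    k0 = 2 * np.pi * n0 / lam                 # wavenumber in the medium
    psi = np.log(u_s / u_i)                   # complex Rytov phase
    # (in practice the imaginary part of psi must be phase-unwrapped)
    U = np.fft.fftshift(np.fft.fft2(u_i * psi)) * dx**2
    ny, nx = U.shape
    kx = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(nx, dx))
    ky = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(ny, dx))
    KX, KY = np.meshgrid(kx, ky)
    KZ2 = k0**2 - KX**2 - KY**2
    ok = KZ2 > 0                              # keep propagating waves only
    KZ = np.sqrt(np.where(ok, KZ2, 1.0))
    V = (1j * KZ / np.pi) * U                 # scattering potential on the cap
    # object-function frequency = scattered wavevector - incident wavevector
    Q = np.stack([KX - k0 * s_hat[0], KY - k0 * s_hat[1], KZ - k0 * s_hat[2]])
    idx = np.round(Q / dk).astype(int) + F.shape[0] // 2   # nearest voxel
    inside = ok & np.all((idx >= 0) & (idx < F.shape[0]), axis=0)
    np.add.at(F, tuple(idx[:, inside]), V[inside])         # accumulate
    np.add.at(W, tuple(idx[:, inside]), 1)                 # count hits
```

After all 64 angles have been mapped, the measured spectrum follows as F / np.maximum(W, 1), and its 3D inverse Fourier transform gives the tomogram.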


Fig. 5 Tomographic reconstruction of the 2μm silica bead. The LED array sequentially illuminates the sample with different LED elements, and nine typical captured diffraction patterns are shown in (a). The retrieved complex fields are mapped into 3D Fourier space according to the Fourier diffraction theorem (b-c). Due to the limited pixel resolution, the actually detectable frequencies of our platform occupy only a limited portion of the 3D Ewald sphere (d-e), while the information in the red shaded region is unrecoverable. An iterative non-negative constraint processing method is implemented to fill the rest of the 3D space (f-g). Finally, a 3D inverse Fourier transform yields the 3D tomogram of the bead (h). The FWHM value of the lateral line profile is 3.41μm, while the axial FWHM shrinks to 5μm (i).


Theoretically, for an imaging system with detection NA = 1 and all 64 illumination angles used, the detected frequencies provide a reasonably large coverage in the kx-ky plane, filling a rounded-corner square as shown in Fig. 5(b). The detected frequencies in the ky-kz plane take the form of the well-known butterfly-shaped support of the optical transfer function (OTF) of a transmission microscope [Fig. 5(c)] [40]. However, the actually detectable frequencies of our platform are fundamentally limited by the resolution offered by the sensor pixel size, as illustrated by the red shaded regions. Figures 5(d) and 5(e) show the results of mapping the retrieved complex fields of the 2μm bead onto the (kx, ky, kz = 0) and (kx = 0, ky, kz) planes, respectively. In principle, the larger the angular coverage and the more illumination angles used, the larger the measurable spatial frequency support of the 3D spectrum, and thus the less ill-posed the inverse problem becomes. This can be achieved by using an LED matrix with a larger array size and a smaller pitch. However, three important practical aspects of the LED illumination should be taken into consideration. First, as the number of LEDs increases, the required acquisition time as well as the size of the system increase accordingly. Furthermore, for high-angle LEDs, the illumination intensity decreases dramatically due to their larger distance to the sensor. Finally, decreasing the distance between the LED array and the object increases the illumination intensity as well as the maximum angular coverage, but it also decreases the spatial coherence of the illumination and thus reduces the fringe contrast of the captured diffraction patterns. Therefore, our current configuration uses an 8 × 8 LED matrix located 3.25cm from the sample plane to achieve a reasonable tradeoff among these factors (acquisition time, system size, illumination intensity, angular coverage, and spatial coherence). Due to the limited number and coverage of the illumination angles, the frequency space has missing information, which can be filled by using an iterative non-negative constraint algorithm [19]. During the iterative process, the scattering potential is updated by incorporating the new information generated from the additional constraints. After 50 iterations, the frequency space, including the missing region, is filled more uniformly [Figs. 5(f) and 5(g)]. Finally, by taking the inverse Fourier transform of the entire 3D frequency spectrum, we obtain the depth-resolved 3D intensity tomogram (related to the imaginary part of the complex refractive index), resulting in a slightly elongated blob directed along the z axis [Fig. 5(h)]. The FWHM values of the line profiles indicate that an axial resolution of about 5μm can be achieved through the tomographic reconstruction, significantly improved compared to the single-angle-illumination case [Fig. 5(i)].
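A minimal sketch of the non-negativity-constrained filling loop (in the spirit of the Gerchberg-Papoulis-style iteration used in [19]) could look as follows; for simplicity, this version enforces non-negativity on the real part of the scattering potential only, ignoring absorption.

```python
import numpy as np

def nonneg_fill(F_meas, measured, n_iter=50):
    """Iteratively fill the unmeasured region of the 3D spectrum.
    F_meas: measured spectrum (zeros where unmeasured);
    measured: boolean mask of voxels fixed by measurement."""
    G = F_meas.copy()
    for _ in range(n_iter):
        f = np.fft.ifftn(np.fft.ifftshift(G))      # object-space estimate
        f = np.maximum(f.real, 0.0)                # non-negativity constraint
        G = np.fft.fftshift(np.fft.fftn(f))        # back to frequency space
        G[measured] = F_meas[measured]             # restore measured data
    return G
```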

In this work, since we mainly focused on a proof of concept, the acquisition and processing speed has not yet been fully optimized. In the current setup, our camera limits the acquisition frame rate (6 frames/second). Capturing images for all 64 illuminations (3 × 64 = 192 images in total) takes 32 seconds. The image acquisition time can in fact be reduced to less than 10 seconds if either a reduced region of interest (ROI) or a camera with a higher frame rate is used. By implementing the 2D fast Fourier transforms on a graphics processing unit, the time required for phase retrieval and back propagation (for each illumination angle) over the whole FOV (2592 × 1944 pixels) is approximately 1200 ms on a laptop computer (Intel Core i7-4700 CPU, 2.4 GHz, 8 GB RAM, NVidia GeForce GT750M GPU). Therefore, for quantitative phase imaging and digital refocusing with only orthogonal illumination, the current processing speed is sufficient to visualize the slow dynamics of microscopic biological samples in real time. For tomographic imaging, due to the large size of the dataset, the computation time for reconstructing a typical diffraction tomogram (512 × 512 × 512 voxels) is approximately 1.5 minutes.

Next, to demonstrate the tomographic imaging ability of our lensless microscope, we imaged 3μm polystyrene beads distributed randomly in an approximately 50μm-thick chamber filled with immersion oil, located at a height of ~ 600μm from the sensor surface. Figure 6(a) shows the raw full-FOV image captured with vertical illumination. The reconstruction was carried out only on a small ROI cropped from the whole FOV, labeled by the blue box shown in Fig. 6(b). As mentioned earlier, with only a single vertical illumination, the backward-propagated reconstruction lacks z-sectioning ability and thus cannot resolve two particles stacked along the optical axis. Using multiple illumination angles avoids this problem and yields clean depth sectioning even in the presence of axially overlapping beads. As shown in Fig. 6(b), the region highlighted by the red-dotted box contains two beads that almost overlap axially, with a center-to-center separation of approximately 25μm in the z-direction. However, since the distances of the two beads to the sensor surface are different, the corresponding diffraction patterns are laterally shifted by different amounts, and thus the overlap is reduced under oblique illumination, as shown in the inset of Fig. 6(b). In Fig. 6(c), the back-propagated intensity distribution reconstructed from the orthogonal illumination at the z = 0 plane (reconstruction distance 622μm from the sensor surface) shows all beads (labeled A-F); the axially overlapped beads (E and F) cannot be resolved. Figures 6(d) and 6(e) show two intensity z-sections (z = 0 and z = −20μm), and Fig. 6(f) shows the whole volumetric intensity distribution of the tomographic reconstruction with the full 64 illuminations. From these results, it is clear that the out-of-focus beads are successfully rejected from each z slice, and the two axially overlapped beads, E and F, are clearly separated at their corresponding depths.


Fig. 6 Tomographic reconstruction of randomly distributed 3μm beads. (a) Raw full-FOV image captured with vertical illumination. (b) A small ROI cropped from the whole FOV; the inset shows the diffraction pattern of the region highlighted by the red-dotted box under oblique illumination (by turning on the top-left corner LED). (c) Back-propagated intensity image with vertical illumination. (d,e) Tomographically reconstructed z-sections at 0 and −20μm, respectively. (f) 3D volumetric distribution of the six beads.


To further verify the quantitative accuracy of our tomographic microscope, we measured the 3D refractive index of a larger (diameter D = 20μm) gentamicin-polymethylmethacrylate (PMMA) bead (n_PMMA = 1.49) immersed in a transparent refractive-index-matching liquid (glycerol, n_medium = 1.473). Figure 7(a) shows the quantitative phase retrieved at orthogonal illumination from the three intensity images captured at different wavelengths (R, G, B, as shown on the left). The retrieved phase fits reasonably well with the ideal model: a half-sphere profile with a maximum phase shift of ϕ_center = (2π/λ_g)D(n_PMMA − n_medium) ≈ 4.1 rad, as shown in Fig. 7(b). Using the set of phase and amplitude images retrieved at various angles of illumination, we applied the diffraction tomography algorithm described earlier in this section. The iterative algorithm took 50 iterations to fill the missing information, resulting in an intact 3D Fourier spectrum with some ring patterns clearly visible in the central part. Figures 7(c) and 7(d) show cross-sectional slices of the refractive index distribution of the bead in the x-y and x-z planes, verifying that our method successfully reconstructed tomograms of the PMMA bead with correct refractive index values. It should also be noted that the shape distortion in the axial direction is larger than in the lateral dimensions, which is expected due to the limited coverage of the illumination angles.
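As a quick sanity check of the half-sphere model (a worked evaluation using the stated parameters, not the authors' code):

```python
import numpy as np

# Central phase delay of a 20 um PMMA bead in glycerol at the green wavelength
lam_g = 0.522                      # um
D = 20.0                           # um, bead diameter
dn = 1.49 - 1.473                  # refractive index contrast vs. glycerol
phi_center = 2 * np.pi / lam_g * D * dn
print(phi_center)                  # ~4.09 rad, consistent with the ~4.1 above
```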


Fig. 7 3D refractive index measurement of a 20μm PMMA bead. (a) Quantitative phase retrieved at orthogonal illumination from the three intensity images captured at different wavelengths (shown on the left); (b) phase profile along the center of the bead, comparing measurement and ideal model; (c-d) cross-sectional slices of the refractive index distribution of the PMMA bead in the x-y and x-z planes, respectively. The kx-ky and ky-kz slices of the object function in 3D Fourier space after the iterative non-negative constraint algorithm are shown on the left of (c).


Finally, to demonstrate the performance of our lensless tomographic microscope for applications in the life sciences, we imaged a deparaffinized and cover-slipped slice of the uterus of Parascaris equorum in deionized water. Figures 8(a) and 8(b) show the recovered refractive index and absorption coefficient depth sections at z = −10.6, 0, and 26.4μm. For comparison, we also show the three intensity distributions reconstructed from the orthogonal illumination at the same depths in Fig. 8(c). The x-z views of the respective 3D stacks, cut along the red dashed line in Fig. 8(a), are shown in the second row. Figure 8(c) emphasizes the rather low optical sectioning capability of the single-angle-illumination reconstruction, since no significant difference can be observed between the three intensity images. The elongated tails in the corresponding x-z view show that the reconstruction is disturbed by out-of-focus blurring, as expected: the single-angle illumination scheme has virtually no z-resolving power, so all out-of-focus objects contribute strongly to the final image. In contrast, with the tomography scheme described above, each slice shows only the cell content at one particular layer while rejecting information from out-of-focus layers. Notably, a tiny dust particle on top of the cover slip (at z = 158.4μm) was clearly resolved in the tomographic reconstruction, while it is not resolvable in the single-angle-illumination result [see the zoomed regions in Figs. 8(b) and 8(c)]. In the x-z cross-sections of the tomographic reconstructions, several isolated scattering centers can be distinguished at different heights. It can be seen that the egg cells are distributed roughly within one layer (within an approximately ±15μm range in the z-direction), and the dust particle can be easily identified (pointed out by the red arrows) since it is located at a much higher layer than the egg cells. The out-of-focus blur, though not fully eliminated, is confined to very shallow funnel-shaped regions. To show more details of the fertilized eggs inside the sample, Figs. 8(d) and 8(e) display the 3D renderings of the refractive index and absorption corresponding to the boxed area in Fig. 8(a). Note that the 3D absorption distribution is shown in reversed contrast, with regions of high absorption rendered in bright color. By simple thresholding based on the absorption values, the 3D distribution of nuclei inside the fertilized eggs can be obtained, as shown in Fig. 8(f), from which the nuclear division phenomenon for cells undergoing the late stages (anaphase or telophase) of mitosis can be clearly identified.


Fig. 8 3D tomographic reconstruction of a slice of the uterus of Parascaris equorum. The first row shows the recovered refractive index depth sections (a), absorption coefficient depth sections (b), and intensity distributions reconstructed from the orthogonal illumination (c) at z = −10.6, 0, and 26.4μm. The second row shows the corresponding x-z views of the 3D stacks (scale ratio of the z to x axis is 1:2), cut along the red dashed line in (a). The arrows point out the dust particle located at a higher layer (at z = 168.4μm). The third row shows the 3D renderings of the refractive index (d) (Media 1), and the absorption distribution before (e) and after thresholding (f) (Media 2) for the boxed area in (a). Scale bar 400μm.


5. Conclusions

In conclusion, we have demonstrated lensless quantitative phase microscopy and diffraction tomography based on a compact on-chip platform, using only a CMOS image sensor and a programmable color LED matrix. Based on multi-wavelength phase retrieval and multi-angle-illumination diffraction tomography, this platform offers high-quality, depth-resolved images with a lateral resolution of 3.72μm and an axial resolution of 5μm over a wide imaging FOV of 24mm2. Note that the resolution and FOV can be further improved straightforwardly by using larger image sensors with smaller pixels. Thanks to its simple and lensless configuration, the whole platform is compact, light-weight, and cost-effective. It is expected to be a promising non-invasive imaging approach for point-of-care diagnostics, even in remote, underserved locations where advanced laboratory facilities are unavailable or difficult to access. In future work, we would like to further improve the system to make it fully compatible with standard cell culture practices, accommodating the most commonly used standard culture dishes. This may open up new possibilities for using our lensless imaging approach for high-throughput, continuous cell culture monitoring inside a standard incubator over extended periods of observation.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (30915011318) and the Open Research Fund of the Jiangsu Key Laboratory of Spectral Imaging and Intelligent Sense (3092014012200417). C. Zuo thanks the support of the 'Zijin Star' program of Nanjing University of Science and Technology.

References and links

1. G. Popescu, Quantitative Phase Imaging of Cells and Tissues (McGraw-Hill Professional, 2011).

2. P. Marquet, B. Rappaz, P. J. Magistretti, E. Cuche, Y. Emery, T. Colomb, and C. Depeursinge, "Digital holographic microscopy: a noninvasive contrast imaging technique allowing quantitative visualization of living cells with subwavelength axial accuracy," Opt. Lett. 30, 468–470 (2005).

3. C. Mann, L. Yu, C.-M. Lo, and M. Kim, "High-resolution quantitative phase-contrast microscopy by digital holography," Opt. Express 13, 8693–8698 (2005).

4. Z. Wang, L. Millet, M. Mir, H. Ding, S. Unarunotai, J. Rogers, M. U. Gillette, and G. Popescu, "Spatial light interference microscopy (SLIM)," Opt. Express 19, 1016–1026 (2011).

5. B. Bhaduri, H. Pham, M. Mir, and G. Popescu, "Diffraction phase microscopy with white light," Opt. Lett. 37, 1094–1096 (2012).

6. B. Bhaduri, K. Tangella, and G. Popescu, "Fourier phase microscopy with white light," Biomed. Opt. Express 4, 1434–1441 (2013).

7. R. Gerchberg, "A practical algorithm for the determination of phase from image and diffraction plane pictures," Optik 35, 237 (1972).

8. P. Bao, F. Zhang, G. Pedrini, and W. Osten, "Phase retrieval using multiple illumination wavelengths," Opt. Lett. 33, 309–311 (2008).

9. J. A. Rodrigo and T. Alieva, "Illumination coherence engineering and quantitative phase imaging," Opt. Lett. 39, 5634–5637 (2014).

10. M. R. Teague, "Deterministic phase retrieval: a Green's function solution," J. Opt. Soc. Am. 73, 1434–1441 (1983).

11. S. S. Gorthi and E. Schonbrun, "Phase imaging flow cytometry using a focus-stack collecting microscope," Opt. Lett. 37, 707–709 (2012).

12. L. Waller, S. S. Kou, C. J. R. Sheppard, and G. Barbastathis, "Phase from chromatic aberrations," Opt. Express 18, 22817–22825 (2010).

13. C. Zuo, Q. Chen, W. Qu, and A. Asundi, "Noninterferometric single-shot quantitative phase microscopy," Opt. Lett. 38, 3538–3541 (2013).

14. D. W. E. Noom, K. S. E. Eikema, and S. Witte, "Lensless phase contrast microscopy based on multiwavelength Fresnel diffraction," Opt. Lett. 39, 193–196 (2014).

15. D. Paganin and K. A. Nugent, "Noninterferometric phase imaging with partially coherent light," Phys. Rev. Lett. 80, 2586–2589 (1998).

16. C. Zuo, Q. Chen, L. Tian, L. Waller, and A. Asundi, "Transport of intensity phase retrieval and computational imaging for partially coherent fields: the phase space perspective," Opt. Lasers Eng. 71, 20–32 (2015).

17. A. Barty, K. Nugent, A. Roberts, and D. Paganin, "Quantitative phase tomography," Opt. Commun. 175, 329–336 (2000).

18. W. Choi, C. Fang-Yen, K. Badizadegan, S. Oh, N. Lue, R. R. Dasari, and M. S. Feld, "Tomographic phase microscopy," Nat. Methods 4, 717–719 (2007).

19. Y. Sung, W. Choi, C. Fang-Yen, K. Badizadegan, R. R. Dasari, and M. S. Feld, "Optical diffraction tomography for high resolution live cell imaging," Opt. Express 17, 266–277 (2009).

20. K. Kim, H. Yoon, M. Diez-Silva, M. Dao, R. R. Dasari, and Y. Park, "High-resolution three-dimensional imaging of red blood cells parasitized by Plasmodium falciparum and in situ hemozoin crystals using optical diffraction tomography," J. Biomed. Opt. 19, 011005 (2013).

21. T. Kim, R. Zhou, M. Mir, S. D. Babacan, P. S. Carney, L. L. Goddard, and G. Popescu, "White-light diffraction tomography of unlabelled live cells," Nat. Photon. 8, 256–263 (2014).

22. E. Wolf, "Three-dimensional structure determination of semi-transparent objects from holographic data," Opt. Commun. 1, 153–156 (1969).

23. O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, "Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications," Lab Chip 10, 1417–1428 (2010).

24. S. O. Isikman, W. Bishara, S. Mavandadi, F. W. Yu, S. Feng, R. Lau, and A. Ozcan, "Lens-free optical tomographic microscope with a large imaging volume on a chip," Proc. Natl. Acad. Sci. U.S.A. 108, 7296–7301 (2011).

25. G. Zheng, S. A. Lee, Y. Antebi, M. B. Elowitz, and C. Yang, "The ePetri dish, an on-chip cell imaging platform based on subpixel perspective sweeping microscopy (SPSM)," Proc. Natl. Acad. Sci. U.S.A. 108, 16889–16894 (2011).

26. G. Zheng, R. Horstmeyer, and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy," Nat. Photon. 7, 739–745 (2013).

27. X. Ou, R. Horstmeyer, C. Yang, and G. Zheng, "Quantitative phase imaging via Fourier ptychographic microscopy," Opt. Lett. 38, 4845–4848 (2013).

28. L. Tian, X. Li, K. Ramchandran, and L. Waller, "Multiplexed coded illumination for Fourier ptychography with an LED array microscope," Biomed. Opt. Express 5, 2376–2389 (2014).

29. C. Zuo, Q. Chen, and A. Asundi, "Boundary-artifact-free phase retrieval with the transport of intensity equation: fast solution with use of discrete cosine transform," Opt. Express 22, 9220–9244 (2014).

30. C. Zuo, Q. Chen, H. Li, W. Qu, and A. Asundi, "Boundary-artifact-free phase retrieval with the transport of intensity equation II: applications to microlens characterization," Opt. Express 22, 18310–18324 (2014).

31. L. Huang, C. Zuo, M. Idir, W. Qu, and A. Asundi, "Phase retrieval with the transport-of-intensity equation in an arbitrarily shaped aperture by iterative discrete cosine transforms," Opt. Lett. 40, 1976–1979 (2015).

32. L. Waller, L. Tian, and G. Barbastathis, "Transport of intensity phase-amplitude imaging with higher order intensity derivatives," Opt. Express 18, 12552–12561 (2010).

33. C. Zuo, Q. Chen, Y. Yu, and A. Asundi, "Transport-of-intensity phase imaging using Savitzky-Golay differentiation filter: theory and applications," Opt. Express 21, 5346–5362 (2013).

34. J. A. Schmalz, T. E. Gureyev, D. M. Paganin, and K. M. Pavlov, "Phase retrieval using radiation and matter-wave fields: validity of Teague's method for solution of the transport-of-intensity equation," Phys. Rev. A 84, 023808 (2011).

35. C. Zuo, Q. Chen, L. Huang, and A. Asundi, "Phase discrepancy analysis and compensation for fast Fourier transform based solution of the transport of intensity equation," Opt. Express 22, 17172–17186 (2014).

36. A. Shanker, L. Tian, M. Sczyrba, B. Connolly, A. Neureuther, and L. Waller, "Transport of intensity phase imaging in the presence of curl effects induced by strongly absorbing photomasks," Appl. Opt. 53, J1–J6 (2014).

37. J. R. Fienup and C. C. Wackerman, "Phase-retrieval stagnation problems and solutions," J. Opt. Soc. Am. A 3, 1897–1907 (1986).

38. A. C. Kak and M. Slaney, Principles of Computerized Tomographic Imaging (Academic Press, 1999).

39. A. J. Devaney, "Inverse-scattering theory within the Rytov approximation," Opt. Lett. 6, 374–376 (1981).

40. N. Streibl, "Three-dimensional imaging by a microscope," J. Opt. Soc. Am. A 2, 121–127 (1985).

Supplementary Material (2)

Media 1: AVI (7627 KB)     
Media 2: AVI (5679 KB)     
