High-speed multi-wavelength Fresnel diffraction imaging

Open Access

Abstract

We demonstrate a compact lensless microscope which can capture video-rate phase contrast images of moving objects and allows numerical scanning of the focal distance after recording. Using only an RGB-detector and illumination from a single mode fiber, diffraction patterns at three wavelengths are recorded simultaneously, enabling high-speed data collection and reconstruction of phase and amplitude. The technique is used to image a moving test target, beads in a flow cell, and Caenorhabditis elegans moving in a droplet of liquid.

© 2014 Optical Society of America

1. Introduction

Compact microscopes consisting of few parts are very desirable for many applications. One can think of microscopy in difficult-to-reach places or harsh environments, or of applications where one would like to employ many microscopes in parallel. It is therefore important to get the most information out of the most basic imaging setups [1–5]. We describe a system for fast lensless phase-contrast microscopy that requires only a three-wavelength source, an RGB-detector and a computer, yet is still able to reconstruct the complex refractive index of a sample from images taken in a single shot.

We have already shown that it is possible to use diffraction patterns at three different wavelengths to retrieve phase and amplitude information from a sample with very few constraints on the imaging setup [6]. That method required multiple images of the sample to be taken while it was illuminated with different wavelengths transmitted through the same optical fiber. Here we show that the method can be used to take single-shot images by using an RGB-detector and light sources at suitable wavelengths, thus enabling the imaging of moving objects. Three diffraction patterns are measured at the same time, and a few iterations of a phase retrieval algorithm reconstruct the phase and amplitude of the sample.

2. Fresnel diffractive imaging

An RGB image detector only measures the intensity at different pixels, but for reconstruction of an image the phase is also needed. Retrieval of the phase of the electric field is possible using holographic methods [7], intensity changes in different planes [8], or iterative retrieval algorithms [9]. The change in intensity when the light field propagates, or when the illuminating wavelength changes, depends on the complex phase curvature of the wavefront. This information can be extracted from multiple diffraction patterns using a phase reconstruction algorithm. One way to do this is to record the propagation of the diffraction pattern of an object by taking multiple images at different distances [10, 11]. However, in Fresnel diffraction the distance from the sample to the detector and the wavelength of the illuminating source play a similar role [12, 13]. By measuring diffraction patterns at different wavelengths we can therefore measure how the intensity propagates without changing the position of the detector. The requirement for this to work is that both the refractive index and the absorption of the sample have only a limited wavelength dependence.
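
This wavelength–distance equivalence can be made concrete with a short sketch (our own illustration, not code from this work): in the paraxial Fresnel transfer function, wavelength and propagation distance enter only through their product, so rescaling one is equivalent to rescaling the other.

```python
import numpy as np

def fresnel_propagate(field, wavelength, distance, pixel_size):
    """Propagate a complex field with the paraxial Fresnel transfer function.

    The transfer function depends on wavelength and distance only through
    their product, which is the basis of the wavelength-for-distance scan.
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(-1j * np.pi * wavelength * distance * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# A pattern computed at 636 nm over 5 mm is identical to one computed at
# 519 nm over a 636/519 times longer distance.
rng = np.random.default_rng(0)
field = np.exp(1j * rng.standard_normal((256, 256)))
a = fresnel_propagate(field, 636e-9, 5e-3, 2.2e-6)
b = fresnel_propagate(field, 519e-9, 5e-3 * 636 / 519, 2.2e-6)
print(np.allclose(a, b))  # True: only the product wavelength*distance enters
```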

While there are many other algorithms used for phase reconstruction, our multi-wavelength approach requires no additional information and converges robustly for many different samples. The sample does not need to be isolated from its surroundings and in this paper we show it can even be a complex target moving in a fluid. The only information required for a reconstruction, except for the images, is the approximate wavelengths of the light used to illuminate the sample. After reconstruction the images can be refocused to any plane because of the quantitative phase information.
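
To make the reconstruction procedure more tangible, a minimal single-plane retrieval loop in the spirit of [6] is sketched below. This is a simplified illustration under the assumption of a wavelength-independent sample; the update rule and parameters are ours, and the actual implementation is described in [6].

```python
import numpy as np

def propagate(field, wavelength, z, pixel_size):
    """Paraxial Fresnel transfer-function propagator (see sketch above)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def retrieve(measured, wavelengths, z, pixel_size, iterations=5):
    """Iterative multi-wavelength phase retrieval (simplified sketch).

    measured: dict mapping each wavelength to its recorded diffraction
    intensity. The object estimate is propagated to the detector at each
    wavelength in turn; there the modulus is replaced by the measurement
    while the phase is kept, and the result is propagated back.
    """
    obj = np.ones_like(next(iter(measured.values())), dtype=complex)
    for _ in range(iterations):
        for wl in wavelengths:
            det = propagate(obj, wl, z, pixel_size)
            det = np.sqrt(measured[wl]) * np.exp(1j * np.angle(det))
            obj = propagate(det, wl, -z, pixel_size)
    return obj  # complex transmission: amplitude and quantitative phase
```

Refocusing after the fact then simply amounts to propagating the returned complex field to a different plane with the same propagator.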

3. Setup

To reconstruct an image, we need diffraction patterns at three different wavelengths. With a monochrome image sensor this requires three detected frames, but using an RGB image sensor and suitable laser wavelengths this can be done in a single shot. To obtain an RGB source we used fiber-coupled diode lasers at wavelengths of 402 nm, 519 nm and 636 nm, and employed fiber-coupled 50/50 beamsplitters as a simple way of beam combining. While this approach introduces losses, the laser intensities remain high enough to allow fast imaging. The combined beams are coupled into a final single-mode fiber, which can be directed at the sample. We placed our RGB image sensor, an IDS UI-5582LE-C CMOS camera with 2560×1920 pixels of 2.2 μm size and a color depth of 12 bits, behind the sample to record diffraction patterns in transmission. A schematic of the setup is shown in Fig. 1. Only 3 μW of power per wavelength illuminates the sample, which requires an exposure time of 1 ms to nearly saturate the image sensor when it is placed 5 mm from the fiber. To obtain the phase of the electric field from an RGB image, we split the image into its three color channels. With the three separate images we then apply the same reconstruction procedure as in [6].
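
The paper does not detail how the color channels are extracted from the raw sensor data; one plausible minimal approach (an assumption on our part, shown for a standard RGGB Bayer layout) is to split the raw frame into its color sub-images without interpolation.

```python
import numpy as np

def split_bayer(raw):
    """Split a raw Bayer frame into R, G, B sub-images.

    Assumes an RGGB mosaic; the two green sites in each 2x2 cell are
    averaged. The actual mosaic layout and any demosaicing depend on the
    camera and driver, so this is only an illustration.
    """
    raw = raw.astype(float)
    r = raw[0::2, 0::2]
    g = 0.5 * (raw[0::2, 1::2] + raw[1::2, 0::2])
    b = raw[1::2, 1::2]
    return r, g, b

# Each sub-image then serves as the measured diffraction intensity for the
# corresponding laser wavelength in the retrieval loop sketched in Section 2.
```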


Fig. 1 The imaging setup, where a flow cell with beads is taken as an example sample. The wavelengths λ1, λ2, λ3 are obtained from laser diodes at 636 nm, 519 nm and 402 nm, respectively, coupled into a single-mode fiber. BS: fiber beamsplitter/combiner.


CMOS sensors require extra care when triggered operation is needed. Our chip has a "rolling shutter", meaning that the pixel rows are not all exposed over the same time interval. Using the camera's "global start" option, the exposure of every horizontal row begins simultaneously, but because the rows are still read out one after another their effective exposure times differ, making it harder to optimize the exposure for a good dynamic range and a maximum signal-to-noise ratio. Furthermore, to reduce motion-induced blurring we want the shortest possible exposure time. To meet this demand, and to obtain an even exposure over the whole chip, we pulse the lasers for a time much shorter than the shortest gate time of the CMOS chip. The frame rate of the measurements is limited by the readout rate of the camera, and the exposure time for a single frame is limited by the amount of light needed for a good image (typically requiring a laser pulse length of 1 ms).

The algorithm is implemented in Python and runs on a graphics processing unit (GPU) using PyCUDA. A single iteration of the algorithm on a 1-megapixel image takes about 35 ms on a GeForce GTX 780 GPU. Processing the images with two RGB iterations, which suffices for rough alignment, takes 82 ms, including the overhead of transferring images from the host to GPU memory. In principle this enables reconstruction at 12 Hz. The speed could be improved further by porting the code to a compiled language such as C++, or by using a faster graphics card.
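
The published implementation uses PyCUDA; as a rough, hedged illustration of how the propagation step maps onto a GPU, the same transfer-function propagator can be written with CuPy (a different library, chosen here only because its NumPy-like interface keeps the example short).

```python
import cupy as cp  # assumption: CuPy as a stand-in for the paper's PyCUDA code

def propagate_gpu(field, wavelength, z, pixel_size):
    """Fresnel transfer-function propagation executed entirely on the GPU."""
    ny, nx = field.shape
    fx = cp.fft.fftfreq(nx, d=pixel_size)
    fy = cp.fft.fftfreq(ny, d=pixel_size)
    FX, FY = cp.meshgrid(fx, fy)
    H = cp.exp(-1j * cp.pi * wavelength * z * (FX**2 + FY**2))
    return cp.fft.ifft2(cp.fft.fft2(field) * H)

# Typical flow: copy the measured channels to the GPU once, iterate there,
# and copy only the final reconstruction back to the host.
# measured_gpu = cp.asarray(measured_channel)   # host -> device
# ...iterate using propagate_gpu...
# result = cp.asnumpy(obj_gpu)                  # device -> host
```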

4. Moving USAF test target and beads in a flow cell

In our first experiment we image a fast-moving target at high resolution. We mounted a USAF test target on a piezo stage and moved it sinusoidally at 0.5 Hz with an amplitude of 100 μm. To achieve a high resolution we placed the fiber tip close to the sample, so that it is illuminated with a curved wavefront that projects an enlarged diffraction pattern onto the camera. In Fig. 2 we show two frames from the video, which was captured at 18.5 Hz. From these images it is clear that the movement does not blur the images significantly, and that high-resolution images can be rendered at video rate with a lensless setup.


Fig. 2 High speed refocusable lensless imaging of a moving USAF test target. (a), (b) Single shot images at different times. (c), (d) Reconstructed images at different times. In (c), the smallest features of group seven, 2.19 μm wide, are clearly visible. The scale bars are 50 μm wide. See Media 1 for the reconstructed video.


The ability to refocus on moving particles is demonstrated using beads passing through a flow cell. The height of this flow cell is 1 mm, which exceeds the depth of field of a single refocused image by more than an order of magnitude. In Fig. 3(a), five different time frames are shown at a single focal distance. In Fig. 3(b), we then show five different focal distances at a single time frame. This shows that our method enables a full two-dimensional scan over focal distance and time.
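
Numerical refocusing itself is a simple post-processing step: once a complex field has been reconstructed for one frame, it can be propagated to a stack of planes. A minimal sketch (illustrative distances, same propagator as above):

```python
import numpy as np

def propagate(field, wavelength, z, pixel_size):
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    return np.fft.ifft2(np.fft.fft2(field) *
                        np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2)))

def focal_stack(obj, wavelength, offsets, pixel_size):
    """Refocus one reconstructed frame over a range of defocus offsets,
    producing the kind of focus scan shown in Fig. 3(b)."""
    return [np.abs(propagate(obj, wavelength, dz, pixel_size))**2
            for dz in offsets]

# Example (assumed numbers): scan +/-0.5 mm around the reconstruction plane.
# stack = focal_stack(obj, 519e-9, np.linspace(-0.5e-3, 0.5e-3, 11), 2.2e-6)
```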


Fig. 3 Beads with a diameter of 20 μm in a flow cell. (a) Images at different times at a single focal distance. (b) Images at different focal distances at a single point in time. The indicated focal distances are the numbers that have been used in the phase retrieval algorithm, which correspond to the geometric distance between object and camera scaled by a factor (1 + d/R) because of the finite wavefront curvature (see Discussion section for details). Media 2 shows a video of the moving beads with refocusing at a single time frame.


5. Refocusable imaging of C. elegans

To demonstrate our method in a more complex environment, and make use of the refocusing capabilities, we used a sample of C. elegans swimming in a solution, shown in Fig. 4. The single-shot measurement is shown in Fig. 4(a), where the sample is illuminated with three lasers at the same time. In Fig. 4(b), a reconstruction using data obtained with the three wavelengths is shown. In this image, several reconstruction artifacts are visible. Figure 4(c) shows a reconstruction from the same data set, but now using only the red and green wavelength channels.


Fig. 4 (a) Single shot diffraction pattern of C. elegans. (b) Reconstructed intensity image using data at three wavelengths as input for the reconstruction algorithm. (c) Reconstructed intensity image using only the red and green wavelength channels. (d) Reconstructed phase image at the same focal distance as (c). (e) Intensity image at a different focal plane, using the red and green wavelength channels for reconstruction. (f) Phase image at the same focal distance as (e).


This two-wavelength reconstruction yields a better image than the three-wavelength result, which indicates that the assumption of a small wavelength dependence of the complex refractive index is invalid for this sample. Figure 4(d) shows the reconstructed phase image for the two-wavelength reconstruction. In Figs. 4(c) and 4(d) it can be seen that not all worms are in focus; Figs. 4(e) and 4(f) show an amplitude and phase reconstruction at a different focal plane. The obtained phase reconstructions provide quantitative information on the optical thickness of the sample, which is made especially clear by the worm at the top of the image that is folded back onto itself. With this object, a whole refocusable movie has been captured; a video using the two-wavelength reconstruction and the focal distance of Fig. 4(c) is shown in Media 3.

6. Discussion

As shown in the previous sections, diffraction imaging using just a single-mode fiber as a light source and an RGB chip for detection works very well. However, there are a few aspects that need closer inspection. One issue is that the technique requires coherent sources, which also means that diffraction patterns are visible from elements in the image that are not in focus. As a result, anything outside the focal plane will lead to clearly visible spurious diffraction rings in the reconstructed image.

Another potential issue is related to the use of an RGB-detector: pixels of different color are positioned next to each other and therefore detect diffraction patterns that are slightly shifted on the scale of a pixel. Furthermore, the different colored pixels have a significant detection efficiency for more than one source wavelength. However, we have seen no problems with reconstructions down to the smallest structures of our USAF test target, which have a width of 2.19 μm. When the detected images in the different RGB channels differ significantly because of effects other than propagation, such as a color-dependent absorption or refractive index, the phase reconstruction may fail. In this case, two out of the three colors can be used to limit these effects.

Since our microscope geometry involves a curved wavefront for illumination, this curvature should be taken into account in the reconstruction algorithm. An interesting property that we find for the propagated diffraction patterns is that a given diffraction pattern can either result from an object illuminated with a curved wavefront, or from a scaled version of the object located at a different distance. We analyzed this aspect using ray matrices [14], which take the form:

$$\begin{pmatrix} x_2 \\ x_2' \end{pmatrix} = \begin{pmatrix} A & B \\ C & D \end{pmatrix} \begin{pmatrix} x_1 \\ x_1' \end{pmatrix},$$
where x1 and x1′ are the input ray's position and slope, respectively, and x2 and x2′ are the corresponding outputs of the optical system. If we use a wavefront with radius of curvature R and propagate over a distance d, we get the following matrix equation:
$$\begin{pmatrix} 1 & d \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ \tfrac{1}{R} & 1 \end{pmatrix} = \begin{pmatrix} 1 + \tfrac{d}{R} & d \\ \tfrac{1}{R} & 1 \end{pmatrix}.$$
This situation is illustrated in Fig. 5(a). We can instead consider a parallel beam of the same size by replacing the wavefront curvature with an infinitesimally thin telescope, that is, a matrix with A = 1 + d/R, D = 1/(1 + d/R), and B = C = 0. If we then propagate over a distance d(1 + d/R):
$$\begin{pmatrix} 1 & d\left(1 + \tfrac{d}{R}\right) \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 + \tfrac{d}{R} & 0 \\ 0 & \left(1 + \tfrac{d}{R}\right)^{-1} \end{pmatrix} = \begin{pmatrix} 1 + \tfrac{d}{R} & d \\ 0 & \left(1 + \tfrac{d}{R}\right)^{-1} \end{pmatrix},$$
we get the same A and B terms in the final matrix, i.e. the same intensity distribution of the beam at the detector plane, as illustrated in Fig. 5(b). Because of this similarity, the reconstruction will often converge to the latter situation, even though the former describes the experiment. This still yields the correct image, but the true scale needs to be determined from the experimental geometry or a calibration measurement. In practice, we therefore calibrate this scaling factor (1 + d/R) from the recorded images, as this is equivalent to determining the wavefront curvature but more convenient to implement numerically.
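
The equivalence of the two systems is easy to verify numerically; the following check (with arbitrary example values for d and R) confirms that the two matrix products share the same A and B elements:

```python
import numpy as np

d, R = 5e-3, 1.5e-3                 # example propagation distance and curvature
M = 1 + d / R                       # the scaling factor calibrated in practice

curved = np.array([[1, d], [0, 1]]) @ np.array([[1, 0], [1 / R, 1]])
scaled = np.array([[1, d * M], [0, 1]]) @ np.array([[M, 0], [0, 1 / M]])

# Top rows (A and B elements) agree, so both geometries give the same
# intensity at the detector; only the C and D elements differ.
print(np.allclose(curved[0], scaled[0]))  # True
print(curved[1], scaled[1])               # bottom rows differ
```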


Fig. 5 (a) Illumination of a sample of size L with a wavefront with a radius of curvature R and propagation by a distance d, which will have the same intensity distribution as (b), where the sample is illuminated by a flat wavefront and the sample size and propagation distance are magnified by (1+d/R).


The known size of a sample can be used to calculate the magnification of the objects in focus. For the beads shown in Fig. 3 the magnification depends on the distance of the bead to the camera: because of the thickness of the flow cell, individual beads need to be refocused using different propagation distances. This leads to a higher magnification for beads that are further away from the camera. In this experiment, beads are observed with a magnification ranging from 4.1 to 4.4.
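
For reference, the dependence of the magnification on bead depth follows directly from the scaling factor 1 + d/R. The distances in the short example below are hypothetical values chosen only to land in the reported range of 4.1 to 4.4, since the exact experimental geometry is not stated.

```python
# Hypothetical geometry: fiber tip to detector 5.0 mm; two bead depths.
fiber_to_detector = 5.0                 # mm (assumed)
for R in (1.22, 1.14):                  # mm, fiber-to-bead distance (assumed)
    d = fiber_to_detector - R           # bead-to-detector distance
    print(f"R = {R} mm  ->  magnification {1 + d / R:.1f}")
# -> about 4.1 for the bead closer to the camera, 4.4 for the bead further away
```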

7. Conclusion

We have demonstrated a compact lensless microscope, employing a bare RGB-detector and three illumination wavelengths delivered through a single-mode fiber, which can generate refocusable video images of complex objects with better than 2.2 μm resolution at video frame rates. The exposure time is only limited by the intensity of the illuminating source, which makes it possible to image moving samples and live C. elegans without motion-induced artifacts. These refocusable video setups can be very useful when size constraints are an issue or when many cameras are to be used simultaneously, a situation that is often encountered in high-throughput imaging applications such as imaging flow cytometry. The present lensless RGB microscope can provide an alignment-free, cost-effective solution for various applications in life science and medical diagnostics.

Acknowledgments

This work is supported by the Foundation for Fundamental Research on Matter (FOM), which is part of the Netherlands Organisation for Scientific Research (NWO).

References and links

1. L. Tian, J. Wang, and L. Waller, “3D differential phase-contrast microscopy with computational illumination using an LED array,” Opt. Lett. 39, 1326–1329 (2014).

2. I. Sencan, A. F. Coskun, U. Sikora, and A. Ozcan, “Spectral demultiplexing in holographic and fluorescent on-chip microscopy,” Sci. Rep. 4, 3760 (2014).

3. G. Zheng, S. A. Lee, Y. Antebi, M. B. Elowitz, and C. Yang, “The ePetri dish, an on-chip cell imaging platform based on subpixel perspective sweeping microscopy (SPSM),” Proc. Natl. Acad. Sci. USA 108, 16889–16894 (2011).

4. W. Bishara, T.-W. Su, A. F. Coskun, and A. Ozcan, “Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution,” Opt. Express 18, 11181–11191 (2010).

5. O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10, 1417–1428 (2010).

6. D. W. E. Noom, K. S. E. Eikema, and S. Witte, “Lensless phase contrast microscopy based on multiwavelength Fresnel diffraction,” Opt. Lett. 39, 193–196 (2014).

7. E. Cuche, F. Bevilacqua, and C. Depeursinge, “Digital holography for quantitative phase-contrast imaging,” Opt. Lett. 24, 291–293 (1999).

8. M. Reed Teague, “Deterministic phase retrieval: a Green’s function solution,” J. Opt. Soc. Am. 73, 1434–1441 (1983).

9. J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. 21, 2758–2769 (1982).

10. L. Allen and M. Oxley, “Phase retrieval from series of images obtained by defocus variation,” Opt. Commun. 199, 65–75 (2001).

11. A. Greenbaum and A. Ozcan, “Maskless imaging of dense samples using pixel super-resolution based multi-height lensfree on-chip microscopy,” Opt. Express 20, 3129–3143 (2012).

12. T. Gureyev, S. Mayo, S. Wilkins, D. Paganin, and A. Stevenson, “Quantitative in-line phase-contrast imaging with multienergy X rays,” Phys. Rev. Lett. 86, 5827–5830 (2001).

13. K. A. Nugent, “X-ray noninterferometric phase imaging: a unified picture,” J. Opt. Soc. Am. A 24, 536 (2007).

14. H. Kogelnik and T. Li, “Laser beams and resonators,” Appl. Opt. 5, 1550–1567 (1966).

Supplementary Material (3)

Media 1: MP4 (2014 KB)     
Media 2: MP4 (2023 KB)     
Media 3: MP4 (6800 KB)     
