Simultaneous two-color imaging in digital holographic microscopy

Abstract

We demonstrate the use of two-color digital holographic microscopy (DHM) for imaging microbiological subjects. The use of two wavelengths significantly reduces artifacts present in the reconstructed data, allowing us to image weakly-scattering objects in close proximity to strongly-scattering objects. We demonstrate this by reconstructing the shape of the flagellum of a unicellular eukaryotic parasite Leishmania mexicana in close proximity to a more strongly-scattering cell body. Our approach also yields a reduction of approximately one third in the axial position uncertainty when tracking the motion of swimming cells at low magnification, which we demonstrate with a sample of Escherichia coli bacteria mixed with polystyrene beads. The two-wavelength system that we describe introduces minimal additional complexity into the optical system, and provides significant benefits.

© 2017 Optical Society of America

1. Introduction

Digital holography [1, 2] has found a rich variety of applications in microscopy [3, 4], particularly in microbiology, where it allows fast three-dimensional imaging at spatial resolutions close to the diffraction limit. The morphology and behavior of subjects such as marine diatoms [5,6], sperm and blood cells [7–12], as well as other single-celled eukaryotes [13–18] and bacteria [19–25] have been examined, revealing the refractive index, structure and swimming patterns of these microorganisms. Digital holographic microscopy (DHM) has important advantages over comparable techniques such as confocal microscopy due to the very rapid frame rates that are possible (limited only by the speed of the camera), and the extended depth of field that holography allows, which is ultimately limited by the coherence length of the light source and the specifications of the image sensor [4]. The performance of the latter is principally governed by the number of pixels and the physical pixel size, although for weakly-scattering subjects, the sensitivity of the camera and various sources of noise can become dominant. Readout noise in the detector electronics, and ultimately photon shot noise [26], provide fundamental limits on the signal-to-noise ratio (SNR) in DHM. Cooling CCD or CMOS detectors can go some way towards alleviating electronic noise sources: many modern cameras optimized for collecting weak signals from fluorescently-labeled biological specimens are equipped with this feature. The drawback of such devices is that sensitivity is typically achieved at the expense of speed, although frame rates of 100 Hz or more at resolutions above one megapixel are now fairly standard. Photon shot noise is due to the statistical fluctuations in the number of photons detected, and results in an SNR that scales as √N, where N is the number of photons collected. A typical way to overcome photon shot noise is simply to increase the illumination level, although this is problematic in biological studies where high incident intensity can damage cells, or cause a change in behavior, particularly when short wavelengths are used [22].

Once a holographic image has been acquired, it can be treated in a number of different ways in order to extract useful information about a sample. Several authors [7, 27, 28] have used DHM as a tool to provide quantitative phase images, providing a map of a sample’s refractive index and inferring (for example) the height of structures along the optical axis. The use of dual wavelengths in the specific case of quantitative phase imaging has been shown to increase the vertical measurement range without phase ambiguity [29–31]. Here, we focus on a fundamentally different use of two-color DHM: imaging the position, orientation and shape of an object within a sample volume. Information can be extracted from holographic data either by fitting the images using known solutions to Maxwell’s equations, e.g. Mie scattering solutions for scattering by spheres [32, 33], or by a model-free approach such as Rayleigh-Sommerfeld back-propagation [34]. The latter is ‘model-free’ in the sense that it makes no a priori assumptions about the shape of the object being imaged. Instead, this reconstruction method models each pixel in the original image as a point source of waves, and computes the structure of the resulting optical field at some distance from the original image [2,5,34].
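
As an illustration of this back-propagation step, the short Python sketch below refocuses a background-normalized hologram to a chosen distance using the angular-spectrum form of the Rayleigh-Sommerfeld propagator. It is not the implementation used in this work; the function name, the unit-amplitude background assumption, and the refractive index of the medium are illustrative choices.

    import numpy as np

    def rs_backpropagate(holo, z, wavelength, dx, n_medium=1.33):
        """Refocus a background-normalized hologram to a distance z from the
        focal plane using the angular-spectrum (Rayleigh-Sommerfeld) propagator.
        All lengths (z, wavelength, dx) share the same units, e.g. microns."""
        ny, nx = holo.shape
        k = 2.0 * np.pi * n_medium / wavelength          # wavenumber in the medium
        kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)      # transverse wavevector components
        ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dx)
        KX, KY = np.meshgrid(kx, ky)
        kz_sq = k**2 - KX**2 - KY**2
        kz = np.sqrt(np.maximum(kz_sq, 0.0))
        H = np.exp(1j * kz * z) * (kz_sq > 0)            # drop evanescent components
        # Propagate the scattered part of the field (hologram minus unit background);
        # the sign convention of z depends on the imaging geometry.
        return np.fft.ifft2(np.fft.fft2(holo - 1.0) * H)

    # Example: a stack of refocused intensity planes in 2 micron steps
    # (illustrative values for the 20x data, ~0.7 micron/pixel):
    # stack = np.stack([np.abs(1.0 + rs_backpropagate(holo, z, 0.642, 0.7))**2
    #                   for z in np.arange(2.0, 302.0, 2.0)])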

Multiple LEDs have been used in a tilted in-line configuration to image single bacterial cells in a previous study [35]. There, the authors used the tilted illumination to provide different holographic projections of objects even when they were closely spaced in the axial direction. The advantages of this elegant approach were demonstrated in a follow-up work exploring wall entrapment of swimming bacteria [25]. Our study differs from previous work in two important ways. Firstly, in terms of apparatus, our approach removes the need for a color imaging sensor and simplifies the optical alignment through the use of a wavelength division multiplexer (WDM). Secondly, our feature extraction approach allows us to image extended, flexible objects in three dimensions, even in close proximity to more strongly scattering objects.

In a typical DHM experiment, the optical field is reconstructed at a number of distances from the original focal plane, yielding a stack of images. The final challenge from an optical point of view is to extract useful information from the resulting series of two-dimensional slices through the reconstructed field. A range of object localization strategies have been suggested for DHM, including those based on ‘autofocusing’ techniques. These determine an object’s location along the optical axis, by maximizing image sharpness in terms of spatial frequencies, local gradients or curvature [36, 37]. We use a method based on maximizing the local axial intensity gradient, which has proved useful for faithfully reconstructing extended objects in three dimensions [17]. This approach has limitations in that the processing stages can introduce artifacts close to the object of interest that complicate feature extraction (see results section). The problem is less important in situations where the object of interest has approximately homogeneous refractive index, but more challenging in situations where scatterers of different strengths are placed in close proximity. In this paper, we demonstrate how a two-color DHM setup can be used to tackle this issue when imaging the single-celled microorganism Leishmania mexicana. This species is a pathogen that is transmitted by sandfly bites. The cells are motile, and swim with the aid of a narrow, whip-like organelle known as a flagellum at the leading end of the more substantial cell body. The flagellum is of order 10 μm in length when a cell is motile, but this varies significantly over the course of the life cycle, which also includes non-motile stages [38]. The diameter of the flagellum varies much less, and is roughly 400 nm [39] during motile stages. It beats in a roughly sinusoidal fashion, pulling the cell through the surrounding medium. The flagellum is known to be important for lifecycle progression, providing a tempting target for novel therapies [40].

2. Experimental setup

Figure 1 shows the schematic layout of our two-color DHM setup. The illumination was provided by two fiber-coupled diode lasers with wavelengths λgreen = 515 nm and λred = 642 nm. These wavelengths were chosen such that they are within the visible range, for ease of alignment and manipulation with standard optical components; and not integer multiples of each other, which could cause the reinforcement of undesirable artifacts in the images. A WDM was used to couple the output of the lasers into a single fiber, ensuring that the illumination path was the same for both wavelengths. The system was implemented on an inverted Nikon Ti microscope, removing the standard (dry) condenser lens and replacing it with a custom holder compatible with ‘cage-mount’ optical components. The fiber carrying the laser illumination was held in the cage-mount system and directed down onto the sample from a distance of approximately 10 cm. Illuminating from this distance allowed the central, dominant fiber mode to significantly overfill the microscope’s field of view, leading to an approximately uniform intensity in the image plane. In this work, we used two objective lenses: a 20× lens (air, NA 0.50) for imaging and tracking a mixture of Escherichia coli cells and polystyrene beads; and a 60× lens (oil immersion, NA 1.40) for imaging the flagellum of L. mexicana cells. A tube lens offering unit magnification was used to focus the image onto a camera.

Fig. 1 Optical layout for our simplified dual wavelength setup. A wavelength division multiplexer (WDM) is used to couple the two illumination wavelengths into a single fiber, which is directed onto a free-space coupler (FSC). An objective lens (OL) and tube lens (TL) produce an image of the sample at the microscope image plane, indicated as a vertical dashed line. The DualView apparatus (DV) images this plane with unit magnification onto a CMOS camera, spatially separating images created by the two illumination sources (see text).

A ‘DualView2’ two-channel imaging system with unit magnification (Photometrics DV2 model) was used to laterally separate the images formed by the different wavelengths on a single (monochrome) CMOS sensor. The critical component in our DualView2 unit was a long-pass dichroic mirror (560 nm cut-off), which separated the image paths. This allowed the sample to be imaged in both wavelengths simultaneously, avoiding complications associated with chopping the beam or separating images from a color camera with wavelength-dependent spatial sampling frequency (e.g. the red/green channels from a Bayer-patterned color filter [41–43]). Although this component has benefits for allowing simultaneous imaging, it introduces a slight barrel distortion, necessitating an additional image registration step in the data processing.

The CMOS-based camera was a Mikrotron MC-1362 model configured to acquire an image measuring 936×900 pixels, a subset of the full 1280×1024 pixel sensor. The pixel spacing was 14 μm, yielding a final spatial sampling frequency (including optical magnification) of 1.42 or 4.29 pixels/μm, for 20× or 60× lenses, respectively. The camera was triggered at a frame rate of 50 Hz, with a 150 μs exposure using a ‘global shutter’ operation to ensure that all regions of the sensor were exposed simultaneously.

Two types of sample were imaged. The first was a mixed suspension of 0.5 μm diameter polystyrene spheres and E. coli bacteria. We grew the bacterial cells (strain HCB1 [44]) in tryptone broth to an optical density of 0.21, measured at λ = 600 nm in a spectrophotometer. These cells were diluted 1:100 for the experiments, to avoid multiple scattering in the sample volume. Sample chambers were constructed from glass slides and UV curing glue, creating a thin capillary channel measuring approximately 5 mm × 20 mm × 250 μm. The sample chambers were filled by capillary action and imaged immediately using a 20 × objective lens to track the cells. The microscope’s focal plane was located approximately 50 μm below the lower surface of the sample chamber (see below) and 1000 video frames were acquired. The second sample was a suspension of L. mexicana (strain M379) procyclic promastigote cells, grown in M199 media and harvested in mid logarithmic phase at concentrations of approximately 5 × 106 cells/mL. This stock sample was diluted by a factor of 100 into fresh medium for visualization. These were imaged using a 60× objective lens in order to resolve the thin, whip-like flagellum that extends from each cell body. In both samples, the movies were saved in an uncompressed format for post-processing.

3. Image registration

After dividing each raw movie into separate red and green sub-movies, the images must be ‘registered’ by transforming one set onto the other, correcting the slight barrel distortion mentioned above. Once this mapping has been established, it can be applied to all frames in a video sequence, ensuring that cells appear at the same pixel locations in both wavelengths.

A ‘background’ image Imed (x, y) was obtained from each sub-movie by finding the median pixel value at each location (x, y). Dividing the pixel values in each image in a stack by their background values removed many of the heterogeneities in the background illumination [34]. Example data are shown in Fig. 2, in which a mixture of E. coli bacteria and 0.5 μm diameter polystyrene beads were imaged. Figures 2(c) and (d) show the effect of dividing the raw data by the background image. To reconstruct the three dimensional optical field within the sample at each time point, Iim(x, y, z; t), we used Rayleigh-Sommerfeld back-propagation [34] to generate stacks of 150 refocused images from each frame, in steps of 2 μm from the focal plane (a smaller step size of 0.233 μm was used in the flagellar reconstruction experiments). The calculation of each image plane involved a multiplication in Fourier space, so we took advantage of this by applying a spatial bandpass filter at the same time, passing features between 1 and 30 pixels in size. To identify the objects within Iim(x, y, z; t), we implemented an object localization scheme based on the Gouy phase anomaly, as described elsewhere [45]. Briefly, this approach identifies regions where ∂Iim(x, y, z; t)/∂z is an extremum, as these locations are associated with weakly-scattering objects. Modern high-level programming languages (such as LabVIEW, in which we write our analysis code) offer optimized three-dimensional fast Fourier transforms, which we exploit in this work. We construct an ‘intensity gradient stack’ by the following convolution:

$$ I_g(x, y, z; t) = I_{\mathrm{im}}(x, y, z; t) \otimes S_z(x, y, z), $$
where the Sobel-type gradient operator Sz (x, y, z) is the 3×3×3 array [46] given by:
$$ S_z(x,y,0) = \begin{pmatrix} -1 & -2 & -1 \\ -2 & -4 & -2 \\ -1 & -2 & -1 \end{pmatrix}, \qquad S_z(x,y,1) = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \qquad S_z(x,y,2) = \begin{pmatrix} 1 & 2 & 1 \\ 2 & 4 & 2 \\ 1 & 2 & 1 \end{pmatrix}. $$
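
A minimal numerical sketch of this convolution is given below, assuming the reconstructed stack is held as a NumPy array indexed [z, y, x]. The function names and array layout are illustrative, and scipy.ndimage.convolve stands in for whichever (e.g. FFT-based) convolution routine is used in practice.

    import numpy as np
    from scipy.ndimage import convolve

    def axial_sobel_kernel():
        """Build the 3x3x3 Sobel-type operator S_z, differencing along z."""
        plane = np.array([[1.0, 2.0, 1.0],
                          [2.0, 4.0, 2.0],
                          [1.0, 2.0, 1.0]])
        S = np.zeros((3, 3, 3))
        S[0] = -plane        # z = 0 plane (negative weights)
        S[2] = +plane        # z = 2 plane; the middle plane stays zero
        return S

    def gradient_stack(I_im):
        """Convolve a reconstructed intensity stack I_im[z, y, x] with S_z
        to obtain the intensity gradient stack I_g[z, y, x]."""
        return convolve(I_im, axial_sobel_kernel(), mode='nearest')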

Fig. 2 Example two-color holographic images of a mixture of E. coli cells and polystyrene beads (see text), showing intermediate processing steps. Panels (a) and (b) show raw holographic data. The images in panels (c) and (d) show the same data, but with the static background removed (see text). The bottom row shows the maximum intensity projection of the intensity gradient stacks Igrad (x, y, z; t); this data can be used either for image registration or for object localization. The scale bars represent 25 μm in all panels.

Applying a suitable threshold to the projected image stack resulted in a series of bright points near the centers of the objects (data not shown) that were used for image registration. We used a B-spline-based ‘Unwarp’ registration method [47] implemented in Fiji [48] to determine the optimum transformation to map the green data onto the red. Figure 3 shows example data from the red and green channels, superimposed and colored to reflect the channels. The green spots are slightly shifted to the right of the red ones, and the spots are further apart on the right-hand side of the image, indicating a barrel distortion. We note that the magnitude of the distortion is relatively small: the images are shifted by approximately 2.8 μm, and stretched by a further 2.5 μm, smoothly across an image of around 240 μm, corresponding to a cumulative deformation of ∼ 1.5%. Once the registration transformation had been calculated, we applied the same transformation to the raw (green) image stack, and continued with the rest of the feature extraction procedure. We found image registration in the axial direction to be unnecessary, as the alignment was already very good provided the images were registered in the lateral directions before reconstruction.
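
The registration itself was performed with the B-spline ‘Unwarp’ plugin in Fiji. Purely as an illustration of the idea, the sketch below estimates a low-order polynomial warp from matched bright-spot centroids and applies it to the green channel using scikit-image; the function name, the second-order polynomial model and the interpolation order are assumptions, not the settings used in this work.

    import numpy as np
    from skimage import transform

    def register_green_to_red(green_image, red_pts, green_pts):
        """Warp the green-channel image onto the red channel's coordinate frame.

        red_pts, green_pts: matched (N, 2) arrays of (x, y) bright-spot centroids
        from the thresholded gradient projections. A low-order polynomial warp
        stands in here for the B-spline registration performed in Fiji."""
        tform = transform.PolynomialTransform()
        # warp() expects a map from output (red-frame) coordinates back to
        # input (green-frame) coordinates, so fit the red -> green mapping.
        tform.estimate(red_pts, green_pts)
        return transform.warp(green_image, tform, order=3, preserve_range=True)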

Fig. 3 Combining images of a mixture of E. coli cells and polystyrene beads (see text). (a) A superposition of images from red and green channels. The green image is shifted to the right, and suffers a slight stretching in the same direction. (b) A superposition of the red channel with the transformed green channel. The scale bar in panel (b) represents 30 μm. The scale in panel (a) is approximately the same, although red and green channels differ in scale by approximately 1.5% (see text).

4. Object localization

After registration, we crop the image stacks to a region of interest of 256 × 256 pixels, in order to optimize the FFT operations. To minimize the total number of calculations, the Rayleigh-Sommerfeld propagator arrays (including bandpass filter) for each defocus distance, along with the three-dimensional array used to perform the axial intensity gradient operation, are calculated in advance and then applied to each frame in the video sequence. This resulted in a 256×256×150 element (voxel) array, Igrad(x, y, z; t), generated from each movie frame. After calculating Igrad (x, y, z; t) for both red and green channels, we multiply the arrays element-wise to arrive at a ‘joint’ gradient image stack Ijoint (x, y, z; t). This operation eliminates artifacts associated with the object localization technique as outlined below.

Working with relatively dilute samples allows us to project Ijoint (x, y, z) down onto a 2D plane, then use a standard ‘particle identification’ routine to locate the (x, y) centroid position of cells in our sample volume [36]. Example frames, taken from both color channels, are shown in Fig. 2, in which the cells are seen as the bright objects. The cells’ axial positions are then calculated from the full three-dimensional gradient stack by summing a 5×5 pixel region around each cell’s (x, y) centroid position in each z plane and locating the axial point of maximum intensity. While the position of objects in a fully three-dimensional gradient stack can be extracted, the computations are much faster when objects are identified first in two dimensions. The coordinates of the object(s) of interest are identified in each frame, and exported to text files, one per frame of video. These measurements of position are then combined across multiple frames in order to reconstruct the trajectories of cells.
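
The sketch below illustrates this two-stage localization on a joint gradient stack (the element-wise product of the red and green gradient stacks), using scikit-image for the 2D blob detection. The threshold, array layout, uniform 2 μm axial step and absence of edge handling are illustrative simplifications rather than the routines actually used.

    import numpy as np
    from skimage.measure import label, regionprops

    def localize_cells(I_joint, threshold, z_step=2.0, half_width=2):
        """Locate cells in a joint gradient stack I_joint[z, y, x],
        e.g. I_joint = I_grad_red * I_grad_green (element-wise).

        Returns a list of (x, y, z) positions: x, y in pixels, z in the
        same units as z_step. No bounds checking near the image edges."""
        # Stage 1: project axially and find bright blobs in two dimensions.
        projection = I_joint.max(axis=0)
        regions = regionprops(label(projection > threshold))
        positions = []
        for region in regions:
            cy, cx = region.centroid                    # (row, col) centroid
            yi, xi = int(round(cy)), int(round(cx))
            # Stage 2: sum a 5x5 window around the centroid in every z plane
            # and take the plane with the maximum summed intensity.
            window = I_joint[:, yi - half_width:yi + half_width + 1,
                                xi - half_width:xi + half_width + 1]
            z_index = int(window.sum(axis=(1, 2)).argmax())
            positions.append((cx, cy, z_index * z_step))
        return positions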

5. Results

Figure 4 shows maximum intensity projections of our gradient image stacks, from red and green channels individually as well as the joint data. The pixel values in Figs. 4(a–c) are shown with a linear mapping from black for the minimum value, to white for the maximum value. The light blue lines in Fig. 4 represent line profiles of pixel values, which have been plotted in Fig. 4(d). The line profiles trace across the background and a ‘bright’ object of interest. To illustrate the improvement in noise characteristics, the data have been fitted with Gaussian curves, including an offset, and normalized so that the maximum of each curve is 1. This shows the significantly lower background level in the case when both wavelengths are combined (blue circles). Note that the vertical axis in panel 4(d) is on a log scale. The SNRs in this particular case are 11.2, 14.1 and 18.9 for the red, green, and combined cases, respectively.
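
To make the fitting step concrete, the sketch below fits a Gaussian plus offset to a one-dimensional line profile with scipy and forms an SNR estimate. The exact SNR definition behind the quoted values is not stated in the text, so the choice used here (fitted peak amplitude divided by the standard deviation of the off-peak background) is an assumption.

    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian_with_offset(x, amplitude, center, sigma, offset):
        return amplitude * np.exp(-((x - center) ** 2) / (2.0 * sigma ** 2)) + offset

    def profile_snr(profile):
        """Fit a line profile (1D array of pixel values) and return an SNR
        estimate: fitted peak height over the background standard deviation."""
        x = np.arange(profile.size, dtype=float)
        p0 = [profile.max() - profile.min(), float(profile.argmax()), 3.0, profile.min()]
        popt, _ = curve_fit(gaussian_with_offset, x, profile, p0=p0)
        amplitude, center, sigma, offset = popt
        background = np.abs(x - center) > 3.0 * abs(sigma)   # pixels well away from the peak
        return amplitude / profile[background].std()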

Fig. 4 Signal-to-noise ratio improvements using two-color DHM on a mixture of E. coli cells and polystyrene beads. Panels (a) and (b) show maximum intensity projections of gradient stacks from the red and green channels, respectively. Panel (c) shows the effect of multiplying the channels together. Panels (a–c) are displayed using a linear mapping from black for the minimum value, to white for the maximum value. Panel (d) shows a line profile along the path indicated with a light blue line in panels (a–c). The data in panel (d) have been scaled to have a maximum of 1, and fitted with a Gaussian curve (see text), to demonstrate improvements in the SNR. The scale bars in panels (a–c) represent 25 μm.

Figure 5(a) shows a three-dimensional trajectory of a swimming E. coli cell, acquired using two-color DHM. The track is color-coded to show instantaneous swimming speed, and every fifth point is plotted for the sake of clarity. Within any bacterial population, there will be a spread of swimming speeds. The cell depicted here is among the slower and more erratic swimmers, but this is beneficial for our purposes because the cell remains inside the field of view for more frames, yielding better statistics over a greater range of cell positions, speeds and orientations.

Fig. 5 (a) A three-dimensional reconstruction of the swimming trajectory of a bacterial cell, color-coded with instantaneous swimming speed. The total track duration is 40 seconds. (b) Five seconds of data showing the z-position as a function of time from the red channel only (bottom), the green channel only (middle, offset by 2 μm) and using both channels combined (top, offset by 5 μm). The black lines through the data points show the spline-smoothed track used to characterize the localization noise (see text). (c–e) Histograms of residuals when a spline fit is removed from the three-dimensional trajectory shown in (a). The one- and two-color systems achieve similar accuracy in the plane normal to the optical axis (Δx and Δy), but the two-color method reduces uncertainty in the axial coordinate (Δz), shown by the narrower histogram in panel (e).

To quantify the amount of noise in each cell trajectory, we examine the deviations from a smoothed version of the track. Movement in each orthogonal direction was examined independently, and fit piecewise using cubic splines. The cells are undergoing Brownian motion, which will add some uncertainty in the absolute position measurement, but for an object the size of our cells (∼1 μm) in water, the displacement between frames due to Brownian motion (∼100 nm in 0.02 s) is much smaller than that due to swimming, so we assume that the high-frequency noise around the smoothed trajectory is due to the detection system only. Example data series (points) and fits (lines) are shown in Fig. 5(b) for one- and two-color DHM, for a five-second subset of data shown in Fig. 5(a). The data show the cell’s position along the optical axis z(t), omitting every second data point. Furthermore, the data series have been offset in the vertical direction as indicated, for clarity.
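
As a sketch of this noise estimate, the code below smooths one coordinate of a track with a cubic smoothing spline (scipy's UnivariateSpline, standing in for the piecewise cubic fit described above) and returns the residuals; the final comment reproduces the order-of-magnitude Brownian-displacement argument. The smoothing parameter is an assumption.

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    def track_residuals(t, z, smoothing=None):
        """Fit a cubic smoothing spline to one coordinate of a trajectory
        (t in seconds, z in microns) and return the residuals and their
        standard deviation, used as a proxy for localization noise."""
        spline = UnivariateSpline(t, z, k=3, s=smoothing)
        residuals = z - spline(t)
        return residuals, residuals.std()

    # Brownian sanity check for a ~1 micron cell (radius ~0.5 micron) in water:
    # D = kT / (6 pi eta r) ~ 0.4 um^2/s, so the rms displacement in a 0.02 s
    # frame interval is sqrt(2 D t) ~ 0.1 um per axis, small compared with swimming.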

The data are more scattered around the fit line in the case of one-color DHM. To quantify this further, we examined the distribution of residuals, i.e. the differences between the data and the spline fit across the whole 40-second trajectory. Figures 5(c–e) show histograms of these residuals (points), which are symmetrical and well fitted in all cases by Gaussian distribution functions (thick lines). The widths of the distributions are almost identical in x and y, but the standard deviation in the z direction is reduced by around one third when both colors are used. For clarity, the green channel is omitted from the histograms; of the two single-color channels, the red gave the better performance and therefore provides the more demanding comparison with the dual-color result.

A key advantage of the two-color holography system is the reduction in artifacts associated with the axial intensity gradient in the reconstructed volume. This aspect is of particular use when the object of interest is composed of weakly- and strongly-scattering parts. To demonstrate this property, we imaged a cell of L. mexicana. To simplify the analysis of extended objects (such as the flagellum) in three dimensions, the image stacks were calculated with axial steps of 0.233 μm, matching the lateral sampling at 60× magnification (14 μm pixel pitch divided by 60), so that the three-dimensional voxels in the reconstructed stack are cubic.

Figures 6(a) and 6(b) show maximum-intensity projections of gradient stacks obtained from L. mexicana cells in the red and green channels, respectively; Fig. 6(c) overlays the two in color, and Fig. 6(d) shows the projection of the combined (‘joint’) stack. The cell can be seen to the top left of the image, highlighted by a light blue circle in Fig. 6(a). The circular features in the center of the image are due to a larger cell somewhat further from the focal plane, which gives rise to characteristic circular artifacts throughout the image. Although the flagellum of the cell (the hair-like structure pointing to the top-right of the image) can be seen by eye in this frame, automatic image segmentation is challenging because of the strength of the artifacts caused by the cell body. The shape of these artifacts, particularly their radius, is ultimately determined by the three-dimensional interference pattern of light within the sample. Crucially, the artifacts are located in different positions when different wavelengths are used to illuminate the sample (Fig. 6(c)), and cancel when the images from the two wavelengths are combined (Fig. 6(d)). The brightness has been enhanced by the same factor in Figs. 6(a)–6(d), for ease of comparison.

Fig. 6 Example data acquired from a subject with heterogeneous scattering properties: a promastigote L. mexicana cell. (a) and (b) show the red and green maximum intensity projection images, respectively (scale bar = 10 μm in each). Panel (c) shows a color image with the red and green channels combined, so that the artifacts can be seen to lie in different positions in each channel. Panel (d) shows the maximum intensity projection of the registered ‘joint’ image (scale bar = 10 μm), and panel (e) shows two orthogonal projections of the cell, demonstrating the three-dimensional reconstruction of the flagellum, the hair-like projection at the top of the cell (scale bar = 2 μm).

Figure 6(e) shows a magnified and re-oriented view of the same cell in both top-down and side-on projections (note that the scale bar represents 2 μm here, and 10 μm in Figs. 6(a)–6(c)), demonstrating the fully three-dimensional shape of the flagellum. The brightness of this image has been increased in order to highlight the flagellum, while saturating the cell body (the true length of the body is approximately 6 μm). Both cell body and flagellum can be located independently by performing successive thresholding operations (data not shown).
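
The successive-thresholding idea can be sketched as follows: a high threshold isolates the bright cell body, which is masked out (after a small dilation) before a lower threshold is applied to recover the weakly-scattering flagellum. The threshold values and dilation size below are illustrative assumptions; the specific values used in this work are not given here.

    import numpy as np
    from scipy.ndimage import binary_dilation

    def segment_body_and_flagellum(I_joint, body_thresh, flagellum_thresh):
        """Separate a bright cell body from a dim flagellum in a joint gradient
        stack I_joint[z, y, x] using two successive thresholds."""
        body = I_joint > body_thresh                      # strongly-scattering cell body
        body_mask = binary_dilation(body, iterations=3)   # pad the body slightly
        flagellum = (I_joint > flagellum_thresh) & ~body_mask
        # Voxel coordinates of the flagellum; skeletonization or a curve fit
        # could follow to extract a one-voxel-wide centerline.
        return body, np.argwhere(flagellum)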

6. Conclusions

We have demonstrated that our implementation of two-color DHM has significant advantages over its one-color counterpart, particularly for optically inhomogeneous objects. We have shown this by imaging a long, flexible object: the flagellum of a freely-swimming cell of the pathogenic microorganism Leishmania mexicana. DHM imaging of this structure is usually hampered by the adjacent cell body, which scatters much more light than the flagellum. Our system is relatively easy to implement, particularly in an ‘in-line’ configuration, where mature technologies such as fiber WDM devices can be used to combine lasers in an existing setup. The use of an adapter that allows the red and green channels to be imaged on the same sensor is advantageous in that it permits shorter exposure times than would be possible with a color sensor, and simplifies the analysis. The adapter introduces small distortions, but these are straightforward to correct using the calibration step described in section 3. We anticipate that this registration protocol may also be of interest to those working in the allied field of quantitative phase imaging. Our main result is that the two-color method can be used to reduce twin-image artifacts, particularly when using localization methods based on the Gouy phase anomaly. In addition, we have demonstrated that the final arrangement reduces the uncertainty in the tracked axial position of single cells by around one third. We contend that the continuing development of DHM, through refinements such as the use of multiple wavelengths or tilted illumination, holds great promise for high-speed, quantitative studies of the dynamics of microorganisms.

Funding

Engineering and Physical Sciences Research Council (EPSRC) (EP/N014731/1); Medical Research Council (MRC) (MR/L00092X/1); Wellcome Trust (WT105502MA).

Acknowledgments

The data underlying this paper can be found at doi:10.15124/5d61ce08-7ec5-4981-9980-fd4cc8a1b578

Disclosures

The authors declare that there are no conflicts of interest related to this article.

References and links

1. J. Goodman and R. W. Lawrence, “Digital image formation from electronically detected holograms,” Appl. Phys. Lett. 11, 77–79 (1967).

2. U. Schnars and W. Jüptner, “Direct recording of holograms by a CCD target and numerical reconstruction,” Appl. Opt. 33, 179–181 (1994).

3. W. S. Haddad, D. Cullen, J. C. Solem, J. W. Longworth, A. McPherson, K. Boyer, and C. K. Rhodes, “Fourier-transform holographic microscope,” Appl. Opt. 31, 4973–4978 (1992).

4. M. Kim, “Principles and techniques of digital holographic microscopy,” SPIE Rev. 1, 018005 (2010).

5. W. Xu, M. H. Jericho, I. A. Meinertzhagen, and H. J. Kreuzer, “Digital in-line holography for biological applications,” Proc. Natl. Acad. Sci. USA 98, 11301–11305 (2001).

6. G. DiCaprio, G. Coppola, L. D. Stefano, M. D. Stefano, A. Antonucci, R. Congestri, and E. D. Tommasi, “Shedding light on diatom photonics by means of digital holography,” J. Biophotonics 7, 341–350 (2012).

7. B. Kemper and G. von Bally, “Digital holographic microscopy for live cell applications and technical inspection,” Appl. Opt. 47, A52–A61 (2008).

8. T. Su, L. Xue, and A. Ozcan, “High-throughput lensfree 3D tracking of human sperms reveals rare statistics of helical trajectories,” Proc. Natl. Acad. Sci. USA 109, 16018–16022 (2012).

9. D. Boss, A. Hoffmann, B. Rappaz, C. Depeursinge, P. J. Magistretti, D. van de Ville, and P. Marquet, “Spatially-resolved eigenmode decomposition of red blood cells membrane fluctuations questions the role of ATP in flickering,” PLoS One 7, e40667 (2012).

10. G. DiCaprio, A. El Mallahi, P. Ferraro, R. Dale, G. Coppola, B. Dale, G. Coppola, and F. Dubois, “4D tracking of clinical seminal samples for quantitative characterization of motility parameters,” Biomed. Opt. Express 5, 690–700 (2014).

11. H. Park, S.-H. Lee, K. Kim, S.-H. Cho, W.-J. Lee, Y. Kim, S.-E. Lee, and Y. Park, “Characterizations of individual mouse red blood cells parasitized by Babesia microti using 3-D holographic microscopy,” Sci. Rep. 5, 10827 (2015).

12. J. F. Jikeli, L. Alvarez, B. M. Friedrich, L. G. Wilson, R. Pascal, R. Colin, M. Pichlo, A. Rennhack, C. Brenker, and U. B. Kaupp, “Sperm navigation along helical paths in 3D chemoattractant landscapes,” Nat. Commun. 6, 7985 (2015).

13. W. Xu, M. H. Jericho, H. J. Kreuzer, and I. A. Meinertzhagen, “Tracking particles in four dimensions with in-line holographic microscopy,” Opt. Lett. 28, 164–166 (2003).

14. J. Garcia-Sucerquia, W. Xu, S. K. Jericho, P. Klages, M. H. Jericho, and H. J. Kreuzer, “Digital in-line holographic microscopy,” Appl. Opt. 45, 836–850 (2006).

15. J. Sheng, E. Malkiel, J. Katz, J. Adolf, R. Belas, and A. R. Place, “Digital holographic microscopy reveals prey-induced changes in swimming behavior of predatory dinoflagellates,” Proc. Natl. Acad. Sci. USA 104, 17512–17517 (2007).

16. A. El Mallahi, C. Minetti, and F. Dubois, “Automated three-dimensional detection and classification of living organisms using digital holographic microscopy with partial spatial coherent source: application to the monitoring of drinking water resources,” Appl. Opt. 52, A68–A80 (2013).

17. L. G. Wilson, L. M. Carter, and S. E. Reece, “High-speed holographic microscopy of malaria parasites reveals ambidextrous flagellar waveforms,” Proc. Natl. Acad. Sci. USA 110, 18769–18774 (2013).

18. K. L. Thornton, R. C. Findlay, P. B. Walrad, and L. G. Wilson, “Investigating the swimming of microbial pathogens using digital holography,” Adv. Exp. Med. Biol. 915, 17–32 (2016).

19. M. H. Jericho, H. J. Kreuzer, M. Kanka, and R. Riesenberg, “Quantitative phase and refractive index measurements with point-source digital in-line holographic microscopy,” Appl. Opt. 51, 1503–1515 (2012).

20. M. Molaei, M. Barry, R. Stocker, and J. Sheng, “Failed escape: solid surfaces prevent tumbling of Escherichia coli,” Phys. Rev. Lett. 113, 068103 (2014).

21. C. B. Giuliano, R. Zhang, and L. G. Wilson, “Digital inline holographic microscopy (DIHM) of weakly-scattering subjects,” J. Vis. Exp. 84, e50488 (2014).

22. F. C. Cheong, C. C. Wong, Y. F. Gao, M. H. Nai, Y. Cui, S. Park, L. J. Kenney, and C. T. Lim, “Rapid, high-throughput tracking of bacterial motility in 3D via phase-contrast holographic video microscopy,” Biophys. J. 108, 1248–1256 (2016).

23. J. L. Nadeau, Y. B. Cho, J. Kühn, and K. Liewer, “Improved tracking and resolution of bacteria in holographic microscopy using dye and fluorescent protein labeling,” Front. Chem. 4, 17 (2016).

24. A. Wang, R. F. Garmann, and V. N. Manoharan, “Tracking E. coli runs and tumbles with scattering solutions and digital holographic microscopy,” Opt. Express 24, 23719–23725 (2017).

25. S. Bianchi, F. Saglimbeni, and R. Di Leonardo, “Holographic imaging reveals the mechanism of wall entrapment in swimming bacteria,” Phys. Rev. X 7, 011010 (2017).

26. F. Charrière, B. Rappaz, J. Kühn, T. Colomb, P. Marquet, and C. Depeursinge, “Influence of shot noise on phase measurement accuracy in digital holographic microscopy,” Opt. Express 15, 8818–8831 (2007).

27. E. Cuche, F. Bevilacqua, and C. Depeursinge, “Digital holography for quantitative phase-contrast imaging,” Opt. Lett. 24, 291–293 (1999).

28. I. Yamaguchi, J.-I. Kato, S. Ohta, and J. Mizuno, “Image formation in phase-shifting digital holography and applications to microscopy,” Appl. Opt. 40, 6177–6186 (2001).

29. J. Kühn, T. Colomb, F. Montfort, F. Charrière, Y. Emery, E. Cuche, P. Marquet, and C. Depeursinge, “Real-time dual-wavelength digital holographic microscopy with a single hologram acquisition,” Opt. Express 15, 7231–7242 (2007).

30. Y. Fu, G. Pedrini, B. M. Hennelly, R. M. Groves, and W. Osten, “Dual-wavelength image-plane digital holography for dynamic measurement,” Opt. Lasers Eng. 47, 552–557 (2009).

31. D. G. Abdelsalam and D. Kim, “Real-time dual-wavelength digital holographic microscopy based on polarizing separation,” Opt. Commun. 285, 233–237 (2012).

32. S. Lee, Y. Roichman, G.-R. Yi, S.-H. Kim, S.-M. Yang, A. van Blaaderen, P. van Oostrum, and D. G. Grier, “Characterizing and tracking single colloidal particles with video holographic microscopy,” Opt. Express 15, 18275–18282 (2007).

33. J. Fung, K. E. Martin, R. W. Perry, D. M. Katz, R. McGorty, and V. N. Manoharan, “Measuring translational, rotational, and vibrational dynamics in colloids with digital holographic microscopy,” Opt. Express 19, 8051–8065 (2011).

34. S.-H. Lee and D. G. Grier, “Holographic microscopy of holographically trapped three-dimensional structures,” Opt. Express 15, 1505–1512 (2007).

35. F. Saglimbeni, S. Bianchi, A. Lepore, and R. Di Leonardo, “Three-axis digital holographic microscopy for high speed volumetric imaging,” Opt. Express 22, 13710–13718 (2014).

36. J. Sheng, E. Malkiel, and J. Katz, “Digital holographic microscope for measuring three-dimensional particle distributions and motions,” Appl. Opt. 45, 3893–3901 (2006).

37. P. Langehanenberg, B. Kemper, D. Dirksen, and G. von Bally, “Autofocusing in digital holographic phase contrast microscopy on pure phase objects for live cell imaging,” Appl. Opt. 47, D176–D182 (2008).

38. L. M. De Pablos, T. R. Ferreira, and P. B. Walrad, “Developmental differentiation in Leishmania lifecycle progression: post-transcriptional control conducts the orchestra,” Curr. Opin. Microbiol. 34, 82–89 (2016).

39. M. Wiese, D. Kuhn, and C. G. Grünfelder, “Protein kinase involved in flagellar-length control,” Eukaryot. Cell 2, 769–777 (2003).

40. C. Gadelha, B. Wickstead, and K. Gull, “Flagellar and ciliary beating in trypanosome motility,” Cell Motil. Cytoskeleton 64, 629–643 (2007).

41. T. Tahara, T. Kakue, Y. Awatsuji, K. Nishio, S. Ura, T. Kubota, and O. Matoba, “Parallel phase-shifting color digital holographic microscopy,” 3D Research 1, 25 (2010).

42. S. O. Isikman, A. Greenbaum, W. Luo, A. F. Coskun, and A. Ozcan, “Giga-pixel lensfree holographic microscopy and tomography using color image sensors,” PLoS One 7, e45044 (2012).

43. Y. Wu, Y. Zhang, W. Luo, and A. Ozcan, “Demosaiced pixel super-resolution for multiplexed holographic color imaging,” Sci. Rep. 6, 28601 (2016).

44. H. C. Berg and D. A. Brown, “Chemotaxis in Escherichia coli analysed by three-dimensional tracking,” Nature 239, 500 (1972).

45. L. Wilson and R. Zhang, “3D localization of weak scatterers in digital holographic microscopy using Rayleigh-Sommerfeld back-propagation,” Opt. Express 20, 16735–16744 (2012).

46. K. Engel, M. Hadwiger, J. M. Kniss, C. Rezk-Salama, and D. Weiskopf, Real-Time Volume Graphics (A. K. Peters, 2006).

47. C. Ó. S. Sorzano, P. Thévenaz, and M. Unser, “Elastic registration of biological images using vector-spline regularization,” IEEE Trans. Biomed. Eng. 52, 652–663 (2005).

48. J. Schindelin, I. Arganda-Carreras, E. Frise, V. Kaynig, M. Longair, T. Pietzsch, S. Preibisch, C. Rueden, S. Saalfeld, B. Schmid, J.-Y. Tinevez, D. J. White, V. Hartenstein, K. Eliceiri, P. Tomancak, and A. Cardona, “Fiji: an open-source platform for biological-image analysis,” Nat. Methods 9, 676–682 (2012).
