Optica Publishing Group

Scanning color optical tomography (SCOT)

Open Access

Abstract

We have developed an interferometric optical microscope that provides a three-dimensional refractive index map of a specimen by scanning the color of three illumination beams. Our interferometer design allows for simultaneous measurement of the scattered fields (both amplitude and phase) of such a complex input beam. By obviating the need for mechanical scanning of the illumination beam or the detection objective lens, the proposed method can increase the speed of optical tomography by orders of magnitude. We demonstrate our method using polystyrene beads of known refractive index and live cells.

© 2015 Optical Society of America

1. Introduction

Digital Holographic Microscopy (DHM) provides unique opportunities to interrogate biological specimens in their most natural condition with no labeling and minimal phototoxicity. The refractive index is the source of image contrast in DHM, which can be related to the average concentration of non-aqueous contents within the specimen, the so-called dry mass [1–3]. Tomographic interrogation of living biological specimens using DHM can thereby provide the mass of cellular organelles [4] as well as three-dimensional morphology of the subcellular structures within cells and small organisms [5]. There are other label-free approaches [6,7] that rely on non-linear light-matter interactions to provide rich molecular-specific information. Compared to such nonlinear techniques, DHM can significantly reduce damage to the specimens and minimally interferes with the cell physiology, because the scattering cross-section of the linear light-matter interaction is much higher. Therefore, DHM is highly suitable for monitoring morphological developments and metabolism of cells and small organisms over an extended period [8,9]. Recently, DHM was also used to quantify cellular differentiation, which promises an exciting opportunity for regenerative medicine where labeling is often not a viable option [10].

A variety of DHM techniques have been demonstrated to measure both the amplitude and phase alterations of the incident beam due to the interrogated sample [11,12]. In transmission phase measurements, the refractive index of the sample and its height are coupled; the measured phase can be due to variation in either the height or the refractive index distribution within the sample. One approach to acquiring the depth-resolved refractive index map is to record two-dimensional phase maps at multiple illumination angles and use a computational algorithm for the three-dimensional reconstruction. The illumination angle on a sample can be varied either by rotating the sample, Fig. 1(a) [13], or by rotating the illumination beam, Fig. 1(b) [5,14,15]. To achieve higher stability, common-path adaptations of the angle-scan technique were recently presented [16,17]. Tomographic cell imaging has also been demonstrated using partially-coherent light, where the focal plane of the objective lens is axially scanned through the sample, Fig. 1(c) [18–21]. It is worth noting that the speed of data acquisition in these techniques is limited by the mechanical scanning of a mirror or sample stage. Recently, we demonstrated three-dimensional cell imaging using a focused line beam, Fig. 1(d), which allows interrogation of cells as they are translated on a mechanical stage [22] or continuously flowing in a microfluidic channel [23].


Fig. 1 Tomographic optical microscopy in various configurations for three-dimensional refractive index mapping of a specimen: (a) Rotating-sample geometry [13]. (b) Rotating-beam geometry [5,14,15]. (c) Objective-lens-scan geometry [18–21]. (d) Scanning-sample geometry [22,23], and (e) color-scan geometry proposed in this study. The figure (e) also shows the spatial (X,Y,Z) coordinates used for derivation of the formula.


In this paper, we present a unique three-dimensional cell imaging technique that completely obviates the need for scanning optical elements of a microscope. In the proposed method, the depth information is acquired by scanning the wavelength of illumination, which was first proposed in the ultrasound regime [24]. Because scanning the wavelength of a single illumination beam does not provide good depth sectioning, we split the beam into three branches, each of which is collimated and incident onto the specimen at a different angle Fig. 1(e). Both the phase and amplitude images corresponding to the three beams are recorded in a single shot for each wavelength using our novel phase microscope. The wavelength is scanned using an acousto-optic tunable filter whose response time is only tens of microseconds, several orders of magnitude faster than mechanical scanning [25]. In the following sections, we explain the theoretical framework upon which the new technique is founded, and the experimental setup allowing for the measurements required by the theory. Applying our method to polystyrene beads and live biological specimens, we demonstrate the three-dimensional imaging capability of the proposed system for a practically attainable color-scan range of 430 – 630 nm.

2. Scanning color optical tomography

Suppose that we illuminate a sample with a collimated beam at a certain wavelength and record the amplitude and phase of the field scattered from the sample. The recorded images will contain information about the sample’s structure and refractive index. Specifically, the scattered field recorded for a collimated beam of a specific color and illumination angle contains the sample information lying on the Ewald sphere in the spatial frequency space [24]. The arc C in Fig. 2(a) represents a cross-section of the Ewald sphere, where the radius of the arc is the inverse of the wavelength and its angular extent is determined by the numerical aperture of the imaging system. Thus, by varying the color of illumination, we retrieve a different portion of the object’s spatial-frequency spectrum. This additional information on the sample effectively improves the three-dimensional imaging capability, as was first explained in the context of ultrasonic tomography [24]. It is important to note that reconstructing the three-dimensional refractive index map of the sample from this mapping in the spatial-frequency space is an ill-posed inverse problem [26], which becomes better posed as we retrieve more regions of the spatial-frequency space through additional measurements. One may define the axial resolution of the system as the inverse of the bandwidth, or frequency support, in the kZ direction. It must be noted that this axial resolution is not necessarily the same for all transverse frequencies.
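
The frequency support picked up by a color scan can be evaluated numerically. The snippet below is a minimal sketch (assuming a water-like medium with n_m = 1.33 and the 430–630 nm scan range used later in the paper; the function name and the example transverse frequency are ours): it computes the Ewald-arc position k_Z(k_x) for a normally incident beam and the axial support gained by the scan at one transverse frequency.

```python
import numpy as np

def kz_on_arc(kx, wavelength_nm, n_medium=1.33):
    """Axial frequency k_Z sampled at transverse frequency kx by a normally
    incident beam: the Ewald arc shifted so that it passes through the origin."""
    k = 2 * np.pi * n_medium / (wavelength_nm * 1e-9)  # wave number in the medium
    return np.sqrt(k**2 - kx**2) - k

# Axial support gained at one transverse frequency by scanning 430-630 nm
kx = 2 * np.pi * 2e6                  # example transverse frequency [rad/m]
kz_lo = kz_on_arc(kx, 630.0)          # longest wavelength -> smallest arc radius
kz_hi = kz_on_arc(kx, 430.0)          # shortest wavelength -> largest arc radius
axial_support = abs(kz_hi - kz_lo)    # its inverse sets the depth resolution at kx
```

Note that at kx = 0 every arc passes through the origin, so a single-beam color scan gains no axial support there; this is the quantitative source of the transverse-frequency dependence of the axial resolution noted above.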


Fig. 2 Mapping of the two-dimensional scattered fields in the spatial frequency space for different wavelengths and illumination angles: (a) Single-beam illumination. (b) Three-beam illumination (0, + θ and -θ), and (c) continuous scanning of the illumination angle from –θ to + θ. For this simulation, θ = 32° was used. Figures (d)-(f) are X-Z cross-sections of the refractive index map reconstructed from (a)-(c), respectively. The wavelength is varied from 430 to 630 nm. Imaging objective (60X, NA = 1.2). Scale bar is 10 μm.


Application of the wavelength-scan, or color-scan, approach to optical microscopy has been discouraged by the fact that one needs to scan the wavelength over an impractically broad range to achieve reasonably high depth resolution. For a more quantitative assessment of this argument, we simulated three imaging geometries using a 10-μm polystyrene microsphere (immersed in index-matching oil of n = 1.56) as a sample. In the first simulation, we assumed that a single collimated beam was incident onto the sample and the illumination wavelength was swept in the range 430–630 nm. Figures 2(a) and 2(d) show the vertical cross-sections of the retrieved frequency spectrum and the reconstructed tomogram, respectively. The spatial-frequency support in Fig. 2(a) is narrow, which explains the elongated shape of the bead along the z-direction in Fig. 2(d). Increasing the wavelength range up to 830 nm did not visibly improve the image. This result is consistent with previous studies [27], which showed that a wavelength scan alone does not provide sufficient coverage in the spatial-frequency space. To get around this limitation, we introduce two additional beams onto the sample and change the illumination color of all three at the same time. Figure 2(b) shows a cross-section of the retrieved frequency spectrum when three beams are introduced at the angles of 0, +θ and –θ with respect to the optical axis and their colors are changed simultaneously. The reconstructed bead for this case, shown in Fig. 2(e), converges to the true shape of the bead. To make a comparison with angle-scan tomography, shown in Fig. 1(b), we further simulated the case where the sample was illuminated with a single beam at 530 nm, the average wavelength, and the angle of illumination was varied from –θ to +θ. As one may expect, the retrieved spatial-frequency spectrum in Fig. 2(c) is now continuous; however, the spatial-frequency coverage in the z-direction is comparable to that of the wavelength scan using three beams, as can be seen by comparing the extent of the frequency support in the kZ direction in Figs. 2(b) and 2(c).

Interferometry-based techniques are highly suitable for measuring the complex amplitude (both amplitude and phase) of the scattered field in the visible range. To record the scattered field of the three incident beams, we built an instrument (Fig. 3) adapted from the Diffraction Phase Microscope (DPM) [28]. Specifically, the output of a broadband source (Fianium SC-400) is tuned to a specific wavelength by a high-resolution acousto-optic tunable filter (AOTF) operating in the range from 400 to 700 nm. The filter bandwidth increases linearly with the wavelength but stays in the range of 2–7 nm; therefore, we may assume that the illumination used for each measurement is quasi-monochromatic. The output of the AOTF is coupled to a single-mode fiber, collimated, and used as the input beam to the setup. A Ronchi grating (40 lines/mm, Edmund Optics Inc.) is placed at a plane (IP0) conjugate to the sample plane (SP) in order to generate three beams incident on the sample at different angles. Note that, in addition to the condenser (Olympus 60X, NA = 0.8) and the collector lens (f3 = 200 mm), we use a 4F imaging system (f1 = 30 mm, f2 = 50 mm) to better adjust the beam diameter and the incident angle on the sample. The 4F imaging system also facilitates filtering the beams diffracted by the grating so that only three beams enter the condenser lens. The three collimated beams pass through the microscope objective lens (Nikon 60X, NA = 1.2) and tube lens (f4 = 200 mm) before arriving at the second Ronchi grating (140 lines/mm, Edmund Optics Inc.) placed at the imaging plane (IP2). There is another 4F imaging system between the grating and the CCD camera (f5 = 100 mm, f6 = 300 mm). The focal lengths of the lenses and the grating period are chosen to guarantee that one period of the grating is imaged over approximately four camera pixels [29]. As shown in Fig. 3, there are nine diffracted orders at the Fourier plane, as opposed to three in the conventional DPM.
Three columns correspond to the three angles of incidence on the sample, while the three rows are due to diffraction by the grating in the detection arm. We apply a low-pass filter to the strongest diffracted beam at the center to use it as a reference for the interference. The recorded interferogram therefore contains three interference patterns carrying the sample information of the three beams: two oblique (I and III) and one vertical (II). While the three patterns may not be distinguishable in the raw interferogram, one can isolate the information for each incident angle in the spatial frequency domain, i.e., by taking the Fourier transform of the interference pattern.
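
This isolation step can be illustrated for a single off-axis carrier. The sketch below uses a synthetic interferogram with a hypothetical Gaussian phase object and carrier frequency (not the instrument's actual grating geometry): it crops one lobe of the Fourier transform, shifts it to the origin, and inverse-transforms to recover the complex field.

```python
import numpy as np

# Demodulating one off-axis fringe pattern; the instrument repeats this for
# each of the three carriers present in the nine-order interferogram.
N = 256
x = np.arange(N)
X, Y = np.meshgrid(x, x)

# Hypothetical sample: unit amplitude with a smooth Gaussian phase bump (2 rad)
phase = 2.0 * np.exp(-((X - N/2)**2 + (Y - N/2)**2) / (2 * 30.0**2))
sample = np.exp(1j * phase)

fc = 32                                            # carrier frequency [cycles/frame]
reference = np.exp(1j * 2 * np.pi * fc * X / N)    # tilted reference beam
interferogram = np.abs(sample + reference)**2      # what the camera records

# Crop the lobe holding sample*conj(reference) and shift it to the origin
F = np.fft.fftshift(np.fft.fft2(interferogram))
cx = N // 2 - fc                                   # lobe location after fftshift
r = fc // 2
mask = np.zeros_like(F)
mask[N//2 - r:N//2 + r, cx - r:cx + r] = 1
field = np.fft.ifft2(np.fft.ifftshift(np.roll(F * mask, fc, axis=1)))
recovered_phase = np.angle(field)                  # ~ the Gaussian phase bump
```

In the actual instrument the same crop-and-shift is applied three times, once per carrier, before the background subtraction described below.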


Fig. 3 Schematic diagram of the experimental setup: SP (sample plane); IP (image plane); FP (Fourier plane); condenser lens (Olympus 60X, NA = 0.8); objective lens (Nikon 60X, NA = 1.2). f1 = 30 mm, f2 = 50 mm, f3 = 200 mm, f4 = 200 mm, f5 = 100 mm, and f6 = 300 mm. The non-diffracted order is shown in red, while yellow refers to the oblique illumination. Interferograms I, II, and III are decompositions of the raw interferogram (bottom right) into its three components. The physical mask placed at FP2 creates a reference by spatially cleaning the non-diffracted order while passing the 1st-order sample beams that correspond to the three incident angles.


3. Data analysis and experimental results

We use an illumination consisting of three beams and record the two-dimensional scattered fields after the specimen for different wavelengths. For a monochromatic, plane-wave illumination, the scattered field after the specimen can be simply related to the scattering potential f of the specimen in the spatial frequency space. To briefly summarize the theoretical background used for the data analysis, the scattering potential is defined as

$$f(\mathbf{r}) = k_0^2\left[n(\mathbf{r})^2 - n_m^2\right],$$
where λ0 is the wavelength in vacuum, k0 = 2π/λ0 is the wave number in free space, nm is the refractive index of the surrounding medium, and n(r) is the complex refractive index of the sample [30]. Ignoring the polarization effects of the specimen, the complex amplitude us(r) of the scattered field satisfies the following scalar wave equation,
$$\left[\nabla^2 + k^2\right]u_s(\mathbf{r}) = f(\mathbf{r})\,u(\mathbf{r}), \qquad k = n_m k_0,$$
where k is the wave number in the medium, and u(r) is the total field recorded by the detector, which represents a sum of the incident and scattered fields. A more complete theoretical framework taking the vector nature of light into account in tomographic techniques can be found in [14]. When the specimen is weakly scattering, the two-dimensional Fourier transform of the scattered field recorded in various illumination conditions can be mapped onto the three-dimensional Fourier transform of the scattering potential of the object, as was first proposed by E. Wolf [30]:
$$F(k_X,k_Y,k_Z) = \frac{i k_z}{\pi}\,e^{-i k_z z}\,U_s(k_x,k_y;z), \qquad k_z = \sqrt{k^2 - k_x^2 - k_y^2},$$
where F(kX,kY,kZ) is the three-dimensional Fourier transform of the scattering potential f(X,Y,Z), and Us(kx,ky;z) is the two-dimensional Fourier transform of the scattered field us(x,y;z) with respect to x and y. The variables kX, kY, and kZ are the spatial frequency components in the object coordinates, while kx, ky and kz refer to those in the laboratory frame. Therefore, one needs to map the scattered field of the transmitted waves for each illumination condition onto the right position on the three-dimensional Ewald sphere discussed earlier,
$$k_X = k_x - k m_x, \qquad k_Y = k_y - k m_y, \qquad k_Z = k_z - k\sqrt{1 - m_x^2 - m_y^2},$$
where mx and my are functions of the illumination, proportional to the location of the object’s zero frequency in the object coordinates. In the case of off-axis interferometry, these parameters can be readily calculated from the two-dimensional Fourier transform of the scattered waves; see Fig. 4.
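
A gridding routine implementing Eqs. (3) and (4) might look as follows. This is a nearest-neighbour sketch under stated assumptions: the function name and grid layout are ours, mx and my are treated as the direction cosines of the illumination beam, and z = 0 is taken as the measurement plane.

```python
import numpy as np

def map_to_ewald(us, dx, wavelength, n_medium, mx, my, grid, counts, k_grid_step):
    """Place one measured 2-D scattered field onto the 3-D spectrum F(kX,kY,kZ).

    `grid` and `counts` accumulate F and the number of contributions per voxel,
    so repeated calls (one per wavelength and beam) build up the full spectrum."""
    N = us.shape[0]
    k = 2 * np.pi * n_medium / wavelength              # wave number in the medium
    Us = np.fft.fftshift(np.fft.fft2(us))              # 2-D spectrum of the field
    kx = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
    KX, KY = np.meshgrid(kx, kx)
    arg = k**2 - KX**2 - KY**2
    valid = arg > 0                                    # propagating components only
    KZ = np.sqrt(np.where(valid, arg, 0.0))
    F = (1j * KZ / np.pi) * Us                         # Eq. (3) with z = 0
    # Eq. (4): shift by the illumination wave vector
    KXo = KX - k * mx
    KYo = KY - k * my
    KZo = KZ - k * np.sqrt(1 - mx**2 - my**2)
    M = grid.shape[0]
    ix = np.round(KXo / k_grid_step).astype(int) + M // 2
    iy = np.round(KYo / k_grid_step).astype(int) + M // 2
    iz = np.round(KZo / k_grid_step).astype(int) + M // 2
    ok = valid & (ix >= 0) & (ix < M) & (iy >= 0) & (iy < M) & (iz >= 0) & (iz < M)
    np.add.at(grid, (iz[ok], iy[ok], ix[ok]), F[ok])
    np.add.at(counts, (iz[ok], iy[ok], ix[ok]), 1)
```

After all measurements are accumulated, dividing `grid` by `counts` (where nonzero) and taking a three-dimensional inverse FFT yields the scattering potential, hence the refractive index map.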


Fig. 4 Data processing. (a) Two-dimensional Fourier transform of the raw interferogram, whose magnitude is shown on a base-10 logarithmic scale. (b) Cropping of the spatial frequencies using masks defined by the physical numerical aperture. The circles are centered at the object zero-frequency location corresponding to each illumination angle; blue dots show the center points. (c)-(e) Phase maps after unwrapping, corresponding to the +θ, normal, and -θ beams, respectively. Scale bar is 10 μm.


The original formulation in Eq. (3) was derived using the first-order Born approximation. A. Devaney later extended this formulation to allow for the first-order Rytov approximation as well, which resulted in better reconstruction of optically thick specimens [31]. The Rytov formulation uses the same equation, Eq. (3), after substituting the scattered field defined as:

$$u_s(\mathbf{r}) = u_0(\mathbf{r})\,\varphi_s(\mathbf{r}), \qquad \varphi_s(\mathbf{r}) = \ln\left[\frac{u(\mathbf{r})}{u_0(\mathbf{r})}\right],$$
where u0(x,y;z) is the incident field. The Rytov approximation is valid when the following criterion is met,
$$n_\delta \gg \left[\frac{\nabla \varphi_s}{k}\right]^2,$$
where nδ is the refractive index change within the sample. Therefore, the Rytov approximation is valid when the phase change of the propagating light is not large over the wavelength scale. This is in contrast to the Born approximation, where the total phase change itself, rather than the phase gradient, must be small [24].
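
In code, the Rytov substitution of Eq. (5) amounts to a complex logarithm with an unwrapped imaginary part. A minimal sketch (the function name is ours, and the phase is unwrapped row-by-row only, whereas a robust implementation would use a two-dimensional unwrapper):

```python
import numpy as np

def rytov_field(u, u0):
    """Field to feed into Eq. (3) under the Rytov approximation:
    u_s = u0 * phi_s with phi_s = ln(u / u0)."""
    ratio = u / u0
    # Unwrap the phase along the last axis; thick samples can exceed 2*pi
    phi_s = np.log(np.abs(ratio)) + 1j * np.unwrap(np.angle(ratio), axis=-1)
    return u0 * phi_s
```

For a weak phase object, u ≈ u0·exp(iφ) with small φ, and this reduces to u_s ≈ i·φ·u0, consistent with the first-order Born picture.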

In off-axis interferometry with a single illumination beam, one can simply extract both the amplitude and phase of the measured field by applying the Hilbert transform to the raw interferogram [28,32]. In our method adopting three beams, we need to isolate the contribution of each beam prior to retrieving the field information. Figure 4(a) shows the two-dimensional Fourier transform of the raw interferogram, which is an autocorrelation of the amplitude leaving the mask in the Fourier plane (FP2). It is interesting to note that, unlike conventional off-axis interferometry where only one diffracted order exists, we have three diffracted beams passing through the physical mask, which create five peaks in the middle of Fig. 4(a) instead of three. The frequency content corresponding to the three beams is initially cropped by a Hanning window, Fig. 4(b), the radius of which is determined by the physical numerical aperture of our imaging system. Likewise, a circular mask of the same radius, dotted lines in Fig. 4(b), crops the information corresponding to the oblique illumination beams; the remainder is ascribed to the normal illumination beam. After isolating the frequency content for each illumination angle and shifting the peak, that is, the object’s zero frequency, to the origin of the spatial-frequency coordinates, one can take the inverse Fourier transform to retrieve the complex field corresponding to that angle for a specific wavelength. To correct for system aberrations and to separate the scattered fields from the incident field, a set of background images is recorded for an empty region and subtracted from the sample images. These background images can be acquired before the measurement and repeatedly used for different samples.
Because we record the information of the three illumination angles in a single shot and they all lie within the numerical aperture of the detection objective, there will be a certain degree of overlap for the higher spatial-frequency components associated with highly scattering objects. One can potentially account for such contamination through the use of a priori information about the object and the development of a proper algorithm; implementation of this additional step, however, goes beyond the scope of this work. Figures 4(c) through 4(e) show examples of the retrieved phase maps for a hematopoietic stem cell for the three illumination angles. As explained earlier, and in more detail in [33], the complex field information for each incident angle can be mapped onto the object’s spatial-frequency spectrum using Eq. (4). The location of the blue dots relative to the center point determines the parameters mx and my in Eq. (4). Since our grating at IP0 diffracts the light only in the y-direction, mx = 0 in our setup. Therefore, scanning the color of the three beams in our setup provides spatial-frequency coverage similar to scanning the angle of illumination along one direction in angle-scan tomography. This means we may have different resolution in X and Y; for the symmetrical sample used here, however, the X-Z and Y-Z cross-sections are similar. After completing the mapping, one can obtain the refractive index map through the three-dimensional inverse Fourier transform.

To make a comparison between simulation and experimental data, we conducted experiments on polystyrene microspheres immersed in index-matching oil (n = 1.56 at 589.3 nm), in accordance with the simulations in section 2. It must be noted that in our instrument the illumination angle varies slightly with wavelength because of the dispersive nature of the diffraction grating used for creating the three illumination beams. The simulation in section 2 reflects this fact: the illumination angle was varied linearly from 24 to 38 degrees, with θ = 32° being the average illumination angle. Figure 5(a) shows a vertical cross-section of the spatial-frequency spectrum after completing the mapping of the experimental data, which matches well with our simulation; see Fig. 2(b). Figures 5(b) and 5(c) show vertical and horizontal cross-sections, respectively, of the reconstructed tomogram. Optical tomography using a scanning-angle or scanning-objective-lens geometry suffers from the missing-cone problem, namely, data in the cone-shaped region near the origin of the frequency coordinates is not collected, similar to wide-field fluorescence microscopy. Because of this missing information, the reconstructed map is elongated along the optical axis and the refractive index value is underestimated. Using a priori information about the object, such as non-negativity, we can fill the missing region, suppress the artifacts, and improve the quality of the reconstruction [33]. Figures 5(e) and 5(f) show vertical and horizontal cross-sections, respectively, of the reconstructed bead after 25 iterations of our regularization algorithm using the non-negativity constraint. In these images, the negative bias around the sample is eliminated, and the refractive index value (1.585 ± 0.001) converges to the value obtained with angle-scan tomography and that provided by the manufacturer.
It is worth noting that the current algorithm does not compensate for the material dispersion; therefore, the obtained refractive index value is for the mean wavelength. In addition to the non-negativity constraint, we can also use the piecewise-smoothness constraint for further improvement at the cost of additional computation time [34].
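
The constraint iteration can be sketched in one dimension. The snippet below is a Gerchberg–Papoulis-style sketch of a non-negativity iteration, not the exact regularization algorithm of [33]: it alternates between keeping the measured spectrum samples and clipping negative values of the object.

```python
import numpy as np

def fill_missing(F_meas, support, n_iter=25):
    """F_meas: measured spectrum (zeros outside `support`);
    support: boolean mask of the frequencies actually measured."""
    F = F_meas.copy()
    f = np.fft.ifftn(F).real
    for _ in range(n_iter):
        f = np.maximum(f, 0.0)            # object constraint: f >= 0
        F = np.fft.fftn(f)
        F[support] = F_meas[support]      # data constraint: keep measurements
        f = np.fft.ifftn(F).real
    return np.maximum(f, 0.0)
```

Because both constraints are projections onto convex sets that contain the true object, the reconstruction error is non-increasing with every iteration.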


Fig. 5 Spatial-frequency mapping and reconstructed tomogram of a 10 μm polystyrene bead before/after regularization. (a) Vertical cross-section of the spatial-frequency spectrum before regularization. (b), (c) Reconstructed images in the X-Z and X-Y planes, respectively, before regularization. (d) Vertical cross-section of the spatial-frequency spectrum after regularization. (e), (f) Reconstructed images of the bead in the X-Z and X-Y planes after regularization. Scale bar is 10 μm.


To further demonstrate our method on a biological specimen, we imaged a hematopoietic stem cell (HSC). Studying the morphology and structure of HSCs in a label-free fashion is of particular interest, since labeling would interfere with the differentiation of HSCs, and labeled cells cannot be used for therapeutic purposes. A promising result was recently reported by Chalut et al., who demonstrated the use of physical phenotypes acquired with DHM to quantify cellular differentiation of myeloid precursor cells [35]. Figures 6(a) through 6(c) show horizontal cross-sections of the specimen at 2 μm intervals after tomographic reconstruction. The images show varying internal structures at different heights, which clearly demonstrates the three-dimensional imaging capability of the proposed method with high axial resolution. As can be seen in Fig. 6(d), these internal structures are not visible in the cumulative phase map or in a single-shot phase image. Figures 6(e) and 6(f) show three-dimensional renderings of the refractive index map at two different angles.


Fig. 6 Three-dimensional refractive index map of a hematopoietic stem cell: (a)-(c) Horizontal cross-sections of the tomogram at 2 μm intervals. (d) Cumulative phase map of the whole cell. (e)-(f) Three-dimensional rendering of the cell using the measured refractive index map, which shows various internal structures. Scale bar is 10 μm.


4. Discussion

For a monochromatic plane wave incident onto a sample, the recorded scattered wave provides only a portion of the object’s spatial-frequency spectrum. Therefore, coherent imaging of complex three-dimensional structures within a specimen requires tomographic data acquisition, namely recording multiple images while varying the illumination angle. This is in contrast to topographic surface measurement [36] or locating a finite number of scatterers in three-dimensional space [37], where single-shot imaging may be sufficient. When a light source with reduced spatial coherence [19], temporal coherence, or both [20] is used, the retrieved spatial-frequency spectrum is much broader, but the depth of field is significantly reduced; therefore, one needs to scan the objective focus through the sample to acquire three-dimensional volume data. In this paper, we reported a high-resolution optical microscopy technique using structured illumination of coherent light and wavelength scanning to increase the axial resolution. The idea of changing the source frequency for better axial resolution was originally proposed decades ago in ultrasonic tomography [24]. However, application of the idea to high-resolution optical microscopy has been discouraged on the grounds that the frequency coverage achieved through wavelength scanning alone is not sufficient [27]. Our hybrid approach resolves this problem by combining three illumination angles with wavelength scanning to achieve high axial resolution over a practically attainable wavelength range. Using an experimental setup extending the concept of DPM, we recorded the complex amplitude of the scattered field for the three illumination beams at each wavelength.

In existing tomographic microscopy, scanning mirrors or moving the focus of the objective lens is a barrier to increasing the speed of data acquisition. The method we propose scans the wavelength through an acousto-optic tunable filter, whose transition time is only tens of microseconds. By introducing three beams simultaneously, we reduce the number of required samples by a factor of three as well. In this initial work, we record tomograms in about one second due to the limited wavelength-switching speed afforded by the manufacturer’s software; however, this limit can be pushed by orders of magnitude by implementing a proper hardware control algorithm and a faster imaging device. We can thereby acquire the interferograms required for a single tomogram within milliseconds using a high-speed camera. We also note that the methods included in this report do not correct for material dispersion, which may become noticeable for certain biological specimens in the visible region. For example, the refractive index of the hemoglobin in red blood cells has a strong wavelength dependence in the 355-500 nm range [38]. Nucleic acids have a refractive index profile clearly distinct from that of proteins at wavelengths close to their absorption peaks [39]. Our method collects images over a broad range of wavelengths, each of which contains both material dispersion information and structural information about the sample. Extracting the material dispersion of cellular organelles, which is left for a future study, could provide rich molecular-specific information, e.g., the relative amounts of nucleic acids and proteins, at a subcellular level [4].

Acknowledgments

This work was mainly supported by NIH 9P41EB015871-26A1 and 1R01HL121386-01A1. Peter T.C. So was supported by 5P41EB015871-30, 5R01HL121386-03, 2R01EY017656-06A1, 1U01NS090438-01, 1R21NS091982-01, 5R01NS051320, DP3DK101024-01, 4R44EB012415-02, NSF CBET-0939511, the Singapore-MIT Alliance for Science and Technology Center, the MIT SkolTech initiative, the Hamamatsu Corp. and the Koch Institute for Integrative Cancer Research Bridge Project Initiative. The authors would like to thank Professor George Barbastathis for helpful discussion, and Dr. Jeon Woong Kang for sample preparation and feedback on the manuscript.

References and links

1. R. Barer, “Refractometry and interferometry of living cells,” J. Opt. Soc. Am. 47(6), 545–556 (1957). [CrossRef]   [PubMed]  

2. G. Popescu, Y. Park, N. Lue, C. Best-Popescu, L. Deflores, R. R. Dasari, M. S. Feld, and K. Badizadegan, “Optical imaging of cell mass and growth dynamics,” Am. J. Physiol. Cell Physiol. 295(2), C538–C544 (2008). [CrossRef]   [PubMed]  

3. Y. Sung, A. Tzur, S. Oh, W. Choi, V. Li, R. R. Dasari, Z. Yaqoob, and M. W. Kirschner, “Size homeostasis in adherent cells studied by synthetic phase microscopy,” Proc. Natl. Acad. Sci. U.S.A. 110(41), 16687–16692 (2013). [CrossRef]   [PubMed]  

4. Y. Sung, W. Choi, N. Lue, R. R. Dasari, and Z. Yaqoob, “Stain-free quantification of chromosomes in live cells using regularized tomographic phase microscopy,” PLoS One 7(11), e49502 (2012). [CrossRef]   [PubMed]  

5. W. Choi, C. Fang-Yen, K. Badizadegan, S. Oh, N. Lue, R. R. Dasari, and M. S. Feld, “Tomographic phase microscopy,” Nat. Methods 4(9), 717–719 (2007). [CrossRef]   [PubMed]  

6. C. W. Freudiger, W. Min, B. G. Saar, S. Lu, G. R. Holtom, C. He, J. C. Tsai, J. X. Kang, and X. S. Xie, “Label-free biomedical imaging with high sensitivity by stimulated Raman scattering microscopy,” Science 322(5909), 1857–1861 (2008). [CrossRef]   [PubMed]  

7. L. Tong, Y. Liu, B. D. Dolash, Y. Jung, M. N. Slipchenko, D. E. Bergstrom, and J.-X. Cheng, “Label-free imaging of semiconducting and metallic carbon nanotubes in cells and mice using transient absorption microscopy,” Nat. Nanotechnol. 7(1), 56–61 (2011). [CrossRef]   [PubMed]  

8. M. Mir, Z. Wang, Z. Shen, M. Bednarz, R. Bashir, I. Golding, S. G. Prasanth, and G. Popescu, “Optical measurement of cycle-dependent cell growth,” Proc. Natl. Acad. Sci. U.S.A. 108(32), 13124–13129 (2011).

9. K. L. Cooper, S. Oh, Y. Sung, R. R. Dasari, M. W. Kirschner, and C. J. Tabin, “Multiple phases of chondrocyte enlargement underlie differences in skeletal proportions,” Nature 495(7441), 375–378 (2013).

10. K. J. Chalut, K. Kulangara, A. Wax, and K. W. Leong, “Stem cell differentiation indicated by noninvasive photonic characterization and fractal analysis of subcellular architecture,” Integr. Biol. (Camb.) 3(8), 863–867 (2011).

11. K. Creath, “Phase-measurement interferometry techniques,” Prog. Opt. 26, 349–393 (1988).

12. M. Reed Teague, “Deterministic phase retrieval: a Green’s function solution,” J. Opt. Soc. Am. 73(11), 1434–1441 (1983).

13. F. Charrière, A. Marian, F. Montfort, J. Kuehn, T. Colomb, E. Cuche, P. Marquet, and C. Depeursinge, “Cell refractive index tomography by digital holographic microscopy,” Opt. Lett. 31(2), 178–180 (2006).

14. V. Lauer, “New approach to optical diffraction tomography yielding a vector equation of diffraction tomography and a novel tomographic microscope,” J. Microsc. 205(2), 165–176 (2002).

15. S. O. Isikman, W. Bishara, S. Mavandadi, F. W. Yu, S. Feng, R. Lau, and A. Ozcan, “Lens-free optical tomographic microscope with a large imaging volume on a chip,” Proc. Natl. Acad. Sci. U.S.A. 108(18), 7296–7301 (2011).

16. K. Kim, Z. Yaqoob, K. Lee, J. W. Kang, Y. Choi, P. Hosseini, P. T. So, and Y. Park, “Diffraction optical tomography using a quantitative phase imaging unit,” Opt. Lett. 39(24), 6935–6938 (2014).

17. W.-C. Hsu, J.-W. Su, T.-Y. Tseng, and K.-B. Sung, “Tomographic diffractive microscopy of living cells based on a common-path configuration,” Opt. Lett. 39(7), 2210–2213 (2014).

18. P. Bon, S. Aknoun, J. Savatier, B. Wattellier, and S. Monneret, “Tomographic incoherent phase imaging, a diffraction tomography alternative for any white-light microscope,” Proc. SPIE 8589, 858918 (2013).

19. P. Bon, S. Aknoun, S. Monneret, and B. Wattellier, “Enhanced 3D spatial resolution in quantitative phase microscopy using spatially incoherent illumination,” Opt. Express 22(7), 8654–8671 (2014).

20. T. Kim, R. Zhou, M. Mir, S. D. Babacan, P. S. Carney, L. L. Goddard, and G. Popescu, “White-light diffraction tomography of unlabelled live cells,” Nat. Photonics 8(3), 256–263 (2014).

21. R. Fiolka, K. Wicker, R. Heintzmann, and A. Stemmer, “Simplified approach to diffraction tomography in optical microscopy,” Opt. Express 17(15), 12407–12417 (2009).

22. N. Lue, W. Choi, G. Popescu, K. Badizadegan, R. R. Dasari, and M. S. Feld, “Synthetic aperture tomographic phase microscopy for 3D imaging of live cells in translational motion,” Opt. Express 16(20), 16240–16246 (2008).

23. Y. Sung, N. Lue, B. Hamza, J. Martel, D. Irimia, R. R. Dasari, W. Choi, Z. Yaqoob, and P. So, “Three-dimensional holographic refractive-index measurement of continuously flowing cells in a microfluidic channel,” Phys. Rev. Appl. 1(1), 014002 (2014).

24. A. C. Kak and M. Slaney, Principles of Computerized Tomographic Imaging (SIAM, 1988), Vol. 33.

25. C. Fang-Yen, W. Choi, Y. Sung, C. J. Holbrow, R. R. Dasari, and M. S. Feld, “Video-rate tomographic phase microscopy,” J. Biomed. Opt. 16(1), 011005 (2011).

26. J. Hadamard, Lectures on Cauchy’s Problem in Linear Partial Differential Equations (Courier Corporation, 2014).

27. R. Dändliker and K. Weiss, “Reconstruction of the three-dimensional refractive index from scattered waves,” Opt. Commun. 1(7), 323–328 (1970).

28. G. Popescu, T. Ikeda, R. R. Dasari, and M. S. Feld, “Diffraction phase microscopy for quantifying cell structure and dynamics,” Opt. Lett. 31(6), 775–777 (2006).

29. G. Popescu, Quantitative Phase Imaging of Cells and Tissues (McGraw Hill Professional, 2011).

30. E. Wolf, “Three-dimensional structure determination of semi-transparent objects from holographic data,” Opt. Commun. 1(4), 153–156 (1969).

31. A. J. Devaney, “Inverse-scattering theory within the Rytov approximation,” Opt. Lett. 6(8), 374–376 (1981).

32. T. Ikeda, G. Popescu, R. R. Dasari, and M. S. Feld, “Hilbert phase microscopy for investigating fast dynamics in transparent systems,” Opt. Lett. 30(10), 1165–1167 (2005).

33. Y. Sung, W. Choi, C. Fang-Yen, K. Badizadegan, R. R. Dasari, and M. S. Feld, “Optical diffraction tomography for high resolution live cell imaging,” Opt. Express 17(1), 266–277 (2009).

34. Y. Sung and R. R. Dasari, “Deterministic regularization of three-dimensional optical diffraction tomography,” J. Opt. Soc. Am. A 28(8), 1554–1561 (2011).

35. K. J. Chalut, A. E. Ekpenyong, W. L. Clegg, I. C. Melhuish, and J. Guck, “Quantifying cellular differentiation by physical phenotype using digital holographic microscopy,” Integr. Biol. (Camb.) 4(3), 280–284 (2012).

36. S. Zhang, “Recent progresses on real-time 3D shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48(2), 149–158 (2010).

37. D. J. Brady, K. Choi, D. L. Marks, R. Horisaki, and S. Lim, “Compressive holography,” Opt. Express 17(15), 13040–13049 (2009).

38. M. Friebel and M. Meinke, “Model function to calculate the refractive index of native hemoglobin in the wavelength range of 250–1100 nm dependent on concentration,” Appl. Opt. 45(12), 2838–2842 (2006).

39. D. Fu, F.-K. Lu, X. Zhang, C. Freudiger, D. R. Pernik, G. Holtom, and X. S. Xie, “Quantitative chemical imaging with multiplex stimulated Raman scattering microscopy,” J. Am. Chem. Soc. 134(8), 3623–3626 (2012).



Figures (6)

Fig. 1
Fig. 1 Tomographic optical microscopy in various configurations for three-dimensional refractive index mapping of a specimen: (a) Rotating-sample geometry [13]. (b) Rotating-beam geometry [5,14,15]. (c) Objective-lens-scan geometry [18–21]. (d) Scanning-sample geometry [22,23]. (e) Color-scan geometry proposed in this study. Figure (e) also shows the spatial (X, Y, Z) coordinates used in the derivation of the formulas.
Fig. 2
Fig. 2 Mapping of the two-dimensional scattered fields in the spatial-frequency space for different wavelengths and illumination angles: (a) Single-beam illumination. (b) Three-beam illumination (0, +θ, and −θ). (c) Continuous scanning of the illumination angle from −θ to +θ. For this simulation, θ = 32° was used. (d)-(f) X-Z cross-sections of the refractive index maps reconstructed from (a)-(c), respectively. The wavelength is varied from 430 to 630 nm. Imaging objective: 60X, NA = 1.2. Scale bar is 10 μm.
Fig. 3
Fig. 3 Schematic diagram of the experimental setup: SP (sample plane); IP (image plane); FP (Fourier plane). Condenser lens: Olympus 60X, NA = 0.8; objective lens: Nikon 60X, NA = 1.2. f1 = 30 mm, f2 = 50 mm, f3 = 200 mm, f4 = 200 mm, f5 = 100 mm, and f6 = 300 mm. The non-diffracted order is shown in red, and the oblique illumination in yellow. Interferograms I, II, and III are decompositions of the raw interferogram (bottom right) into its three components. The physical mask placed at FP2 creates a reference beam by spatially cleaning the non-diffracted order while passing the 1st-order sample beams that correspond to the three incident angles.
Fig. 4
Fig. 4 Data processing. (a) Two-dimensional Fourier transform of the raw interferogram, whose magnitude is shown on a base-10 logarithmic scale. (b) Cropping of the spatial frequencies using masks defined by the physical numerical aperture. Each circle is centered at the zero-frequency location of the object spectrum corresponding to its illumination angle; blue dots mark the center points. (c)-(e) Phase maps after unwrapping, corresponding to the +θ, normal, and −θ beams, respectively. Scale bar is 10 μm.
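The demodulation steps illustrated in Fig. 4 (Fourier-transforming the raw interferogram, cropping a first-order component with an NA-limited mask, and re-centering it before the inverse transform) can be sketched as follows. This is an illustrative Python sketch, not the authors' code; the function name and interface are assumptions, and phase unwrapping is left to a separate step.

```python
import numpy as np

def demodulate_hologram(interferogram, carrier, radius):
    """Extract the complex field of one illumination beam from a raw
    off-axis interferogram (hypothetical helper, not the authors' code).

    interferogram : 2-D real array, the recorded intensity pattern
    carrier       : (kx, ky) pixel offset of the +1-order peak from the
                    center of the shifted Fourier spectrum
    radius        : crop radius in pixels, set by the objective NA
    """
    F = np.fft.fftshift(np.fft.fft2(interferogram))
    ny, nx = F.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    cy, cx = ny // 2 + carrier[1], nx // 2 + carrier[0]
    # Circular mask around the +1-order peak (physical-NA crop)
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2
    # Re-center the cropped order at zero spatial frequency
    cropped = np.roll(F * mask, (ny // 2 - cy, nx // 2 - cx), axis=(0, 1))
    field = np.fft.ifft2(np.fft.ifftshift(cropped))
    # The wrapped phase still needs unwrapping, e.g. with
    # skimage.restoration.unwrap_phase
    return np.abs(field), np.angle(field)
```

Repeating this for the three carrier frequencies yields the three phase maps of Figs. 4(c)-4(e).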
Fig. 5
Fig. 5 Spatial-frequency mapping and reconstructed tomogram of a 10 μm polystyrene bead before and after regularization. (a) Vertical cross-section of the spatial-frequency spectrum before regularization. (b), (c) Reconstructed images in the X-Z and X-Y planes, respectively, before regularization. (d) Vertical cross-section of the spatial-frequency spectrum after regularization. (e), (f) Reconstructed images of the bead in the X-Z and X-Y planes after regularization. Scale bar is 10 μm.
Fig. 6
Fig. 6 Three-dimensional refractive index map of a hematopoietic stem cell: (a)-(c) Horizontal cross-sections of the tomogram at 2 μm intervals. (d) Cumulative phase map of the whole cell. (e)-(f) Three-dimensional renderings of the cell using the measured refractive index map, showing various internal structures. Scale bar is 10 μm.

Equations (6)


$$f(\mathbf{r}) = k_0^2\left(n(\mathbf{r})^2 - n_m^2\right),$$

$$\left[\nabla^2 + k^2\right]u_s(\mathbf{r}) = f(\mathbf{r})\,u(\mathbf{r}), \qquad k = n_m k_0$$

$$F(k_X, k_Y, k_Z) = \left(i k_z/\pi\right)\exp\left(-i k_z z\right)U_s(k_x, k_y; z), \qquad k_z = \sqrt{k^2 - k_x^2 - k_y^2}$$

$$k_X = k_x - k\,m_x, \qquad k_Y = k_y - k\,m_y, \qquad k_Z = k_z - k\sqrt{1 - m_x^2 - m_y^2}$$

$$u_s(\mathbf{r}) = u_0(\mathbf{r})\,\varphi_s(\mathbf{r}), \qquad \varphi_s(\mathbf{r}) = \ln\left[u(\mathbf{r})/u_0(\mathbf{r})\right].$$

$$n_\delta \gg \left[\nabla\varphi_s/k\right]^2.$$
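Equations (3) and (4) together map each measured two-dimensional angular spectrum onto the Ewald sphere in the three-dimensional frequency space of the scattering potential: Eq. (3) weights the spectrum by $i k_z/\pi$, and Eq. (4) shifts the sphere by the illumination wavevector. A minimal Python sketch of this mapping, assuming the field is measured at the focal plane $z = 0$ (so the propagation factor is unity) and assuming the function name and interface (they are not from the paper):

```python
import numpy as np

def map_to_ewald(U_s, wavelength, n_m, pixel_size, m=(0.0, 0.0)):
    """Map one 2-D angular spectrum of the Rytov field onto the Ewald
    sphere per Eqs. (3)-(4) (illustrative sketch, not the authors' code).

    U_s        : 2-D FFT of u_0 * phi_s at the focal plane z = 0
    wavelength : illumination wavelength, same units as pixel_size
    n_m        : refractive index of the surrounding medium
    m          : (m_x, m_y) direction cosines of the illumination beam
    Returns (kX, kY, kZ, F): samples F of the scattering-potential
    spectrum at 3-D frequencies (kX, kY, kZ), evanescent waves dropped.
    """
    k = 2 * np.pi * n_m / wavelength          # k = n_m * k_0
    ny, nx = U_s.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
    kx, ky = np.meshgrid(kx, ky)
    kz2 = k**2 - kx**2 - ky**2
    prop = kz2 > 0                            # keep propagating waves only
    kz = np.sqrt(np.where(prop, kz2, 0.0))
    # Eq. (3) at z = 0: F = (i k_z / pi) U_s(k_x, k_y; 0)
    F = (1j * kz / np.pi) * U_s
    # Eq. (4): shift the sphere by the illumination wavevector
    mx, my = m
    kX = kx - k * mx
    kY = ky - k * my
    kZ = kz - k * np.sqrt(1 - mx**2 - my**2)
    return kX[prop], kY[prop], kZ[prop], F[prop]
```

Repeating this for every wavelength and illumination angle, and accumulating the samples on a common 3-D grid, fills the frequency support shown in Fig. 2 before the inverse transform yields the refractive index map.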