
Diffraction tomography with Fourier ptychography

Open Access

Abstract

This paper presents a technique to image the complex index of refraction of a sample across three dimensions. The only required hardware is a standard microscope and an array of LEDs. The method, termed Fourier ptychographic tomography (FPT), first captures a sequence of intensity-only images of a sample under angularly varying illumination. Then, using principles from ptychography and diffraction tomography, it computationally solves for the sample structure in three dimensions. The experimental microscope demonstrates a lateral spatial resolution of 0.39 μm and an axial resolution of 3.7 μm at the Nyquist–Shannon sampling limit (0.54 and 5.0 μm at the Sparrow limit, respectively) across a total imaging depth of 110 μm. Unlike competing methods, this technique quantitatively measures the volumetric refractive index of primarily transparent and contiguous sample features without the need for interferometry or any moving parts. Wide field-of-view reconstructions of thick biological specimens suggest potential applications in pathology and developmental biology.

© 2016 Optical Society of America

1. INTRODUCTION

It is challenging to image thick samples with a standard microscope. High-resolution objective lenses offer a shallow depth of field, which requires one to axially scan through the sample to visualize its three-dimensional (3D) shape. Unfortunately, refocusing does not remove light from areas above and below the plane of interest. This longstanding problem has inspired a number of solutions, the most widespread being confocal designs, two-photon excitation methods, light sheet microscopy, and optical coherence tomography. These methods “gate out” light from sample areas away from the point of interest and offer excellent signal enhancement, especially for thick, fluorescent samples [1].

Such gating techniques also encounter several problems. First, they typically must scan out each image, which might require physical movement and can be time consuming. Second, the available signal (i.e., the number of ballistic photons) decreases exponentially with depth. To overcome this limit, one must use a high-NA lens, which provides a proportionally smaller image field of view (FOV). Finally, little light is backscattered when imaging non-fluorescent samples that are primarily transparent, as is commonly the case in embryology, in model organisms such as zebrafish, and after the application of recent tissue clearing [2] and expansion [3] techniques.

Instead of capturing just the ballistic photons emerging from the sample, one might instead image the entire optical field, which includes light that has scattered. Several techniques have been proposed to enable depth selectivity without gating for ballistic light. One might perform optical sectioning through digital deconvolution of a focal stack [4]. Light-field imaging [5] and point-spread function engineering [6] are two other alternatives. All three of these methods primarily operate with incoherent light, e.g., from fluorescent samples. They are thus not ideal tools for obtaining the refractive index distribution of a primarily transparent and non-fluorescent medium.

To do so, it is useful to use coherent illumination. For example, the amplitude and phase of a digital hologram may be computationally propagated to different depths within a thick sample, much like refocusing a microscope. However, the field at out-of-focus planes still influences the final result. Several techniques also aim for depth selectivity by using quasi-coherent illumination or through acquiring multiple images [7–10].

A very useful framework to summarize how coherent light scatters through thick samples is diffraction tomography (DT), as first developed by Wolf [11]. In a typical DT experiment, one illuminates a sample of interest with a series of tilted plane waves and measures the resulting complex diffraction patterns in the far field. These measurements may then be combined with a suitable algorithm into a tomographic reconstruction. An early demonstration of DT by Lauer is a good example [12]. Typically, the reconstruction algorithm assumes the first Born [13,14] or the first Rytov [15] approximation. It is also possible to apply the projection approximation, which models light as a ray. As a synthetic aperture technique, DT comes with the additional benefit of improving the resolution of an imaging element beyond its traditional diffraction-limit cutoff [12].

However, as a technique that models both the amplitude and phase of a coherent field, most implementations of DT require a reference beam and holographic measurement, or some sort of phase-stable interference (including spatial light modulator coding strategies, e.g., as in Ref. [16]). Since it is critical to control for interferometric stability [13] and thus limit the motion and phase drift to sub-micrometer variations, DT has been primarily implemented in well-controlled, customized setups. Several prior works have considered solving DT from intensity-only measurements to possibly remove the need for a reference beam [17–26]. However, while some applied the first Born approximation, these works also required customized setups and typically imposed additional sample constraints (e.g., a known sample support). They did not operate within a standard microscope or connect their reconstruction algorithms to ptychography, from which improvements like computational aberration correction [27] and multiplexed reconstruction [28] may be easily adopted.

Here, we perform DT using standard intensity images captured under variable LED illumination from an array source. Our technique, termed Fourier ptychographic tomography (FPT), acquires a sequence of images while changing the light pattern displayed on the LED array. Then, it combines these images using a phase retrieval-based ptychographic reconstruction algorithm, which computationally segments a thick sample into multiple planes, as opposed to physically rejecting light from above and below one plane of interest. Similar to DT, FPT also improves the lateral image resolution beyond the standard cutoff of the imaging lens. The end result is an accurate three-dimensional map of the complex index of refraction of a volumetric sample obtained directly from a sequence of standard microscope images.

2. RELATED WORK

To begin, a number of techniques attempt 3D imaging without applying the first Born approximation. These include lensless on-chip devices [29], lensless setups that assume an appropriate linearization [30], and methods relying upon effects like defocusing (e.g., the transport of intensity equation [31]) or spectral variations [32]. These techniques do not necessarily fit within a standard microscope setup or offer the ability to simultaneously improve spatial resolution. Two related works for 3D imaging, which also do not use DT with the first Born approximation, are by Tian and Waller [33] and Li et al. [34]. These two setups are quite similar to ours, and we discuss their operations in more detail below.

As mentioned above, there are also several prior works applying DT under the first Born approximation that only use intensity measurements [17–26]. While some of these works examine phase retrieval as a reconstruction algorithm, they must either shift the focal plane [22,24] or source [25] axially between each measurement, or must assume constraints on the sample [19,20,23,26] to successfully recover the phase. These prior works do not connect DT to ptychographic phase retrieval (i.e., they do not recover the phase by using diversity between the variably illuminated DT images). Connections between phase retrieval and DT under the first Born approximation have also been explored within the context of volume hologram design [35].

The field of x-ray ptychography also offers a number of methods to image 3D samples with intensity measurements [36–38]. However, none of these ptychography methods seem to directly modify DT under the first Born or Rytov approximation, to the best of our knowledge. A popular technique appears to use standard two-dimensional (2D) ptychographic solvers to determine the complex field for individual projections of a slowly rotated sample, which are subsequently combined using conventional DT techniques [39].

Fourier ptychography (FP) [40] uses a standard microscope and no moving parts to simultaneously improve image resolution and measure quantitative phase but is restricted to thin samples. FPT effectively extends FP into the third dimension. As noted above, Tian and Waller [33] and Li et al. [34] also examine the problem of 3D imaging from intensities in a standard microscope. These two examples adopted their reconstruction technique from a 3D ptychography method [37,38] that splits up the sample into a specified number of infinitesimally thin slices (each under the projection approximation) and applies the beam propagation method (i.e., assumes small-angle scattering) [41]. This “multi-slice” approach remains accurate within a different domain of optical scattering than the first Born approximation (i.e., for different types of samples and setups, see chapter 2 in Ref. [42]). For example, one may include both forward and backscattered light in a DT solver to accurately reconstruct a 3D sample under the Born approximation [12]. However, the multi-slice method does not directly account for backscattered light. Its projection approximation also assumes the lateral divergence of the optical field gradient at each slice is zero. Alternatively, the validity of the first Born approximation breaks down when the total amount of absorption and phase shift from a sample is large [43], whereas this appears to impact the multi-slice approach less (e.g., it can image two highly absorbing layers separated by a finite amount of free space [33,34]).

Thus, while the experimental setup of FPT is similar to prior work [33,34,40], our reinterpretation of ptychography within the physical framework of DT (under the first Born approximation) allows us to accurately reconstruct new specimen types in 3D without measuring phase. For example, primarily transparent samples of continuously varying optical density, which are often encountered in biology, typically obey the first Born approximation. Accordingly, we have used FPT to compute some of the first quantitatively accurate 3D maps of clear samples that contain contiguous features (e.g., an unstained nematode parasite and starfish embryo) from standard microscope images.

In addition, FPT offers a clear picture of the location and amount of data it captures in 3D Fourier space. Such knowledge currently helps us to establish various solution guarantees for 2D image phase retrieval [44]. These guarantees may also extend to the current case of 3D tomographic phase retrieval (e.g., to support its potential clinical use). Furthermore, instead of specifying an arbitrary number of sample slices and their location in a 3D volume, FPT simply inserts measured data into its appropriate 3D Fourier space location and ensures phase consistency between each measurement. By solving for the first term in the Born expansion, we intend this approach to serve as a general framework for eventually forming quantitatively accurate tomographic maps of complex biological samples with sub-micrometer resolutions.

3. METHOD OF FPT

In this section, we develop a mathematical expression for our image measurements using the FPT framework and then summarize our reconstruction algorithm. We use the vector r=(rx,ry,rz) to define the 3D sample coordinates and the vector k=(kx,ky,kz) to define the corresponding k-space (wave vector) coordinates (see Fig. 1).

Fig. 1. Setup for Fourier ptychographic tomography (FPT). (a) Labeled diagram of the FPT microscope, including optical functions of interest. (b) FPT captures multiple images under varied LED illumination. (c) A ptychography-inspired algorithm combines these images in a 3D k-space representation of the complex sample. (d) FPT outputs a 3D tomographic map of the complex index of refraction of the sample. Included images are experimental measurements from a starfish embryo (real index component, threshold applied; see Fig. 7).

A. Image Formation in FPT

It is helpful to begin our discussion by introducing a quantity termed the scattering potential, which contains the complex index of refraction of an arbitrarily thick volumetric sample,

$$V(\mathbf{r}) = \frac{k^2}{4\pi}\left(n^2(\mathbf{r}) - n_b^2\right).$$

Here, $n(\mathbf{r})$ is the spatially varying and complex refractive index profile of the sample, $n_b$ is the index of refraction of the background (which we assume is constant), and $k = 2\pi/\lambda$ is the wavenumber in vacuum. We note that $n(\mathbf{r}) = n_r(\mathbf{r}) + 1i \cdot n_{im}(\mathbf{r})$, where $n_r$ is associated with the sample’s refractive index, $n_{im}$ is associated with its absorptivity, and we define $1i = \sqrt{-1}$ for notational clarity. We typically neglect the dependence of $n$ on $\lambda$ since we illuminate with quasi-monochromatic light. This dependence cannot be neglected when imaging with polychromatic light. Finally, we use the term “thick” for samples that do not obey the thin sample approximation, which requires the sample thickness to be much less than $2/(k\theta_{\max}^2)$, where $\theta_{\max}$ is the magnitude of the maximum scattering angle [45].
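As a concrete illustration of Eq. (1) (our own sketch, not code from the paper), the scattering potential can be evaluated voxel by voxel from a sampled refractive index volume. All function and variable names below are ours; the bead parameters follow the microsphere samples described in Section 4.

```python
import numpy as np

def scattering_potential(n, n_b, wavelength):
    """Voxel-wise scattering potential V(r) = (k^2 / 4 pi) * (n(r)^2 - n_b^2), Eq. (1).

    n          : complex ndarray, spatially varying refractive index n(r)
    n_b        : float, constant background refractive index
    wavelength : float, vacuum wavelength (same length unit as the sample grid)
    """
    k = 2 * np.pi / wavelength                      # vacuum wavenumber
    return (k ** 2 / (4 * np.pi)) * (n ** 2 - n_b ** 2)

# Example: an 800 nm diameter bead (n = 1.59) immersed in oil (n_b = 1.515)
ax = np.linspace(-1.0, 1.0, 64)                     # micrometers
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
n = np.where(x**2 + y**2 + z**2 <= 0.4**2, 1.59, 1.515).astype(complex)
V = scattering_potential(n, n_b=1.515, wavelength=0.632)
```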

Next, to understand what happens to light when it passes through this volumetric sample, we define the complex field that results from illuminating the thick sample, U(r), as a sum of two fields: U(r)=Ui(r)+Us(r). Here, Ui(r) is the field “incident” upon the sample (i.e., from one LED) and Us(r) is the resulting field that “scatters” off of the sample. We may insert this decomposition into the scalar wave equation for light propagating through an inhomogeneous medium and use Green’s theorem to determine the scattered field as [11]

$$U_s(\mathbf{r}) = \int G(|\mathbf{r} - \mathbf{r}'|)\, V(\mathbf{r}')\, U(\mathbf{r}')\, d\mathbf{r}'.$$

Here, $G(|\mathbf{r} - \mathbf{r}'|)$ is the Green’s function connecting light scattered from various sample locations, denoted by $\mathbf{r}'$, to an arbitrary location $\mathbf{r}$. $V(\mathbf{r}')$ is the scattering potential from Eq. (1). Since $U(\mathbf{r}')$ is unknown at all sample locations, it is challenging to solve Eq. (2). Instead, it is helpful to apply the first Born approximation, which replaces $U(\mathbf{r}')$ in the integrand with $U_i(\mathbf{r}')$. This approximation assumes that $U_i(\mathbf{r}) \gg U_s(\mathbf{r})$. It is the first term in the Born expansion that describes the scattering response of an arbitrary sample [11]. It assumes a weakly scattering medium. Specifically, the first Born approximation remains valid when the relative index shift $\delta n = |n(\mathbf{r}) - n_b|$ and sample thickness $t$ obey the relation $k t\, \delta n / 2 \ll 1$ [43]. We expect FPT to remain quantitatively accurate with samples obeying this condition. By including higher-order terms, the above framework may in principle include samples with stronger scattering [46,47].
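The validity condition above is simple to evaluate before trusting a reconstruction. The helper below is our own (not from the paper); the example values match the 12 μm microsphere sample discussed in Section 4.A.

```python
import numpy as np

def first_born_metric(thickness, delta_n, wavelength):
    """Return k*t*delta_n/2; the first Born approximation requires this value to be << 1."""
    k = 2 * np.pi / wavelength                      # vacuum wavenumber
    return k * thickness * delta_n / 2

# 12 um bead with an index shift of 0.01 under 632 nm illumination (Section 4.A)
print(first_born_metric(thickness=12.0, delta_n=0.01, wavelength=0.632))
# prints ~0.6, in line with the ~0.59 figure quoted in Section 4.A
```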

Our system sequentially illuminates the sample with an LED array, which contains $q = q_x \times q_y$ sources positioned a large distance $l$ from the sample (in a uniform grid, with inter-LED spacing $c$; see Fig. 1). It is helpful to label each LED with a 2D counter variable $(j_x, j_y)$, where $-q_x/2 \le j_x \le q_x/2$ and $-q_y/2 \le j_y \le q_y/2$, as well as a single counter variable $j$, where $1 \le j \le q$. Assuming each LED acts as a spatially coherent and quasi-monochromatic source (central wavelength $\lambda$) placed at a large distance from the sample, the incident field takes the form of a plane wave traveling at a variable angle such that $\theta_{jx} = \tan^{-1}(j_x \cdot c / l)$ and $\theta_{jy} = \tan^{-1}(j_y \cdot c / l)$ with respect to the x- and y-axes, respectively. We may express the jth field incident upon the sample as

$$U_i^{(j)}(\mathbf{r}) = \exp\!\left(1i\, \mathbf{k}_j \cdot \mathbf{r}\right),$$
where $\mathbf{k}_j$ is the wave vector of the jth LED plane wave,
$$\mathbf{k}_j = (k_{jx}, k_{jy}, k_{jz}) = k\left(\sin\theta_{jx},\ \sin\theta_{jy},\ \sqrt{1 - \sin^2\theta_{jx} - \sin^2\theta_{jy}}\right).$$

As θjx and θjy vary, kj will always assume values along a spherical shell in 3D (kx,ky,kz) space (i.e., the Ewald sphere), since the value of kjz is a deterministic function of kjx and kjy.
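For concreteness, the illumination wave vectors of Eq. (4) follow directly from the LED grid geometry. The sketch below uses our own names and the nominal array parameters reported in Section 4 (4 mm pitch, 135 mm standoff, 31×31 LEDs).

```python
import numpy as np

def led_wavevectors(pitch, distance, n_side, wavelength):
    """Wave vectors k_j = k*(sin(th_jx), sin(th_jy), sqrt(1 - sin^2 - sin^2)) for an
    n_side x n_side LED grid centered on the optical axis (Eq. 4)."""
    k = 2 * np.pi / wavelength
    idx = np.arange(n_side) - (n_side - 1) / 2        # LED counters j_x, j_y
    jx, jy = np.meshgrid(idx, idx, indexing="ij")
    sin_x = np.sin(np.arctan(jx * pitch / distance))  # theta_jx = atan(j_x * c / l)
    sin_y = np.sin(np.arctan(jy * pitch / distance))
    kz = k * np.sqrt(np.clip(1 - sin_x**2 - sin_y**2, 0.0, None))
    return np.stack([k * sin_x, k * sin_y, kz], axis=-1)   # shape (n_side, n_side, 3)

k_j = led_wavevectors(pitch=4.0, distance=135.0, n_side=31, wavelength=632e-6)  # units: mm
```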

After replacing U(r) in Eq. (2) with Ui(j)(r) from Eq. (3) and additionally approximating the Green’s function G as a far-field response, the following relationship emerges between the scattering potential V and the Fourier transform of the jth scattered field, U^s(j)(k), in the far field [11]:

$$\hat{U}_s^{(j)}(\mathbf{k}) = \hat{V}(\mathbf{k} - \mathbf{k}_j).$$

We refer to $\hat{V}(\mathbf{k})$ as the k-space scattering potential, which is the three-dimensional Fourier transform of $V(\mathbf{r})$, with $\mathbf{k}$ the scattered wave vector in the far field. For simplicity, we have left out a multiplicative pre-factor ($1i\pi/k_z$) on the right-hand side of Eq. (5), and instead assume it is included within the function $\hat{V}$ for the remainder of this presentation. The field scattered by the sample and viewed at a large distance, $\hat{U}_s^{(j)}(\mathbf{k})$, is given by the values along a specific manifold (or spherical “shell”) of the k-space scattering potential, here written as $\hat{V}(\mathbf{k} - \mathbf{k}_j)$. We illustrate the geometric connection between $\hat{V}(\mathbf{k} - \mathbf{k}_j)$ and $\hat{U}_s^{(j)}(\mathbf{k})$ for a 2D optical geometry in Fig. 2(b). The center of the jth shell is defined by the incident wave vector, $\mathbf{k}_j$. For a given shell center, each value of $\hat{V}(\mathbf{k} - \mathbf{k}_j)$ lies on a spherical surface at a radial distance of $|\mathbf{k}| = k$ [see colored arcs in Fig. 2(b)]. As $\mathbf{k}_j$ varies with the changing LED illumination, the shell center shifts along a second shell with the same radius [since $\mathbf{k}_j$ is itself constrained to lie on an Ewald sphere; see gray circle in Fig. 2(b)].

Fig. 2. Mathematical summary of FPT. (a) The field from the jth LED scatters through the sample and exits its top surface as $U_j(x)$ (in 2D). This field forms $\hat{U}_j(k_x)$ at the microscope back focal plane, where it is bandlimited by the microscope aperture $a(k_x)$ before propagating to the image plane to form the jth sampled intensity image. (b) Under the first Born approximation, each detected image is the squared magnitude of the Fourier transform of one colored “shell” in $(k_x, k_z)$ space. (c) By filling in this space with a ptychographic phase-retrieval algorithm, FPT reconstructs the complex values within the finite bandpass volume $\hat{V}_e(k_x, k_z)$ (color indicates expected bowl overlap for this example). The Fourier transform of this reconstruction yields our complex refractive index map with resolutions $\Delta x$ and $\Delta z$ along x and z.

The goal of DT is to determine all the complex values within the volumetric function $\hat{V}$ from a set of $q$ scattered fields, $\{\hat{U}_s^{(j)}\}_{j=1}^{q}$, which is often measured holographically [12,15]. Each 2D holographic measurement maps to the complex values of $\hat{V}$ along one 2D shell. The values from multiple measurements [i.e., the multiple shells in Fig. 2(b)] can be combined to form a k-space scattering potential estimate, $\hat{V}_e$. Nearly all stationary optical setups will yield only an estimate, since it is challenging to measure data from the entire k-space scattering potential without rotating the sample. Figures 1(c) and 2(c) display typical measurable volumes, also termed a bandpass, from a limited-angle illumination and detection setup. Once sampled, an inverse 3D Fourier transform of the band-limited $\hat{V}_e(\mathbf{k})$ yields the desired complex scattering potential estimate, $V_e(\mathbf{r})$, which contains the quantitative index of refraction.

In FPT, we do not measure the scattered fields holographically. Instead, we use a standard microscope to detect image intensities and apply a ptychographic phase-retrieval algorithm to solve for the unknown complex potential. The scattered fields in Eq. (5) are defined at the microscope objective back focal plane (i.e., its Fourier plane), whose 2D coordinates $\mathbf{k}_{2D} = (k_x, k_y)$ are Fourier conjugate to the microscope focal plane coordinates $(x, y)$. If we neglect the effect of the constant background plane wave term (i.e., $U_i$ in the sum $U = U_i + U_s$), we may now write the jth shifted field at our microscope back focal plane as $\hat{U}^{(j)}(\mathbf{k}_{2D}) = \hat{V}(\mathbf{k}_{2D} - \mathbf{k}_{j,2D},\ k_z - k_{jz})$. These new coordinates highlight the 3D to 2D mapping from $\hat{V}$ to $\hat{U}$, where again $k_z = \sqrt{k^2 - k_x^2 - k_y^2}$ is a deterministic function of $\mathbf{k}_{2D}$, and the same applies between $k_{jz}$ and $\mathbf{k}_{j,2D}$.

Each shifted, scattered field is then bandlimited by the microscope aperture function, $a(\mathbf{k}_{2D})$, before propagating to the image plane. The limited extent of $a(\mathbf{k}_{2D})$ (defined by the imaging system NA) sets the maximum extent of each shell along $k_x$ and $k_y$. The jth intensity image acquired by the detector is given by the squared magnitude of the Fourier transform of the bandlimited field at the microscope back focal plane:

$$g(x, y, j) = \left|\mathcal{F}\left[\hat{V}(\mathbf{k}_{2D} - \mathbf{k}_{j,2D},\ k_z - k_{jz}) \cdot a(\mathbf{k}_{2D})\right]\right|^2.$$

Here, F denotes a 2D Fourier transform with respect to k2D, and we neglect the effects of magnification (for simplicity) by assuming the image plane coordinates match the sample plane coordinates, (x,y). The goal of FPT is to determine the complex 3D function V^ from the real, non-negative data matrix g(x,y,j). A final 3D Fourier transform of V^ yields the desired scattering potential, and subsequently the refractive index distribution, of the thick sample.
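To make Eq. (6) concrete, the sketch below simulates one raw image from a k-space scattering potential estimate, using a nearest-voxel lookup onto the shifted Ewald shell (mirroring the no-interpolation choice described in Section 3.B). It is a minimal sketch under our own discretization and naming conventions, not the authors' code.

```python
import numpy as np

def forward_image(V_hat, k_axes, k_j, pupil, KX, KY, k):
    """Simulate the j-th intensity image of Eq. (6).

    V_hat  : 3D complex array, k-space scattering potential sampled on k_axes
    k_axes : (kx_ax, ky_ax, kz_ax) 1D arrays defining the 3D k-space grid
    k_j    : (kjx, kjy, kjz), illumination wave vector of the j-th LED
    pupil  : 2D complex aperture a(kx, ky) on the (KX, KY) grid (zero outside the NA)
    KX, KY : 2D arrays of back-focal-plane spatial frequencies
    k      : vacuum wavenumber
    """
    KZ = np.sqrt(np.clip(k**2 - KX**2 - KY**2, 0.0, None))   # Ewald-sphere k_z(k_x, k_y)
    kx_ax, ky_ax, kz_ax = k_axes
    # nearest-voxel indices of the shifted coordinates (k - k_j), brute-force for clarity
    ix = np.abs(kx_ax[None, None, :] - (KX - k_j[0])[..., None]).argmin(-1)
    iy = np.abs(ky_ax[None, None, :] - (KY - k_j[1])[..., None]).argmin(-1)
    iz = np.abs(kz_ax[None, None, :] - (KZ - k_j[2])[..., None]).argmin(-1)
    shell = V_hat[ix, iy, iz] * pupil                        # band-limited 2D shell
    field = np.fft.fft2(np.fft.ifftshift(shell))             # F[...] in Eq. (6)
    return np.abs(field) ** 2                                # detected intensity g(x, y, j)
```

The transform direction chosen here is a convention; it only introduces a coordinate flip at the image plane.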

B. FPT Reconstruction Algorithm

Equation (6) closely resembles the data matrix measured by FP [40], but now the intensities are sampled from shells within a 3D space (i.e., the curves in Fig. 2). We use an iterative reconstruction procedure, mirroring that from FP [40], to “fill in” the k-space scattering potential with data from each recorded intensity image. Ptychography and FP require at least approximately 50%–60% data redundancy (i.e., overlapping measurements in k-space) to ensure the successful convergence of the phase retrieval process [48]. With such a similar problem structure, FPT will also require overlap between shell regions in 3D k-space. With one extra dimension, overlap is less frequent and more images are needed for an accurate reconstruction. Both a smaller LED array pitch and a larger array-sample distance along z increase the amount of k-space overlap. An example cross section of FPT k-space overlap is shown in Fig. 2(c). As we demonstrate experimentally, several hundred images are sufficient for a complex reconstruction that offers a 4× increase in resolution along (x,y) and contains approximately 30 unique axial slices. Additional overlap (i.e., more images across the same angular range) will increase robustness to noise.

It is important to select the correct limits and discretization of 3D k-space (i.e., the FOV and resolution of the complex sample reconstruction). The maximum resolvable wave vector along $k_x$ and $k_y$ is proportional to $k(\mathrm{NA}_o + \mathrm{NA}_i)$, where $\mathrm{NA}_o$ is the objective NA and $\mathrm{NA}_i$ is the maximum NA of the LED illumination. This lateral spatial resolution limit matches FP [49]. The maximum resolvable wave vector range along $k_z$ is also determined as a function of the objective and illumination NA as $k_{z,\max} = k\left(2 - \sqrt{1 - \mathrm{NA}_o^2} - \sqrt{1 - \mathrm{NA}_i^2}\right)$. As shown in Ref. [12], this relationship is easily derived from the geometry of the k-space bandpass volume in Fig. 2. We typically specify the maximum imaging range along the axial dimension, $z_{\max}$, to approximately match twice the expected sample thickness. This then sets the discretization level along $k_z$: $\Delta k_z = 2\pi / z_{\max}$. The total number of resolved slices along z is set by the ratio $k_{z,\max} / \Delta k_z$.
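These sizing rules can be collected into a small helper (names are our own); plugging in the experimental parameters of Section 4 reproduces the 0.78 μm lateral period, 3.7 μm axial resolution, and roughly 30 axial slices quoted in the text.

```python
import numpy as np

def fpt_grid(wavelength, NA_obj, NA_illum, z_max):
    """Lateral/axial k-space extents, resolutions, and axial slice count for FPT."""
    k = 2 * np.pi / wavelength
    k_lat_max = k * (NA_obj + NA_illum)                        # max |k_x|, |k_y|
    k_z_max = k * (2 - np.sqrt(1 - NA_obj**2) - np.sqrt(1 - NA_illum**2))
    dk_z = 2 * np.pi / z_max                                   # axial discretization
    return {"lateral_period": 2 * np.pi / k_lat_max,           # full-period resolution
            "axial_resolution": 2 * np.pi / k_z_max,
            "n_axial_slices": int(round(k_z_max / dk_z))}

print(fpt_grid(wavelength=0.632, NA_obj=0.40, NA_illum=0.41, z_max=110.0))
# -> lateral period ~0.78 um, axial resolution ~3.7 um, ~30 slices
```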

We now summarize the FPT reconstruction algorithm:

1. Initialize a discrete estimate of the unknown k-space scattering potential, $\hat{V}_e(\mathbf{k})$, using an appropriate 3D array size (see above). In our experiments, we form a refocused light field with the raw intensity image set and use its 3D Fourier transform for initialization [33]. However, we have noticed that simpler alternative initializers, such as the 3D Fourier transform of a single raw image padded along all three dimensions or a 3D array containing a constant value, also often lead to an accurate reconstruction.
2. For $j = 1$ to $q$ images, compute the center coordinate, $\mathbf{k}_j$, and select values along its associated shell (radius $k$, maximum width $2k \cdot \mathrm{NA}_o$). This selection process samples a discrete 2D function, $\hat{d}_j(k_x, k_y)$, from the 3D k-space volume. The selected voxels must partially overlap with voxels from adjacent shells. Currently, no interpolation is used to map voxels from the discrete shell to pixels within $\hat{d}_j(k_x, k_y)$.
3. Fourier transform $\hat{d}_j(k_x, k_y)$ to the image plane to create $d_j(x, y)$ and constrain its amplitudes to match the measured amplitudes from the jth image. For our experiments, we use the amplitude update form $d_j'(x, y) = \sqrt{g(x, y, j)} \cdot d_j(x, y) / |d_j(x, y)|$. More advanced alternating projection-based updates are also available [50].
4. Inverse 2D Fourier transform the image plane update, $d_j'(x, y)$, back to 2D k-space to form $\hat{d}_j'(k_x, k_y)$. Use the values of $\hat{d}_j'(k_x, k_y)$ to replace the voxel values of $\hat{V}_e(\mathbf{k})$ at locations where voxel values were extracted in step 2.
5. Repeat steps 2–4 for all $j = 1$ to $q$ images. This completes one iteration of the FPT algorithm. Continue for a fixed number of iterations, or until satisfying some error metric. At the end, inverse 3D Fourier transform $\hat{V}_e(\mathbf{k})$ to recover the complex scattering potential, $V_e(\mathbf{r})$.

In practice, we also implement a pupil function recovery procedure [27] as we update each extracted shell from k-space, which helps remove possible microscope aberrations. As with other alternating projections-based ptychography solvers, the per-iteration cost of the above FPT algorithm is $O(n \log n)$ in big-O notation. Additional details regarding algorithm robustness and convergence are in Supplement 1.
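Putting steps 2–5 above together, one pass of the reconstruction might be sketched as follows. This is our own minimal implementation (shell voxel indices are assumed precomputed from each $\mathbf{k}_j$ per Eq. (5); pupil recovery and the light-field initialization are omitted), not the authors' code.

```python
import numpy as np

def fpt_iteration(V_hat, images, shell_indices, pupil_mask):
    """One alternating-projections pass over all q images (steps 2-5 of Section 3.B).

    V_hat         : 3D complex array, current k-space scattering potential estimate
    images        : iterable of measured intensity images g(x, y, j), assumed sampled
                    on the same grid/layout as the transformed shell below
    shell_indices : per image, (ix, iy, iz) integer arrays mapping the 2D shell
                    d_hat_j(kx, ky) onto voxels of V_hat
    pupil_mask    : 2D boolean array, True inside the objective aperture a(k_2D)
    """
    for g_j, (ix, iy, iz) in zip(images, shell_indices):
        # Step 2: extract the j-th Ewald shell from the 3D k-space estimate
        d_hat = np.where(pupil_mask, V_hat[ix, iy, iz], 0)
        # Step 3: propagate to the image plane and impose the measured amplitudes
        d = np.fft.fft2(np.fft.ifftshift(d_hat))
        d_new = np.sqrt(g_j) * d / (np.abs(d) + 1e-12)
        # Step 4: return to 2D k-space and write the update back into the 3D volume
        d_hat_new = np.fft.fftshift(np.fft.ifft2(d_new))
        V_hat[ix[pupil_mask], iy[pupil_mask], iz[pupil_mask]] = d_hat_new[pupil_mask]
    return V_hat  # after the final iteration, an inverse 3D FFT yields V_e(r)
```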

4. RESULTS AND DISCUSSION

We experimentally verify our reconstruction technique using a standard microscope outfitted with an LED array. The microscope uses an infinity-corrected objective lens (NAo=0.4, Olympus MPLN, 20×) and a digital detector containing 4.54 μm pixels (Prosilica GX 1920, 1936×1456 pixel count). The LED array contains 31×31 surface-mounted elements (model SMD3528, center wavelength λ=632 nm, 20 nm approximate bandwidth, 4 mm LED pitch, 150 μm active area diameter). We position the LED array 135 mm beneath the sample to create a maximum illumination NA of NAi=0.41. This leads to an effective lateral NA of NAo+NAi=0.81 and a lateral resolution gain along (x,y) of slightly over a factor of 2 (from a 1.6 μm minimum resolved spatial period in the raw images to a 0.78 μm minimum resolved spatial period in the reconstruction). The associated axial resolution at the Nyquist–Shannon sampling limit is 3.7 μm. We reconstruct samples across a total depth range of approximately zmax=110 μm, which is approximately 20 times larger than the stated objective lens DOF of 5.8 μm.
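For reference (our own back-of-envelope check, not from the paper), the quoted illumination NA is consistent with the array geometry if the most oblique LED used lies along one array axis, 15 LED pitches (60 mm) off center:

$$\mathrm{NA}_i = \sin\!\left[\tan^{-1}\!\left(\frac{60\ \mathrm{mm}}{135\ \mathrm{mm}}\right)\right] = \frac{60}{\sqrt{60^2 + 135^2}} \approx 0.41.$$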

For most of the reconstructions presented below, we capture and process q=675 images from the same fixed pattern of LEDs. Some LEDs from within the 31×31 array produce images containing a shadow of the microscope back focal plane. We do not use these LEDs so as to avoid certain reconstruction artifacts. We typically use the following parameters for FPT reconstruction: each raw image is cropped to 1000×1000 pixels, the reconstruction voxel size is 0.39μm×0.39μm×3.7μm for Nyquist–Shannon rate sampling, the reconstruction array contains approximately 2100×2100×30 voxels (110 μm total depth), and the algorithm runs for 5 iterations. The first Born approximation should remain valid across this total imaging depth. The primarily transparent and somewhat sparse samples that we test next justify the relatively large 110 μm depth that we typically use.

A. Quantitative Verification

First, we verify the ability of FPT to improve the lateral image resolution. The sample consists of 800 nm-diameter microspheres (index of refraction ns=1.59) immersed in oil (index of refraction no=1.515). We highlight a small group of these microspheres in Fig. 3. The single raw image in Fig. 3(a) (generated from the center LED) cannot resolve the individual spheres gathered in small clusters. Based upon the coherent Sparrow limit for resolving two points (0.68λ/NAo), this raw image cannot resolve points that are closer than 1.1 μm. After FPT reconstruction, we obtain the complex index of refraction in Fig. 3(b), where we show the real component of the recovered index. The FPT reconstruction along the Δz=0 slice clearly resolves the spheres within each cluster. This 800 nm distance is close to the expected Sparrow limit for the FPT reconstruction: 0.68λ/(NAo+NAi)=540nm. The ringing features around each sphere indicate a jinc-like point-spread function, as expected theoretically from the circular shape of the finite FPT bandpass along kx,ky [in Fig. 1(c)] for all reconstructions. The resulting constructive interference forms the undesired dip feature at the center of each cluster.

Fig. 3. Improved lateral resolution with FPT. (a) Single raw image of 0.8 μm microspheres. Beads within each cluster are not resolved. (b) Refractive index (real) from Δz=0 slice (1 of 30) of the FPT reconstruction, resolving each microsphere.

Second, we check the quantitative accuracy of FPT by imaging microspheres that extend across more than just a few reconstruction voxels. Figure 4 displays a reconstruction of 12 μm diameter microspheres (index of refraction ns=1.59) immersed in oil (index of refraction no=1.58). We use the same data capture and post-processing steps as in Fig. 3. Here, we display a cropped section (200×200×15 voxels) of the full 3D reconstruction, which required 221 seconds of computation time on a standard laptop. We again display the real (non-absorptive) component of the recovered index across both a lateral slice (along the Δz=0 plane) and a vertical slice (along the Δy=25μm plane). We also include detailed 1D traces along the center of the vertical slice.

Fig. 4. FPT quantitatively measures refractive index in 3D. (a) Tomographic reconstruction of 12 μm microspheres in oil with lateral (Δz=0) slice on left, axial (Δy=25μm) slice on right, and one-dimensional plots of index shift along both x and z. (b) Digitally propagated FP reconstruction (middle) and refocused light field (right) created from the same data. FPT (left) best matches the expected spherical bead profile.

Three observations here are noteworthy. First, the measured index shift approximately matches the expected shift of $\Delta n = n_s - n_o = 0.01$ across the entire bead, thus demonstrating quantitatively accurate performance across this limited volume. Currently, we only expect quantitative recovery for samples that meet the first Born approximation condition ($k t\, \delta n / 2 \ll 1$). This 12 μm microsphere sample approximately satisfies the condition ($k t\, \delta n / 2 = 0.59$). Second, for any given one-dimensional trace through a microsphere center, we would ideally expect a perfect rect function (from $\Delta n = 0$ to $\Delta n = 0.01$). This is unlike 2D FP, which reconstructs the phase delay through each sphere and forms a parabolically shaped phase measurement (due to the varying thickness of each sphere along the optical axis). While FPT resolves an approximate step function through the center of the sphere along the lateral (x) dimension, it does not along the axial (z) direction. This is caused by the limited volume of 3D k-space that FPT measures (i.e., the limited bandpass or “missing cone” of information surrounding the $k_z$ axis). While our stationary sample/detector setup cannot avoid this missing cone, various methods are available to computationally fill it in [51].

Finally, we compare FPT with two alternative techniques for 3D imaging in Fig. 4(b). First, we use the same dataset to perform 2D FP and then holographically refocus its reconstructed optical field. We obtain this FP reconstruction using the same number of images (q=675) and follow the procedure in Ref. [40] after focusing the objective lens at the axial center of the 12 μm microspheres. The “out-of-focus noise” above and below the plane of the microsphere, created by digital propagation of the complex field via the angular spectrum method, noticeably hides its spherical shape. Second, we interpret the same raw image set as a light field and perform light-field refocusing [5]. While the refocused light field approximately resolves the outline of the microsphere along z, it does not offer a quantitative picture of the sample interior, nor a measure of its complex index of refraction. The areas above the microsphere are very bright due to its lensing effect (i.e., the light field displays the optical intensity at each plane and thus displays high energy where the microsphere focuses light).

We verify the axial resolution of FPT in Fig. 5 using a sample containing two closely separated layers of 2 μm microspheres (ns=1.59) distributed across the surface of a glass slide with oil in between (no=1.515). The axial separation between the two microsphere layers, measured from the center of each sphere along z, is 3.9 μm [i.e., the separation between the microscope slide surfaces is 5.9 μm; see Fig. 5(a)]. This almost matches the expected axial resolution limit of 3.7 μm for the FPT microscope.

Fig. 5. Testing the axial resolution of FPT. (a) The sample contains two layers of microspheres separated by a thin layer of oil. Raw images (b) focused at the center of the two layers and (c) on the top layer do not clearly resolve overlapping microspheres. (d)–(f) Slices of the FPT tomographic reconstruction, showing |Δn|, clearly resolve each sphere within the two individual sphere layers.

Conventional microscope images of the sample, using the center LED for illumination, are in Figs. 5(b) and 5(c). Here, we focus on the center of the two layers (Δz=0) as well as the top microsphere layer (Δz=1.9μm) in an attempt to distinguish the two separate layers. At the top of each image (where microspheres in the two layers overlap), it is especially hard to resolve each sphere or determine which sphere is in a particular layer. These challenges are due in part to the limited amount of information contained within the optical intensity at each plane, as opposed to the sample’s complex refractive index.

Next, we return the focus to the Δz=0 plane and implement FPT. We display three slices of our 3D scattering potential reconstruction in Figs. 5(d)–5(f). Here, we show the absolute values of the potential near the plane of the top layer, at the center, and near the plane of the bottom layer. The originally indistinguishable spheres within the top and bottom layers are now clearly resolved in each z-plane. Due to the system’s limited axial resolution, the reconstruction at the middle plane (Δz=0) still shows the presence of spheres from both layers. Comparing Figs. 5(b) and 5(c) with Figs. 5(e) and 5(f), it is clear that the axial resolution of FPT is sharper than manual refocusing. Not only is each sphere layer distinguishable (as predicted theoretically), but we now also have quantitative information about the sample’s complex refractive index.

B. Biological Experiments

For our first biological demonstration, we reconstruct a Trichinella spiralis parasite in 3D (see Fig. 6 and Visualization 1 for the complete tomogram). Since the worm extended along a larger distance than the width of our detector, we performed FPT twice and shifted the FOV in between to capture the left and right sides of the worm. We then merged the two tomographic reconstructions together with a simple averaging operation (matching that from FP [40], 10% overlap). The total captured volume here is 0.8 mm × 0.4 mm × 110 μm. If our setup included a digital detector that occupied the entire microscope FOV, the fixed imaging volume would be 1.1 mm × 1.1 mm × 110 μm, and no movement would be needed for this example.

Fig. 6. Tomographic reconstruction of a Trichinella spiralis parasite. (a) The worm’s curved trajectory resolved within various z planes. (b) Refocusing the same distance to each respective plane does not clearly distinguish each in-focus worm segment (marked by white arrows). Since the worm is primarily transparent, in-focus worm sections exhibit minimal intensity contrast, presenting significant challenges for segmentation (see intensity along each black dash in inset plots, where black dash location is in-focus in left image). FPT, on the other hand, exhibits maximum contrast at each worm voxel. See Visualization 1.

A thresholded 3D reconstruction of the parasite index is at the top of Fig. 6 (real component, threshold applied at Re[Δn] > 0.7 after |Δn| normalized to 1, under-sampled for clarity). The maximum real index variation across the tomogram before normalization is approximately 0.06, which can be seen without thresholding in Visualization 1. The worm’s 3D curved trajectory is especially clear in the three separate z-slices of the reconstructed tomogram in Fig. 6(a). The two downward bends in the parasite body lie lower than the upward bend in the middle, as well as than its front and back ends. It is very challenging to resolve these depth-dependent sample features by simply refocusing a standard microscope. Figure 6(b) displays such an attempt, where the same three z planes are brought into focus manually. Since the sample is primarily transparent, in-focus areas in each standard image actually exhibit minimal contrast, as marked by arrows in Fig. 6(b). We plot the intensity through a fixed worm section (black dash) in each of the three insets. The intensity contrast drops by over a factor of 2 at in-focus locations, which will pose a significant challenge to any depth segmentation technique (e.g., focal stack deconvolution [4]). Since FPT effectively offers 3D phase contrast, points along the parasite instead show maximum contrast within their reconstruction voxels, which enables direct segmentation via thresholding, as shown in the plots in Fig. 6(a).

For our second 3D biological example, we tomographically reconstruct a starfish embryo at its larval stage [see Fig. 7(a) and Visualization 2]. Here, we again show three different closely spaced z-slices of the reconstructed scattering potential (Re[Δn], no thresholding applied). Each z-slice contains sample features that are not present in the adjacent z-slices. For example, the large oval structure in the upper left of the Δz=0 plane, which is a developing stomach, nearly completely disappears in the Δz=3.7μm plane. Now at this z-slice, however, small structures, which we expect to be developing mesenchyme cells [52] and various epithelial cells [53], clearly appear in the lower right. We confirm the presence of these structures with a differential interference contrast (DIC) confocal microscope in Fig. 7(c) (Zeiss LSM 510, 0.8 NA, λ=633nm, 0.2 μm scan step). A DIC confocal scan is one of the few possible imaging options for this thick and primarily transparent sample, but suffers from a much smaller FOV (approximately 4% of the total effective FPT FOV). It also does not quantitatively measure the refractive index and requires mechanical scanning. Finally, we attempt to refocus through the embryo using a standard microscope (NA=0.4) in Fig. 7(b). Both the particular plane of the developing stomach and even the presence of the mesenchyme cells are completely missing from the refocused images. This is due to the inability of the standard microscope to segment each particular plane of interest, the inability to accurately reconstruct transparent structures without a phase contrast mechanism, and an inferior lateral resolution with respect to FPT.

Fig. 7. 3D reconstruction of a starfish embryo at larval stage. (a) Three different axial planes of the FPT tomogram show significant feature variation (e.g., the protocoel is completely missing from the Δz=3.7 μm plane, while the expected developing mesenchyme cells are only visible in the Δz=3.7 μm plane). (b) Such axial information, and even certain structures [e.g., mesenchyme cells and various epithelial cells, marked in (a)] are completely missing from standard microscope images after manual refocusing. (c) A high-resolution DIC confocal scan of the Δz=3.7 μm plane confirms the presence of the structures of interest. See Visualization 2.

5. CONCLUSIONS

We have performed diffraction tomography using intensity measurements captured with a standard microscope and an LED illuminator. The current system offers a lateral resolution of approximately 400 nm at the Nyquist–Shannon sampling limit (550 nm at the Sparrow limit and an 800 nm full-period limit) and an axial resolution of 3.7 μm at the sampling limit. The maximum axial extent attempted thus far was 110 μm along z, and we demonstrated quantitative measurement of the complex index of refraction through several types of thick specimens with contiguous features.

To improve the experimental setup, an alternative LED array geometry that enables a higher illumination angle would increase the resolution. Also, we set the number of captured images here to match the data redundancy required by ptychography [48]. However, we have observed that reconstructions are successful with far fewer images than otherwise expected. Along with using a multiplexed illumination strategy [28], this may help speed up the tomogram capture time. In addition, we did not explicitly account for the finite LED spectral bandwidth or attempt polychromatic capture, which can potentially provide additional information about volumetric samples [32,35]. Finally, we set our reconstruction range along the z-axis somewhat arbitrarily at 110 μm. We expect to further extend this axial range in the future.

Subsequent work should also examine connections between FPT, multi-slice-based techniques for ptychography [33,34], and machine learning for 3D reconstruction [54]. While prior work already specifies sample conditions under which the first Born [43] and multi-slice [42] approximations remain accurate, it is not yet clear if this translates directly to reconstructions that require phase retrieval. By merging these approaches, it may be possible to increase the domain of sample validity beyond what is currently achieved by each technique independently.

Finally, FPT may also adopt alternative computational tools to help improve ptychographic DT under the first Born approximation. We used the well-known alternating projections phase retrieval update. Other solvers based upon convex optimization [55] or alternative gradient descent techniques [56,57] may perform better in the presence of noise. Alternative approximations besides the first Born approximation (e.g., Rytov [15]) are also available to simplify the Born series. In addition, the resolution is currently impacted by the missing cone in 3D k-space, and various methods are available to fill this cone in by assuming the sample is positive only, sparse, or of finite spatial support [51]. Finally, methods exist to solve for the full Born series by taking into account the effects of multiple scattering [46,47]. Connecting this type of multiple scattering solver to FPT may aid with the reconstruction of increasingly turbid biological samples.

Funding

National Institutes of Health (NIH) (1R01AI096226-01); The Caltech Innovation Initiative (CI2) Program (13520135).

Acknowledgment

The authors would like to thank J. Brake, B. Judkewitz, and I. Papadopoulos for the helpful discussions and feedback.

 

See Supplement 1 for supporting content.

REFERENCES

1. V. Ntziachristos, “Going deeper than microscopy: the optical imaging frontier in biology,” Nat. Methods 7, 603–614 (2010). [CrossRef]  

2. K. Chung and K. Deisseroth, “CLARITY for mapping the nervous system,” Nat. Methods 10, 508–513 (2013). [CrossRef]  

3. F. Chen, P. W. Tillberg, and E. S. Boyden, “Expansion microscopy,” Science 347, 543–548 (2015). [CrossRef]  

4. D. A. Agard, “Optical sectioning microscopy: cellular architecture in three dimensions,” Annu. Rev. Biophys. Bioeng. 13, 191–219 (1984). [CrossRef]  

5. M. Broxton, L. Grosenick, S. Yang, N. Cohen, A. Andalman, K. Deisseroth, and M. Levoy, “Wave optics theory and 3D deconvolution for the light field microscope,” Opt. Express 21, 25418–25439 (2013). [CrossRef]  

6. S. R. P. Pavani and R. Piestun, “Three dimensional tracking of fluorescent microparticles using a photon-limited double-helix response system,” Opt. Express 16, 22048–22057 (2008). [CrossRef]  

7. A. Dubois, L. Vabre, A. C. Boccara, and E. Beaurepaire, “High-resolution full-field optical coherence tomography with a Linnik microscope,” Appl. Opt. 41, 805–812 (2002). [CrossRef]  

8. S. G. Adie, B. W. Graf, A. Ahmad, P. S. Carney, and S. A. Boppart, “Computational adaptive optics for broadband optical interferometric tomography of biological tissue,” Proc. Natl. Acad. Sci. USA 109, 7175–7180 (2012). [CrossRef]  

9. T. E. Matthews, M. Medina, J. R. Maher, H. Levinson, W. J. Brown, and A. Wax, “Deep tissue imaging using spectroscopic analysis of multiply scattered light,” Optica 1, 105–111 (2014). [CrossRef]  

10. N. Streibl, “Three-dimensional imaging by a microscope,” J. Opt. Soc. Am. A 2, 121–127 (1985). [CrossRef]  

11. E. Wolf, “Three-dimensional structure determination of semi-transparent objects from holographic data,” Opt. Commun. 1, 153–156 (1969). [CrossRef]  

12. V. Lauer, “New approach to optical diffraction tomography yielding a vector equation of diffraction tomography and a novel tomographic microscope,” J. Microsc. 205, 165–176 (2002). [CrossRef]  

13. M. Debailleul, B. Simon, V. Georges, O. Haeberle, and V. Lauer, “Holographic microscopy and diffractive microtomography of transparent samples,” Meas. Sci. Technol. 19, 074009 (2008). [CrossRef]  

14. Y. Cotte, F. Toy, P. Jourdain, N. Pavillon, D. Boss, P. Magistretti, P. Marquet, and C. Depeursinge, “Marker-free phase nanoscopy,” Nat. Photonics 7, 113–117 (2013). [CrossRef]  

15. Y. Sung, W. Choi, C. Fang-Yen, K. Badizadegan, R. R. Dasari, and M. S. Feld, “Optical diffraction tomography for high resolution live cell imaging,” Opt. Express 17, 266–277 (2009). [CrossRef]  

16. K. Kim, Z. Yaqoob, K. Lee, J. W. Kang, Y. Choi, P. Hosseini, T. C. So, and Y. Park, “Diffraction optical tomography using a quantitative phase imaging unit,” Opt. Lett. 39, 6935–6938 (2014). [CrossRef]  

17. A. J. Devaney, “Structure determination from intensity measurements in scattering experiments,” Phys. Rev. Lett. 62, 2385–2388 (1989). [CrossRef]  

18. M. H. Maleki, A. J. Devaney, and A. Schatzberg, “Tomographic reconstruction from optical scattered intensities,” J. Opt. Soc. Am. A 9, 1356–1363 (1992). [CrossRef]  

19. M. H. Maleki and A. J. Devaney, “Phase-retrieval and intensity-only reconstruction algorithms for optical diffraction tomography,” J. Opt. Soc. Am. A 10, 1086–1092 (1993). [CrossRef]  

20. T. C. Wedberg and J. J. Stamnes, “Comparison of phase retrieval methods for optical diffraction tomography,” Pure Appl. Opt. 4, 39–54 (1995). [CrossRef]  

21. T. Takenaka, D. J. N. Wall, H. Harada, and M. Tanaka, “Reconstruction algorithm of the refractive index of a cylindrical object from the intensity measurements of the total field,” Microwave Opt. Technol. Lett. 14, 182–188 (1997). [CrossRef]  

22. G. Gbur and E. Wolf, “Diffraction tomography without phase information,” Opt. Lett. 27, 1890–1892 (2002). [CrossRef]  

23. T. E. Gureyev, T. J. Davis, A. Pogany, S. C. Mayo, and S. W. Wilkins, “Optical phase retrieval by use of first Born and Rytov-type approximations,” Appl. Opt. 43, 2418–2430 (2004). [CrossRef]  

24. M. A. Anastasio, D. Shi, Y. Huang, and G. Gbur, “Image reconstruction in spherical-wave intensity diffraction tomography,” J. Opt. Soc. Am. A 22, 2651–2661 (2005). [CrossRef]  

25. Y. Huang and M. A. Anastasio, “Statistically principled use of in-line measurements in intensity diffraction tomography,” J. Opt. Soc. Am. A 24, 626–642 (2007). [CrossRef]  

26. M. D’Urso, K. Belkebir, L. Crocco, T. Isernia, and A. Litman, “Phaseless imaging with experimental data: facts and challenges,” J. Opt. Soc. Am. A 25, 271–281 (2008). [CrossRef]  

27. X. Ou, G. Zheng, and C. Yang, “Embedded pupil function recovery for Fourier ptychographic microscopy,” Opt. Express 22, 4960–4972 (2014). [CrossRef]  

28. L. Tian, X. Li, K. Ramchandran, and L. Waller, “Multiplexed coded illumination for Fourier Ptychography with an LED array microscope,” Biomed. Opt. Express 5, 2376–2389 (2014). [CrossRef]  

29. S. O. Isikman, W. Bishara, S. Mavandadi, F. W. Yu, S. Feng, R. Lau, and A. Ozcan, “Lens-free optical tomographic microscope with a large imaging volume on a chip,” Proc. Natl. Acad. Sci. USA 108, 7296–7301 (2011). [CrossRef]  

30. T. E. Gureyev, D. M. Paganin, G. R. Myers, Y. I. Nesterets, and S. W. Wilkins, “Phase-and-amplitude computer tomography,” Appl. Phys. Lett. 89, 034102 (2006). [CrossRef]  

31. A. V. Bronnikov, “Theory of quantitative phase-contrast computed tomography,” J. Opt. Soc. Am. A 19, 472–480 (2002). [CrossRef]  

32. T. Kim, R. Zhou, M. Mir, S. D. Babacan, P. S. Carney, L. L. Goddard, and G. Popescu, “White light diffraction tomography of unlabeled live cells,” Nat. Photonics 8, 256–263 (2014). [CrossRef]  

33. L. Tian and L. Waller, “3D intensity and phase imaging from light field measurements in an LED array microscope,” Optica 2, 104–111 (2015). [CrossRef]  

34. P. Li, D. J. Batey, T. B. Edo, and J. M. Rodenburg, “Separation of three-dimensional scattering effects in tilt-series Fourier ptychography,” Ultramicroscopy 158, 1–7 (2015). [CrossRef]  

35. T. D. Gerke and R. Piestun, “Aperiodic volume optics,” Nat. Photonics 10, 1–6 (2010).

36. M. Dierolf, A. Menzel, P. Thibault, P. Schneider, C. M. Kewish, R. Wepf, O. Bunk, and F. Pfeiffer, “Ptychographic X-ray computed tomography at the nanoscale,” Nature 467, 436–439 (2010). [CrossRef]  

37. A. M. Maiden, M. J. Humphry, and J. M. Rodenburg, “Ptychographic transmission microscopy in three dimensions using a multi-slice approach,” J. Opt. Soc. Am. A 29, 1606–1614 (2012). [CrossRef]  

38. T. M. Godden, R. Suman, M. J. Humphry, J. M. Rodenburg, and A. M. Maiden, “Ptychographic microscope for three-dimensional imaging,” Opt. Express 22, 12513–12523 (2014). [CrossRef]  

39. C. T. Putkunz, M. A. Pfeifer, A. G. Peele, G. J. Williams, H. M. Quiney, B. Abbey, K. A. Nugent, and I. McNulty, “Fresnel coherent diffraction tomography,” Opt. Express 18, 11746–11753 (2010). [CrossRef]  

40. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier ptychographic microscopy,” Nat. Photonics 7, 739–745 (2013). [CrossRef]  

41. J. V. Roey, J. V. Donk, and P. E. Lagasse, “Beam-propagation method: analysis and assessment,” J. Opt. Soc. Am. 71, 803–810 (1981). [CrossRef]  

42. D. M. Paganin, Coherent X-Ray Optics (Oxford University, 2006).

43. B. Chen and J. J. Stamnes, “Validity of diffraction tomography based on the first Born and the first Rytov approximations,” Appl. Opt. 37, 2996–3006 (1998). [CrossRef]  

44. K. Jaganathan, Y. C. Eldar, and B. Hassibi, “Phase retrieval: an overview of recent developments,” arXiv:1510.07713v1 (2015).

45. K. Nugent, “Coherent methods in the X-ray sciences,” Adv. Phys. 59, 1–99 (2010). [CrossRef]  

46. Y. M. Wang and W. C. Chew, “An iterative solution of the two-dimensional electromagnetic inverse scattering problem,” Int. J. Imaging Syst. Technol. 1, 100–108 (1989). [CrossRef]  

47. G. A. Tsihrintzis and A. J. Devaney, “Higher-order diffraction tomography: reconstruction algorithms and computer simulation,” IEEE Trans. Image Process. 9, 1560–1572 (2000). [CrossRef]  

48. O. Bunk, M. Dierolf, S. Kynde, I. Johnson, O. Marti, and F. Pfeiffer, “Influence of the overlap parameter on the convergence of the ptychographical iterative engine,” Ultramicroscopy 108, 481–487 (2008). [CrossRef]  

49. X. Ou, R. Horstmeyer, G. Zheng, and C. Yang, “High numerical aperture Fourier ptychography: principle, implementation and characterization,” Opt. Express 23, 3472–3491 (2015). [CrossRef]  

50. S. Marchesini, “A unified evaluation of iterative projection algorithms for phase retrieval,” Rev. Sci. Instrum. 78, 011301 (2007). [CrossRef]  

51. K. C. Tam and V. Perezmendez, “Tomographical imaging with limited-angle input,” J. Opt. Soc. Am. 71, 582–592 (1981). [CrossRef]  

52. G. Hamanaka, M. Matsumoto, M. Imoto, and H. Kaneko, “Mesenchyme cells can function to induce epithelial cell proliferation in starfish embryos,” Dev. Dyn. 239, 818–827 (2010). [CrossRef]  

53. A. Agassiz, Embryology of the Starfish, in Vol. 5 of L. Agassiz Contributions to the Natural History of the United States (Cambridge, 1864), 76 p.

54. U. S. Kamilov, I. N. Papadopoulos, M. H. Shoreh, A. Goy, C. Vonesch, M. Unser, and D. Psaltis, “Learning approach to optical tomography,” Optica 2, 517–522 (2015). [CrossRef]  

55. R. Horstmeyer, R. C. Chen, X. Ou, B. Ames, J. A. Tropp, and C. Yang, “Solving ptychography with a convex relaxation,” New. J. Phys. 17, 053044 (2015). [CrossRef]  

56. L. Bian, J. Suo, G. Zheng, K. Guo, F. Chen, and Q. Dai, “Fourier ptychographic reconstruction using Wirtinger flow optimization,” Opt. Express 23, 4856–4866 (2015). [CrossRef]  

57. L. H. Yeh, J. Dong, J. Zhong, L. Tian, M. Chen, G. Tang, M. Soltanolkotabi, and L. Waller, “Experimental robustness of Fourier ptychography phase retrieval algorithms,” Opt. Express 23, 33214–33240 (2015). [CrossRef]  

Supplementary Material (3)

Supplement 1 (PDF, 7846 KB): Supplemental document.
Visualization 1 (MP4, 7134 KB): Tomographic reconstruction of a Trichinella spiralis parasite.
Visualization 2 (MP4, 11279 KB): 3D reconstruction of a starfish embryo at larval stage.

