
Simplified approach to diffraction tomography in optical microscopy


Abstract

We present a novel microscopy technique to measure the scattered wavefront emitted from an optically transparent microscopic object. The complex amplitude is decoded via phase stepping in a common-path interferometer, enabling high mechanical stability. We demonstrate theoretically and practically that the incoherent summation of multiple illumination directions into a single image increases the resolving power and facilitates image reconstruction in diffraction tomography. We propose a slice-by-slice object-scatter extraction algorithm based entirely in real space, in combination with ordinary z-stepping. Thereby the computational complexity associated with tomographic methods is significantly reduced. Using the first-order Born approximation for weakly scattering objects, it is possible to obtain estimates of the scattering density from the exit waves.

©2009 Optical Society of America

1. Introduction

Most biological objects are optically transparent, yielding low contrast images in brightfield microscopy. With the invention of phase-contrast microscopy [1] and later the introduction of differential interference contrast [2], powerful methods became available to turn phase information into high contrast intensity variations. This led to a revolution in biological imaging, as the stunning contrast gain enabled the observation of objects that were nearly invisible before.

The second revolution in microscopy was initiated with the discovery of the Green Fluorescent Protein (GFP) [3] and its application as a fluorescent marker. The ability to specifically express fluorescent proteins as part of a molecule or molecular assembly has turned fluorescence microscopy into the method of choice in cell biology studies.

Nevertheless, non-fluorescent imaging modes are regaining attention, as they offer high signal levels, none of the phototoxic effects associated with staining/labelling, and essentially unlimited observation time [4, 5].

With the introduction of phase stepping interferometry [6, 7] and digital holographic methods [8-10] to microscopy, it became possible to measure the phase shift introduced by a specimen quantitatively, opening new possibilities in metrology and cell imaging [11-14].

Although these techniques can capture the three-dimensional topography of a surface or the phase shifts introduced by a cell in the axial direction, they are not truly three-dimensional imaging techniques in the sense of providing access to volumetric sample information [15]. Additionally, holographic and interferometric approaches typically employ axial, coherent illumination, which also limits their lateral resolving power.

For true 3D imaging of the local scattering density, multiple illumination directions are necessary, which has been attempted in several tomographic studies in optical microscopy [16-21]. Interestingly, fundamentally different assumptions about the image formation process were made. In several tomographic studies [16, 17], using specimen rotation or beam scanning, it was assumed that the light propagates straight through the specimen and integrates the phase information. Consequently, the Fourier slice theorem was applied for reconstruction, as is well established in x-ray tomography.

In recent work, the framework of weakly scattering objects, as outlined by Born and Wolf [22], was pursued [18-21]. Under the first-order Born approximation, the object information for one illumination direction lies on a sphere rather than on a slice. As the first-order Born approximation takes diffraction into account, this model is physically more sound for optical microscopy. Unfortunately, the complexity of the data post-processing is significantly increased, and a large number (~1000) of different illumination directions is necessary to properly sample the object information in reciprocal space [18].

In this paper we present an alternative approach to diffraction tomography where we incorporate the coherent 3D image formation in a light microscope. The object is not illuminated sequentially from different directions but with an azimuthally rotating laser beam. Upon one rotation, the information of multiple projections is incoherently superimposed on an image detector. The phase and amplitude information is decoded using a common path interferometer and phase stepping. For three dimensional imaging, a stack of images is acquired using z-stepping, while for each focal position, phase stepping is performed.

We analyze the image formation and the resulting transfer functions for phase and amplitude in the context of the first-order Born approximation. Owing to the three-dimensional image formation and the design of the common path interferometer, the phase and amplitude information is automatically placed at the proper position in three-dimensional frequency space. This results in simple and robust data processing, albeit at the price of a damped transfer function.

We present experimental data that is in good agreement with our theory and demonstrate the validity of the first order Born approximation for weakly scattering objects.

2. Theory

The first-order Born approximation linearly relates the Fourier components of an object (more precisely the Fourier components of its scattering potential) to the complex amplitude of the scattered wave in the far field in the following way:

$$\mathbf{K} = \mathbf{k} - \mathbf{k}_i \tag{1}$$

where K is the spatial frequency of the object, k is the wave vector of the diffracted wave, and ki is the wave vector of the illumination wave. By geometrical construction one can show that all observable object information lies on a sphere centered at −ki in reciprocal space [22]. In X-ray diffraction this sphere is commonly called the Ewald sphere. Wolf first proposed that varying illumination directions could be used to reconstruct all Fourier components of an object within a spherical volume, a technique known as diffraction tomography [23].
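To make Eq. (1) concrete, the following minimal sketch (our own illustration, not the authors' code; reduced wave vectors of magnitude 1/λ are assumed) computes the object frequency probed by a given illumination/detection pair:

```python
import numpy as np

def sampled_frequency(illum, det, lam=532e-9):
    """Object frequency K = k - k_i probed for given illumination and
    detection directions; |k| = |k_i| = 1/lam (reduced wave vectors)."""
    k_i = np.asarray(illum, float) / np.linalg.norm(illum) / lam
    k = np.asarray(det, float) / np.linalg.norm(det) / lam
    return k - k_i  # lies on a sphere of radius 1/lam centered at -k_i

# Forward scattering probes K = 0; a detection direction tilted 30 degrees
# off an axial illumination probes a nonzero lateral/axial frequency:
print(sampled_frequency([0, 0, 1], [0, 0, 1]))
print(sampled_frequency([0, 0, 1], [np.sin(np.pi/6), 0, np.cos(np.pi/6)]))
```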

To understand how the far field of the scattered wave can be measured we will discuss the three-dimensional image formation for coherent illumination in a microscope (3D image acquisition is discussed in more detail in Appendix A). In the case of monochromatic plane wave illumination, all object information that can be transported with the electromagnetic field lies on a spherical shell in reciprocal space [24]. Using an objective with a limited solid acceptance angle, only a subset of this information lying on a spherical cap can be captured. Upon image formation, the intensity that is measured on a detector in the image plane contains the following terms:

$$I = |R + O|^2 = RR^* + OO^* + RO^* + R^*O, \qquad R = r_0\, e^{i(\varphi + \mathbf{k}_0 \cdot \mathbf{x})} \tag{2}$$

where * denotes complex conjugation and R is the complex amplitude of the zero-order light (which we will later use as the reference wave), a plane wave travelling in the direction of the wave vector k0. O is the complex amplitude of the object wave (light scattered by the object). In Fourier space, Eq. (2) translates into:

$$\tilde{I} = \tilde{R} \otimes \tilde{R}^* + \tilde{O} \otimes \tilde{O}^* + \tilde{R} \otimes \tilde{O}^* + \tilde{R}^* \otimes \tilde{O}, \qquad \tilde{R} = r_0\, \delta(\mathbf{k} - \mathbf{k}_0)\, e^{i\varphi} \tag{3}$$

where ⊗ denotes the convolution operation. If we assume that the magnitude of Õ is much smaller than that of R̃, the autocorrelation term Õ⊗Õ* can be neglected. The remaining transferred information in reciprocal space then consists mainly of two shells, one containing the complex amplitude information (R̃*⊗Õ) of the object, the other the complex conjugated information (R̃⊗Õ*). It is important to note that the shell Õ is centered at −k0 as a result of the convolution with R̃* (and vice versa for Õ*).
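A toy one-dimensional illustration of this decomposition (our construction; the carrier frequency and envelope width are arbitrary choices): the cross terms demodulate the weak object band and its mirrored conjugate, while the object autocorrelation stays negligible.

```python
import numpy as np

N = 1024
x = np.arange(N)
f0 = 50 / N                                       # reference carrier frequency
R = 10.0 * np.exp(2j * np.pi * f0 * x)            # strong zero-order wave
env = np.exp(-((x - N / 2) / 200.0) ** 2)         # slowly varying envelope
O = 0.1 * env * np.exp(2j * np.pi * (f0 + 80 / N) * x)  # weak object band
I = np.abs(R + O) ** 2                            # detected intensity (real)
spec = np.abs(np.fft.fft(I))
top = sorted(np.argsort(spec)[-3:])
print(top)  # [0, 80, 944]: DC term plus the object band and its mirror image
```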

In Fig. 1(a), the case of plane wave illumination along the optical axis is depicted. The solid line depicts the shell containing the desired object information (Õ) and the dashed line represents the conjugated and point-mirrored information (Õ*), which arises from the conjugation in real space.


Fig. 1 Information transfer upon imaging a weakly scattering object: (a) Using coherent, axial illumination. (b) Using two oblique incoherent illumination directions. The solid line illustrates the transferred object information, the dotted line represents the corresponding complex conjugated information. (c) Sampling the reciprocal space using illumination directions with fixed incident angle but varying azimuth.


If one uses oblique plane wave illumination, information is still transferred on a spherical shell by the microscope, but it occupies a different region of reciprocal space. Through the interference of the zero-order light with the object wave (R̃*⊗Õ), the shell is automatically translated to the correct position in reciprocal space, as required by Eq. (1).

Assuming that we are able to separate O from the other terms in Eq. (2), this suggests that the three-dimensional image formation of R*O performs the correct mapping of the object’s Fourier components within the accuracy of the first order Born approximation.

If one applies sequential illumination over all possible illumination directions (a sum in real space corresponds to a sum in Fourier space), one can measure the Fourier components in a finite volume that has the same support as the well-known incoherent widefield optical transfer function (OTF) in fluorescence microscopy [21]. However, as the shells are infinitesimally thin for monochromatic light, a large number of different illumination directions would be needed to properly sample this volume in reciprocal space.

Alternatively, one can extend the transfer function using multiple illumination directions at the same time. If the illuminating plane waves are mutually incoherent, multiple shells can be observed simultaneously and are added up incoherently. In Fig. 1(b) two shells are added in reciprocal space using two illumination directions along the wave vectors k1 and k2. The solid lines depict the shells containing the desired object information and the dashed lines represent the conjugated information.

One extended implementation of this idea is incoherent ring illumination in the back focal plane (BFP) of the condenser, resulting in a hollow cone illumination, as it is commonly implemented in phase contrast microscopy. Noda et al. described the image transfer using such an illumination scheme for weakly scattering objects [25]. In this case the summation of shells creates a transfer function that fills a finite volume in reciprocal space.

As an alternative to the ring illumination, one can spin a focused spot of a coherent light source on a circular orbit in the BFP of the condenser. When the time for one orbit is equal to (or much smaller than) the integration time of the detector, the corresponding shells are incoherently added in reciprocal space, as schematically illustrated in Fig. 1(c). The idea of using rotating laser illumination was first proposed by Ellis for phase contrast microscopy [26] and was later employed to achieve even illumination in TIRF microscopy [27-29].

3. Experimental set-up

In Fig. 2, the optical train of our set-up is depicted. Laser light of 532 nm wavelength is scrambled to lower its spatial coherence by coupling it into a multimode fiber that is shaken by a loudspeaker (square-wave drive at 800 Hz). After the fiber, the laser beam is randomly polarized and no longer single-mode. A pinhole is used to re-collimate the laser beam. A two-axis piezo tip/tilt mirror (S334, Physik Instrumente, Germany) is used to scan the laser beam in the BFP of the condenser objective (63× NA 1.2 Zeiss Apochromat). Consequently, a collimated laser beam emerges from the condenser objective.


Fig. 2 Optical train for rotating illumination and inline interferometry.


Azimuthal rotation of the laser beam is achieved by spinning the focused spot on a circular trajectory in the BFP at a frequency of 15 Hz. A second objective (63× NA 1.2 Zeiss Apochromat) is used for imaging the specimen located between the two objectives.

The transmitted wave, consisting of an object wave and the undeviated zero-order wave, is linearly polarized by a Glan-Thompson polarizer and partially transmitted through a nonpolarizing beam splitter, both located in an infinity-corrected beam path. The wavefronts are subsequently focused onto a phase-only spatial light modulator (Heo 1080 P, Holoeye, Berlin) located in a Fourier plane. The spatial light modulator (SLM) is used to phase-shift the zero-order component of the transmitted wave. To this end, four annular masks, adjusted to the trajectory of the zero-order laser spot and carrying the required phase offsets of 0, π/2, π and 3π/2, are sequentially displayed on the SLM. The phase offsets were calibrated using a home-built interferometer.

The object wave and the phase-modulated zero-order light are back-reflected by the SLM and subsequently partially reflected by the beam splitter onto a cooled CCD detector (ORCA ERG, Hamamatsu, Japan), where they form an interference image. The integration time of the camera was carefully adjusted to the rotation speed of the laser beam so that an integer number of rotations occurs per integration time. For optical sectioning, both objectives are moved synchronously using two Z-drives (actuated by DC motors with optoelectronic encoding).

The Z-drives of the objectives, the piezo tip/tilt mirror, the SLM and the camera were all controlled by a home-made LabVIEW program.

4. Phase stepping interferometry

To recover the complex amplitude of the object wave, we use phase stepping interferometry. The complex amplitude can be computed (see Appendix B) from a series of intensity images I1…I4, in which the phase φi of the reference wave is varied:

$$O\, r_0 = \frac{1}{4}\left(e^{i\varphi_1} I_1 + e^{i\varphi_2} I_2 + e^{i\varphi_3} I_3 + e^{i\varphi_4} I_4\right)^{*}, \qquad \varphi_1 = 0;\ \varphi_2 = \pi/2;\ \varphi_3 = \pi;\ \varphi_4 = 3\pi/2 \tag{4}$$
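A minimal numpy sketch of Eq. (4) (our own illustration; array and function names are assumptions), together with a self-test that synthesizes the four frames from a known weak object under the sign convention of Appendix B:

```python
import numpy as np

PHIS = (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)   # reference phase steps, Eq. (4)

def recover_object_wave(frames):
    """Apply Eq. (4) to four phase-stepped intensity images."""
    s = sum(np.exp(1j * p) * I for p, I in zip(PHIS, frames))
    return np.conj(s) / 4.0                      # equals r0 * O

# Self-test with a synthetic weak object and a unit-amplitude reference
# R_i = exp(-i * phi_i), following Eq. (5) of Appendix B:
rng = np.random.default_rng(0)
O = 0.05 * (rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64)))
frames = [np.abs(np.exp(-1j * p) + O) ** 2 for p in PHIS]
assert np.allclose(recover_object_wave(frames), O)
```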

It is important to note that the recovered amplitude of the object wave is scaled by the (real-valued) amplitude strength of the reference wave and that the initial reference phase is not necessarily zero. If the initial phase φ1 is nonzero, this results in a global phase offset in the phase measurement. The unknown global phase can be removed by defining a zero reference phase for a flat region in the reconstructed background.
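A one-line sketch of this anchoring step (the mask marking a flat background region is the caller's choice; both names are hypothetical):

```python
import numpy as np

def anchor_global_phase(obj_wave, background_mask):
    """Subtract the median phase of a user-chosen flat background region."""
    phase = np.angle(obj_wave)
    return phase - np.median(phase[background_mask])
```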

In reciprocal space, the phase stepping procedure removes all information except for the Õ shells corresponding to the complex amplitude of the object wave (see Appendix B).

In the case of rotating illumination, the intensity on the detector is the sum of all interference terms corresponding to the different illumination directions. When the summation is incoherent and the same phase steps are applied for all directions, the outlined phase stepping scheme also recovers the sum of the object-wave amplitudes associated with these different directions.

5. Simulation of the amplitude transfer function

Using the rotating illumination, information within a finite volume in reciprocal space becomes observable. In analogy to the OTF in fluorescence microscopy, we describe this finite volume with the amplitude transfer function (ATF). The ATF describes the weighting of phase and amplitude information in reciprocal space. The region where the ATF is nonzero defines the support of the ATF.

We simulated the ATF for rotating illumination under different incident angles for our set-up, using an NA = 1.2 objective and a wavelength of 532 nm. For simplicity, we assumed an ideal uniform pupil function. We computed the convolution of a spherical cap (solid angle defined by the NA) with the locations of all reference waves (a ring) in reciprocal space. To achieve a smooth approximation, we employed a fine discretization of reciprocal space (sampled well above the requirements set by the Nyquist criterion), resulting in a computational domain of 512 × 512 × 512 voxels. In Fig. 3, kz-kx cross-sections of ATFs corresponding to rotating illumination with incident angles of 15° to 64° are represented on a logarithmic scale. The ATF is always rotationally symmetric around the kz axis but is asymmetric with respect to the lateral plane. Owing to the limited acceptance angle of the objective, all ATFs feature a missing cone region, which is unavoidable in a coherent transmission imaging system.


Fig. 3 Simulated amplitude transfer functions (ATF) for rotating illumination under different incident angles. The ATF is rotationally symmetric around the kz axis.

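The following sketch reproduces the described construction at a reduced grid size (our code, not the authors'; the voxel pitch and the immersion index n = 1.33 are assumptions, as they are not stated in the text):

```python
import numpy as np

N, lam, NA, n = 128, 0.532, 1.2, 1.33            # grid, wavelength (um), NA, n
theta = np.deg2rad(30.0)                          # incident angle of the ring
k = np.fft.fftshift(np.fft.fftfreq(N, d=0.05))    # frequency axis (1/um)
dk = k[1] - k[0]
KX, KY, KZ = np.meshgrid(k, k, k, indexing="ij")
k0 = n / lam                                      # wavenumber in the medium
# Detection cap: Ewald shell |K| = k0, restricted to the NA acceptance cone.
shell = np.abs(np.sqrt(KX**2 + KY**2 + KZ**2) - k0) < dk
cap = shell & (KZ > k0 * np.sqrt(1.0 - (NA / n) ** 2))
# Ring of reference-wave locations -k_i over all azimuths at fixed incidence:
ring = (np.abs(np.hypot(KX, KY) - k0 * np.sin(theta)) < dk) \
     & (np.abs(KZ + k0 * np.cos(theta)) < dk)
# ATF = cap convolved with ring, computed as a circular FFT convolution:
atf = np.fft.fftshift(np.real(np.fft.ifftn(
    np.fft.fftn(np.fft.ifftshift(cap.astype(float))) *
    np.fft.fftn(np.fft.ifftshift(ring.astype(float))))))
print(atf.max(), (atf > 0.5).sum())   # nonzero support fills a finite volume
```

A kz-kx cross-section of `atf` on a logarithmic scale should reproduce the qualitative shape of Fig. 3, including the missing cone.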

Increasing the incident angle of the illumination extends the lateral support of the ATF towards the well-known Abbe diffraction limit (maximum transferable spatial frequency equal to 2NA/λ); however, the support becomes less uniform and more asymmetric.
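For the parameters used here (NA = 1.2, λ = 532 nm), this limit evaluates to (our arithmetic):

$$f_{\max} = \frac{2\,\mathrm{NA}}{\lambda} = \frac{2 \times 1.2}{532\ \mathrm{nm}} \approx 4.5\ \mu\mathrm{m}^{-1}, \qquad d_{\min} = \frac{\lambda}{2\,\mathrm{NA}} \approx 222\ \mathrm{nm}.$$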

Further analysis of the ATF (data not shown) revealed that the lateral resolving power increases with the incident angle but begins to stagnate for very high incident angles, as the signal strength within the ATF is reduced. Based on these findings, we chose an incident angle of 30° for our first experiments, as we estimated an adequate signal strength within the ATF.

6. Experimental results

We used human cheek cells, sandwiched between two cover slips, to validate our theory. An individual 3D data stack was acquired for each phase step, using axial illumination and azimuthally rotating illumination under an incident angle of 30°. A z-step of 200 nm was chosen to satisfy the Nyquist criterion in the z-direction. Figure 4 displays a cross-section of the 3D Fourier transform of a raw data stack using axial (a) and rotating illumination (b). As predicted by the theory, two shells touching at the origin of reciprocal space are clearly visible in Fig. 4(a). In Fig. 4(b), additional transferred information can be detected.


Fig. 4 Fourier transforms (kx-kz cross-sections) of a human cheek cell data set: (a) Raw data for axial illumination. (b) Raw data for rotating illumination. (c) Recovered object information for axial illumination after applying the phase stepping algorithm. (d) Recovered object information for rotating illumination after applying the phase stepping algorithm and (e) after rotational averaging around the kz axis.


The phase stepping algorithm was applied to the four data stacks in real space and the resulting data stack containing the recovered amplitude of the object wave was Fourier transformed.

In Fig. 4(c), the Fourier transform of the object wave using axial illumination is shown. The phase stepping routine recovered the shell corresponding to the object wave alone; the complex conjugated shell was removed properly. In Fig. 4(d), the recovered object information for rotating illumination is shown; in Fig. 4(e), the magnitude of this data was rotationally averaged around the kz-axis. The recovered information lies within the region predicted by the simulated ATF; however, towards the outside of the ATF, the signal strength is weaker than predicted. Comparing Figs. 4(c) and 4(e), it is evident that information within a finite volume is transferred using rotating illumination, whereas the information for purely axial illumination lies on a shell only.

To illustrate the effect of the enlarged transfer function in real space, we imaged 100 nm silica microspheres using axial illumination and rotating illumination under a 30° incident angle. In Figs. 5(a) and 5(b), the recovered phase information for the axial and the rotating illumination is shown. In Fig. 5(a), a strong non-uniform phase background is present, whereas in Fig. 5(b) the background is very even. This is expected, as the axial illumination scheme has virtually no z-resolving power, so out-of-focus objects strongly contribute to the final image. In contrast, the rotating illumination, owing to the finite volume of its ATF support, has a certain axial resolution; objects that are completely out of focus are thereby averaged out. The small insets in Figs. 5(a) and 5(b) also illustrate the enhanced lateral resolution of the rotating illumination.


Fig. 5 Recovered phase image of groups of 100nm silica microspheres under (a) axial illumination and (b) under rotating illumination.


In Fig. 6, a bright field image, a phase contrast image, and the phase channel under rotating illumination (30° incident angle) of a human cheek cell are compared. The bright field image in Fig. 6(a) and the phase contrast image in Fig. 6(b) were extracted from the phase stepping data. The bright field image suffers from low contrast, whereas in phase contrast fine details are clearly visible. The recovered phase information shown in Fig. 6(c) offers the same level of detail, but absorption and phase information are now separated. Therefore the large-scale spatial intensity variations present in the bright field and phase contrast images do not affect the phase measurement, illustrating the decoupling of the information channels.


Fig. 6 Human cheek cell as imaged in brightfield microscopy (a) and phase contrast microscopy (b). (c) Recovered phase information using rotating illumination. (d) Recovered phase information in an x-z plane. The image data is equally sampled in the x and z direction. (e) Magnified version of the boxed area in (d).


In Fig. 6(d), an x-z cross-section along the dashed line in Fig. 6(c) is shown. To increase the contrast, the phase map was gamma corrected by 10%. Several isolated scattering centers can be distinguished at different heights. Out-of-focus blur is observable, confined to funnel-shaped regions. Figure 6(e) displays a magnified version of the boxed area in Fig. 6(d).

7. Discussion

We have presented a new, simplified approach to diffraction tomography in optical microscopy. Our technique enlarges the transfer function in reciprocal space by incoherently adding information from multiple illumination directions. This reduces the amount of image data that has to be acquired, at the price of a damped transfer function. However, we believe this is not a specific disadvantage of our technique, since it is inherent to the Fourier diffraction theorem. Also, in the classical approach, where information is sampled sequentially for different illumination directions, some regions in reciprocal space are highly sampled (e.g. close to the origin), giving rise to a better-quality signal, whereas other regions are sampled very poorly. This effect is evident in Fig. 1(c).

A ring-like illumination source was chosen as it provides rotational symmetry in the lateral plane. However, the corresponding ATF is asymmetric in a kx-kz plane. Using our approach, other illumination patterns could also be implemented: sweeping the laser spot along radial lines in the BFP would result in transfer functions that are symmetric with respect to the kx-ky plane but anisotropic in the lateral plane. To increase the lateral symmetry, one could fuse data sets from radial line sweeps azimuthally rotated by 120° to synthesize a more symmetric transfer function.

Phase stepping interferometry was chosen because it can be realized in a common-path geometry that offers high phase stability. Potential phase errors arising from vibrations, the scanning process, and thermal expansion during data acquisition are thereby greatly reduced compared to a system with an external reference arm. However, the annular phase mask is not ideal: the mask should act only on the zero-order light. As the diffracted light generally fills the whole aperture of the objective, a small part of it will be intercepted by the mask, causing halo artefacts around objects [1].

To address this shortcoming, off-axis holography using a co-rotating reference wave could be used to decode the object's amplitude. Recently, a microscopy system for off-axis holography using a common-path geometry was presented, however it was implemented only for a fixed axial illumination direction [30].

The optical train of our system will introduce phase errors due to non-flat surfaces and aberrations caused by the various lenses. It is important to note that, for our rotating illumination scheme, these errors can be compensated with a background measurement, if necessary.

A fundamental problem in beam scanning optical tomography is the set of missing projections due to the finite solid angle of an objective, leading to the well-known missing cone problem inherent to widefield microscopy. As the available object information in the kz-direction is very limited, one cannot map the refractive index directly; instead, a certain optical path length is measured. Furthermore, the missing cone region introduces out-of-focus blur that can obscure the in-focus image.

Choi et al. used iterative deconvolution algorithms to fill the missing cone in order to obtain refractive index maps in beam scanning optical tomography [17]. However, this information is not physically transmitted, and prior knowledge about the object is necessary. Currently, the only way to fill the missing cone is sample rotation [16]; until now, however, this has been done experimentally only in the context of the Fourier slice theorem. Using the first-order Born approximation, a rotating tomographic scheme without beam scanning results in another missing cone region along the axis of rotation [31]. Ultimately, optical tomography using the first-order Born approximation has to combine beam scanning and sample rotation to fill an isotropic volume in reciprocal space.

8. Outlook

Phase microscopy is regaining popularity with the advent of holographic and interferometric measurement schemes. Different approaches are being followed, and quite elaborate systems have been proposed and recently realized. Our aim was to present an alternative, simplified approach that may help to evaluate the different assumptions that have been made about the object's scattering behaviour.

Appendix A: 3D image acquisition

As we perform image acquisition slice-by-slice in real space, it is necessary to analyze what this slice-by-slice acquisition means in Fourier space. A raw 2D image in real space captures the incoherent sum of plane-wave holograms differing in illumination angle. Of the 3D amplitude distribution (in sample coordinate space) of a single such constituent hologram, only a single in-focus plane is captured on the CCD camera. The processing that allows us to extract amplitudes is linear and therefore commutes with the Fourier transformation. With the real-space coordinate system centered at the plane of focus, an extracted 2D real-space scattering amplitude can thus be described in Fourier space as a sum projection of the 3D scattering amplitude along kz. Switching to a reference coordinate system fixed to the sample, each newly acquired 2D real-space plane contributes another z-projection in Fourier space, but multiplied by an extra factor exp(i kz Δz) that accounts for the shift in coordinate space. If we now want to reconstruct a 3D amplitude scattering distribution from a series of slices, we can perform slice-wise Fourier transformations followed by an additional Fourier transformation along z to obtain the 3D distribution in reciprocal space. Of course this corresponds to a 3D Fourier transformation and confirms that our slice-wise image acquisition and object extraction simply acquires the 3D real-space distribution of scatterers. Note that a corresponding description is possible for any kind of 3D microscopy (e.g. fluorescent widefield or confocal laser scanning). For this procedure to work, the object scatter must be describable by the first Born approximation, and Nyquist sampling along x, y and z has to be obeyed so that the finally achievable transfer function is not aliased.
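This equivalence is easy to verify numerically (a synthetic check, our code): slice-wise 2D transforms followed by a 1D transform along z reproduce the full 3D transform exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
stack = rng.standard_normal((16, 32, 32))            # (z, y, x) stack of slices
slicewise = np.fft.fft(np.fft.fft2(stack), axis=0)   # 2D per slice, then along z
assert np.allclose(slicewise, np.fft.fftn(stack))    # equals the 3D transform
```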

Appendix B: Derivation of the phase stepping algorithm

Four images with shifted reference phase are acquired, resulting in the following intensities on the detector:

$$I_1 = |R_1 + O|^2 = |R_1|^2 + |O|^2 + R_1 O^* + O R_1^*;\quad I_2 = |R_2 + O|^2;\quad I_3 = |R_3 + O|^2;\quad I_4 = |R_4 + O|^2;\qquad R_i = r_0\, e^{-i\varphi_i} \tag{5}$$

In order to recover the complex amplitude of the object wave O, we weight each image with the phase factor corresponding to its phase step and sum the data sets:

$$I_1 + e^{i\varphi_2} I_2 + e^{i\varphi_3} I_3 + e^{i\varphi_4} I_4 \tag{6}$$

where we have assumed φ1 = 0.

This expression can be expanded as follows:

$$4 r_0 O^* + \underline{(O\, r_0)\left(1 + e^{2i\varphi_2} + e^{2i\varphi_3} + e^{2i\varphi_4}\right)} + \underline{\left(|r_0|^2 + |O|^2\right)\left(1 + e^{i\varphi_2} + e^{i\varphi_3} + e^{i\varphi_4}\right)} \tag{7}$$

The underlined sums equal zero using the following phase shifts:

$$\varphi_2 = \pi/2;\quad \varphi_3 = \pi;\quad \varphi_4 = 3\pi/2 \tag{8}$$

Thereby the conjugated object wave, scaled by four times the (real-valued) amplitude strength of the reference wave, can be isolated from the other terms. A similar scheme can be derived for three or for more than four phase steps; the phase steps must always evenly sample the interval of 2π.
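A quick numerical check (ours) that the two underlined sums in Eq. (7) vanish whenever M ≥ 3 phase steps evenly sample 2π:

```python
import numpy as np

for M in (3, 4, 5, 8):                    # M = 4 is the scheme used here
    phis = 2.0 * np.pi * np.arange(M) / M
    assert abs(np.exp(1j * phis).sum()) < 1e-9   # first-order sum vanishes
    assert abs(np.exp(2j * phis).sum()) < 1e-9   # second-order sum vanishes
```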

References and links

1. F. Zernike, “How I discovered phase contrast,” Science 121(3141), 345–349 (1955).

2. R. D. Allen, G. B. David, and G. Nomarski, “The Zeiss-Nomarski differential interference equipment for transmitted-light microscopy,” Z. Wiss. Mikrosk. 69(4), 193–221 (1969).

3. O. Shimomura, “The discovery of aequorin and green fluorescent protein,” J. Microsc. 217(1), 3–15 (2005).

4. L. Liu, J. R. Trimarchi, R. Oldenbourg, and D. L. Keefe, “Increased birefringence in the meiotic spindle provides a new marker for the onset of activation in living oocytes,” Biol. Reprod. 63(1), 251–258 (2000).

5. M. Shribak and S. Inoué, “Orientation-independent differential interference contrast microscopy,” Appl. Opt. 45(3), 460–469 (2006).

6. N. Lue, W. Choi, G. Popescu, T. Ikeda, R. R. Dasari, K. Badizadegan, and M. S. Feld, “Quantitative phase imaging of live cells using fast Fourier phase microscopy,” Appl. Opt. 46(10), 1836–1842 (2007).

7. G. Popescu, L. P. Deflores, J. C. Vaughan, K. Badizadegan, H. Iwai, R. R. Dasari, and M. S. Feld, “Fourier phase microscopy for investigation of biological structures and dynamics,” Opt. Lett. 29(21), 2503–2505 (2004).

8. E. Cuche, F. Bevilacqua, and C. Depeursinge, “Digital holography for quantitative phase-contrast imaging,” Opt. Lett. 24(5), 291–293 (1999).

9. T. Ikeda, G. Popescu, R. R. Dasari, and M. S. Feld, “Hilbert phase microscopy for investigating fast dynamics in transparent systems,” Opt. Lett. 30(10), 1165–1167 (2005).

10. U. Schnars and W. P. Juptner, “Direct recording of holograms by a CCD target and numerical reconstruction,” Appl. Opt. 33(2), 179–181 (1994).

11. F. Charrière, J. Kühn, T. Colomb, F. Montfort, E. Cuche, Y. Emery, K. Weible, P. Marquet, and C. Depeursinge, “Characterization of microlenses by digital holographic microscopy,” Appl. Opt. 45(5), 829–835 (2006).

12. H. Ding, F. Nguyen, S. A. Boppart, and G. Popescu, “Optical properties of tissues quantified by Fourier-transform light scattering,” Opt. Lett. 34(9), 1372–1374 (2009).

13. H. Ding, Z. Wang, F. Nguyen, S. A. Boppart, and G. Popescu, “Fourier transform light scattering of inhomogeneous and dynamic structures,” Phys. Rev. Lett. 101(23), 238102 (2008).

14. B. Kemper, D. Carl, J. Schnekenburger, I. Bredebusch, M. Schafer, W. Domschke, and G. von Bally, “Investigation of living pancreas tumor cells by digital holographic microscopy,” J. Biomed. Opt. 11(3), 034005–034008 (2006).

15. S. S. Kou and C. J. Sheppard, “Imaging in digital holographic microscopy,” Opt. Express 15(21), 13640–13648 (2007).

16. F. Charrière, A. Marian, F. Montfort, J. Kuehn, T. Colomb, E. Cuche, P. Marquet, and C. Depeursinge, “Cell refractive index tomography by digital holographic microscopy,” Opt. Lett. 31(2), 178–180 (2006).

17. W. Choi, C. Fang-Yen, K. Badizadegan, S. Oh, N. Lue, R. R. Dasari, and M. S. Feld, “Tomographic phase microscopy,” Nat. Methods 4(9), 717–719 (2007).

18. M. Debailleul, B. Simon, V. Georges, O. Haeberlé, and V. Lauer, “Holographic microscopy and diffractive microtomography of transparent samples,” Meas. Sci. Technol. 19(7), 074009 (2008).

19. N. Fukutake and T. D. Milster, “Proposal of three-dimensional phase contrast holographic microscopy,” Opt. Express 15(20), 12662–12679 (2007).

20. Y. Sung, W. Choi, C. Fang-Yen, K. Badizadegan, R. R. Dasari, and M. S. Feld, “Optical diffraction tomography for high resolution live cell imaging,” Opt. Express 17(1), 266–277 (2009).

21. V. Lauer, “New approach to optical diffraction tomography yielding a vector equation of diffraction tomography and a novel tomographic microscope,” J. Microsc. 205(2), 165–176 (2002).

22. M. Born and E. Wolf, Principles of Optics, 7th ed. (Cambridge University Press, 2005).

23. E. Wolf, “Three-dimensional structure determination of semi-transparent objects from holographic data,” Opt. Commun. 1(4), 153–156 (1969).

24. M. G. Gustafsson, D. A. Agard, and J. W. Sedat, “Sevenfold improvement of axial resolution in 3D wide-field microscopy using two objective lenses,” Proc. SPIE 2412, 147–156 (1995).

25. T. Noda, S. Kawata, and S. Minami, “Three-dimensional phase contrast imaging by an annular illumination microscope,” Appl. Opt. 29(26), 3810–3815 (1990).

26. G. W. Ellis, “An annular scan phase-contrast scanned aperture microscope (ASPSAM),” Cell Motil. Cytoskeleton 10, 342 (1988).

27. A. L. Mattheyses and D. Axelrod, “Effective elimination of laser interference fringing in fluorescence microscopy by spinning azimuthal incidence angle,” Biophys. J. 88, 341A–342A (2005).

28. R. Fiolka, Y. Belyaev, H. Ewers, and A. Stemmer, “Even illumination in total internal reflection fluorescence microscopy using laser light,” Microsc. Res. Tech. 71(1), 45–50 (2008).

29. M. van ’t Hoff, V. de Sars, and M. Oheim, “A programmable light engine for quantitative single molecule TIRF and HILO imaging,” Opt. Express 16(22), 18495–18504 (2008).

30. G. Popescu, T. Ikeda, R. R. Dasari, and M. S. Feld, “Diffraction phase microscopy for quantifying cell structure and dynamics,” Opt. Lett. 31(6), 775–777 (2006).

31. S. Vertu, J.-J. Delaunay, I. Yamada, and O. Haeberlé, “Diffraction microtomography with sample rotation: influence of a missing apple core in the recorded frequency space,” Cent. Eur. J. Phys. 7(1), 22–31 (2009).


