
Phase from chromatic aberrations

Open Access

Abstract

We show that phase objects may be computed accurately from a single color image in a brightfield microscope, with no hardware modification. Our technique uses the chromatic aberration that is inherent to every lens-based imaging system as a phase contrast mechanism. This leads to a simple and inexpensive way of achieving single-shot quantitative phase recovery by a modified Transport of Intensity Equation (TIE) solution, allowing real-time phase imaging in a traditional microscope.

©2010 Optical Society of America

1. Introduction

Light is a wave, having both an amplitude and a phase; however, optical phase oscillates too rapidly to be detected directly. Thus, phase differences must be computed from intensity measurements. Phase changes carry important information about an object’s shape and refractive index and are important in surface profiling, adaptive optics and particularly biomedical visualization, where samples are often transparent. Phase objects have been observed since microscopists in the early 1900s recognized that defocusing an image of a transparent object, such as a cell, renders visible previously invisible structural variations [1]. Indeed, in-focus imaging systems (with no aberrations) have a purely real transfer function and thus no phase contrast. Defocus introduces an imaginary component [2], as do most optical aberrations [3], converting some phase information into intensity variations. However, the phase information obtained through such means is neither in-focus nor quantitative, yielding only qualitative descriptions. Phase contrast microscopy [4] solved the problem of providing in-focus phase contrast, but in the resulting images, phase still cannot be separated from absorption and so the method is not quantitative. The invention of the laser enabled many of today’s traditional coherent interference techniques for measuring phase quantitatively with high accuracy and at fast speeds [5–7]. However, interferometric systems are often bulky, suffer unwrapping problems, and are resolution-limited by the coherent diffraction limit. For applications with partially coherent illumination, Shack-Hartmann sensors [8] are popular where spatial resolution is not crucial, and a recent method allows higher resolution [9]. Here, we propose a way to use the chromatic dispersion of an imaging system as a mechanism for obtaining phase directly [10].

When white light interacts with a dielectric medium, different wavelengths propagate at different speeds, giving rise to chromatic dispersion [11]. For years, scientists have worked to reduce the resulting ‘chromatic aberration’ in imaging systems by using long focal lengths, reflective optics or achromatic lenses [12].

Our technique is a variation of Transport of Intensity equation (TIE) imaging [13], which is able to produce accurate quantitative phase reconstructions with partially coherent light [14], right out to the diffraction limit [15] and without the need for unwrapping. The TIE states that the derivative of intensity with respect to the optical axis, z, is related to the phase. In the partially coherent case, the term “phase” refers to a ‘generalized phase’ [14], which is the average phase delay incurred by the spectrally-weighted mean wavelength and direction, in the case of a uniform, circular source [16], such as that of a brightfield microscope. By taking two images at different z positions, the generalized phase of a partially coherent incident beam may be recovered [14]. TIE imaging has been used in many different applications [16–18] and extended to multiple images [19]. Limitations include the requirement for either a beamsplitter with two cameras, or motion between sequential capture of multiple images, slowing down the process. Registration of the images requires careful alignment, as well as ensuring that no tilt is introduced between the images [20].

2. Theory

The TIE is derived from the paraxial wave equation after defocus, which is described by Fresnel propagation [21]. Wavelength λ and distance z are interchangeable in the propagation equation [22] – in fact, defocus is the product of the two ξ = λz. Thus, we find (see Appendix):

$$\frac{\partial I(x,y)}{\partial \xi} = -\frac{\nabla \cdot \left[ I(x,y)\, \nabla \varphi(x,y) \right]}{2\pi}, \tag{1}$$
where I(x,y) is intensity, φ(x,y) is phase, and ∇ is the two-dimensional gradient operator in the lateral dimensions. One implementation of this equation allows a defocused phase to be recovered by taking images with different wavelengths in a single plane of constant z, similar to that proposed for X-ray imaging with quantified dispersion [23,24] and diffraction tomography [25]. Similar to the traditional TIE, strict temporal coherence is not required and λ may be replaced by the spectrally-weighted mean wavelength when φ(x,y) is the generalized phase of a partially coherent beam [14]. Thus, the broad color filters that comprise the RGB channels of any color camera may be used to capture the differentially defocused intensity images simultaneously, allowing single-shot phase imaging free of lateral or tilt registration problems. Furthermore, we describe below a method for in-focus phase imaging with optimal contrast by introducing a wavelength-dependent z. We assume here a thin phase object with negligible object dispersion.

Given a measurement of ∂I/∂ξ, Eq. (1) is solved directly for φ(x,y) using standard Poisson solvers [26]. Here, we choose a Fourier-domain Poisson solver for its speed (order N² log N) and tolerance to boundary condition errors [27]. When amplitude variations or non-uniform illumination are present, obtaining φ(x,y) from Eq. (1) requires two Poisson solutions according to the method of Teague [13]. Since the measurement is related to the second derivative of the phase distribution, steep phase gradients will not suffer aliasing problems as easily as with interferometry, where phase ‘wraps’ every 2π. By using parallel FFT processing on a Graphics Processing Unit (GPU), the phase reconstructions can be computed and displayed in real-time at camera-limited frame rates [28].
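For concreteness, the following minimal sketch inverts Eq. (1) in the Fourier domain for the simplest case of a pure-phase object with near-uniform in-focus intensity I0; for non-uniform illumination, Teague's two-step solution [13] should be used instead. The NumPy interface and the regularization constant are our own assumptions for illustration, not the code used in this work.

```python
import numpy as np

def tie_phase_fft(dI_dxi, I0, dx, reg=1e-12):
    """Invert Eq. (1) for phase, assuming near-uniform in-focus intensity I0,
    so that dI/dxi = -(I0 / 2*pi) * laplacian(phi) and the Laplacian inverts
    algebraically in Fourier space (periodic boundaries implied by the FFT).

    dI_dxi : 2D array, measured derivative of intensity w.r.t. xi = lambda * z
    I0     : scalar (or mean) in-focus intensity
    dx     : pixel pitch; xi must be expressed in the same length unit squared
    reg    : small constant regularizing the division at zero frequency
    """
    ny, nx = dI_dxi.shape
    u = np.fft.fftfreq(nx, d=dx)                 # spatial frequencies [1/length]
    v = np.fft.fftfreq(ny, d=dx)
    uu, vv = np.meshgrid(u, v)                   # shapes match the fft2 output
    freq2 = uu**2 + vv**2
    # Eq. (1) with I ~ I0:  F{phi} = F{dI/dxi} / (2*pi * I0 * (u^2 + v^2))
    Phi = np.fft.fft2(dI_dxi) / (2 * np.pi * I0 * (freq2 + reg))
    Phi[0, 0] = 0.0                              # the mean phase is undetermined
    return np.real(np.fft.ifft2(Phi))
```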

The phase recovered will be that at the camera plane, yielding a defocused result for constant z. We desire an in-focus phase image, where two colors are over and under focused by the same amount, making the result accurate to second order. Below we outline the method for designing such an imaging system with tuneable defocus.

We use either a 4f system (see Fig. 1a) or a brightfield microscope with an infinity-corrected objective lens. Refractive imaging systems usually attempt to correct the effects of chromatic aberration with compound achromatic lenses [12]. Here, we refer to the phenomenon as ‘chromatic defocus’ and explicitly utilize it for obtaining phase information. Chromatic defocus has been used previously for depth sectioning in a microscope [29,30].

Fig. 1 Design of a chromatic 4f system for differential defocus of colour channels. a) Chromatic defocus causes different colors to focus in different planes along the optical axis, b) quantification of chromatic defocus for three values of f₂ given f₁ = 200 mm (at 532 nm) with BK7 lens dispersion.

For a single thin lens, using the lensmaker’s formula [11], the shift in focal length f at wavelength λ from the design wavelength λ0 is

$$\Delta f(\lambda) = \frac{n(\lambda) - n(\lambda_0)}{n(\lambda) - 1}\, f(\lambda_0), \tag{2}$$
where n(λ0) and n(λ) are the lens refractive indices at λ0 and λ, respectively. In a 4f system designed for λ0, the wavelength-dependent axial focal shift is
$$\Delta f'(\lambda) = \Delta f_2(\lambda) + \frac{f_2(\lambda)^2}{\Delta f_1(\lambda) + \Delta f_2(\lambda) - f_1(\lambda)^2/\Delta f_1(\lambda)}, \tag{3}$$
where Δf1(λ) and Δf2(λ) are the focal length shifts of the first and second lenses, respectively, f1(λ) = f1(λ0) − Δf1(λ) and f2(λ) = f2(λ0) − Δf2(λ). The total wavelength-dependent defocus in the 4f system is thus ξ(λ) = λΔf′(λ) and is plotted in Fig. 1b. Note that defocus is nearly linear with wavelength, allowing for a well-centred measurement of the intensity derivative, with green in focus and red and blue defocused by equal and opposite amounts; this centered derivative measurement is accurate to second order. Furthermore, we can choose f1(λ0) and f2(λ0) (or the dispersion of the lens material) to tune the slope of this curve such that the optimal defocus exists for a given object. It should be noted that the colors also incur slightly different spherical aberration, which should be minimized by using a field-of-view smaller than the focal length, or specialty gradient index or diffractive lens systems.
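To illustrate the design procedure, the sketch below evaluates Eqs. (2) and (3) and the resulting defocus ξ(λ) = λΔf′(λ) for a BK7 4f system with f1(λ0) = 200 mm and f2(λ0) = 75 mm. The Sellmeier coefficients and the sample wavelengths (rough Bayer channel centres) are standard values we assume here, not parameters taken from Fig. 1b.

```python
import numpy as np

# Standard Schott N-BK7 Sellmeier coefficients (lambda in micrometres); these
# catalogue values are an assumption on our part, not data from the paper.
B = (1.03961212, 0.231792344, 1.01046945)
C = (0.00600069867, 0.0200179144, 103.560653)

def n_bk7(lam_um):
    l2 = lam_um ** 2
    return np.sqrt(1.0 + sum(b * l2 / (l2 - c) for b, c in zip(B, C)))

def chromatic_defocus(lam_um, lam0_um=0.532, f1_0=200.0, f2_0=75.0):
    """Focal shifts from Eq. (2) and net 4f defocus from Eq. (3), in mm."""
    n0, n = n_bk7(lam0_um), n_bk7(lam_um)
    df1 = (n - n0) / (n - 1.0) * f1_0                 # Eq. (2), lens 1
    df2 = (n - n0) / (n - 1.0) * f2_0                 # Eq. (2), lens 2
    f1, f2 = f1_0 - df1, f2_0 - df2
    dfp = df2 + f2**2 / (df1 + df2 - f1**2 / df1)     # Eq. (3)
    xi = (lam_um * 1e-3) * dfp                        # xi(lambda) = lambda * df' [mm^2]
    return dfp, xi

# Approximate Bayer channel centre wavelengths (assumed, in micrometres).
for lam in (0.47, 0.55, 0.61):
    dfp, xi = chromatic_defocus(lam)
    print(f"{lam*1e3:.0f} nm: axial defocus {dfp:+.3f} mm, xi = {xi:+.3e} mm^2")
```

With these values the computed defocus changes sign on either side of λ0 = 532 nm, consistent with the centred derivative measurement described above.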

Since we anticipate a key application for this technique to be live cell microscopy of fast dynamics, we look next at a microscope imaging system, where an infinity-corrected objective lens is used in place of a 4f system. Our technique will be more accurate with higher spatial coherence [14], achieved by stopping down the condenser aperture; however, opening the condenser aperture will improve diffraction-limited resolution [15].

3. Experimental results

A MEMS deformable mirror (DM) array [31] developed for adaptive optics provides a well-characterized dynamic phase object. Using the setup shown in Fig. 2a, the generalized phase of a DM with 16 actuators addressed was imaged in reflection mode (see Fig. 2b-d). The 4f system parameters were f₁ = 200 mm, f₂ = 75 mm, and a standard Bayer color camera (Edmund Optics 3112c) was used. We find that the results are not degraded by the shifted sampling in the Bayer pattern, provided that the resolution of the camera exceeds the maximum resolution desired. More expensive 3CCD cameras and Foveon sensors would provide perfect lateral sampling; however, the spectral width of the Foveon color channels is much larger than that of the Bayer filter, giving a more noise-sensitive result. The green color channel is in focus (no phase contrast), while the red and blue channels are under and over focused, respectively (Fig. 2b). From this single color image (Fig. 2c), the solution of Eq. (1) provides a quantitative map of the generalized phase, which is then reinterpreted as height (height = φλ/(2πΔn)). The mirror was reconfigured dynamically and the height variations were captured in real-time (see Media 1), demonstrating the capability for fast high-resolution wavefront sensing.
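The processing chain for a single frame can be summarized by the sketch below, which reuses the tie_phase_fft routine from Section 2. The file name, defocus step, pixel pitch and the reflection factor Δn are placeholders for illustration only, not the calibration values used in this experiment.

```python
import numpy as np
import imageio.v3 as iio        # any image I/O library would do; imageio is assumed

# Placeholder acquisition parameters (hypothetical, not the calibrated values).
delta_xi = 2.4e-3               # xi_R - xi_B in mm^2 (sign convention of Eq. (4))
lam_mean = 550e-6               # spectrally-weighted mean wavelength in mm
delta_n  = 2.0                  # reflection-mode factor in height = phi*lambda/(2*pi*delta_n)
dx       = 3.45e-3              # camera pixel pitch in mm

img = iio.imread("dm_frame.png").astype(float)          # one captured colour frame (hypothetical file)
I_R, I_G, I_B = img[..., 0], img[..., 1], img[..., 2]

# Centred derivative: green in focus, red and blue defocused by equal and opposite amounts.
dI_dxi = (I_R - I_B) / delta_xi

phi = tie_phase_fft(dI_dxi, I0=I_G.mean(), dx=dx)       # Poisson solve sketched in Section 2
height = phi * lam_mean / (2 * np.pi * delta_n)         # phase-to-height conversion
```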

Fig. 2 a) Experimental setup for deformable mirror experiments. b-d) Phase retrieval from a single colour image. b) Red, green and blue colour channels from c) captured colour image. d) Phase solution giving inverse height profile across the mirror (video in Media 1).

Transmission brightfield microscope images and video are shown in Fig. 3 (Media 2), taken with a Nikon TE2000-U transmission microscope and an achromatic objective (20x, NA = 0.4). The top row shows the captured color images and the bottom row shows the resulting phase maps. The PMMA test object demonstrates the capabilities of our system for reconstruction of sharp surface profile gradients with nanometer-scale accuracy. The live cell samples are adult human dermal microvascular endothelial cells (HMVEC) in EGM-2MV growth medium and live HeLa cells, where results will depend on both refractive index and shape. As expected, the color images show little contrast, except for some color splitting at the sharp edges; however, the small differences between the color channels are enough to recover a robust quantitative phase map.

Fig. 3 Phase retrieval in a brightfield microscope. (Top row) Color images, (bottom row) recovered phase for (left) PMMA test object height, (middle) live HMVEC cells (also see Media 2), and (right) live HeLa cells (normalized phase).

4. Discussion

4.1 Accuracy of phase retrieval

Equation (1) is well-posed and invokes only the paraxial approximation; however, the measurement of the intensity derivative is necessarily a finite difference approximation:

$$\frac{\partial I}{\partial \xi} \approx \frac{I(\xi_R) - I(\xi_B)}{\Delta\xi} + \frac{N_R - N_B}{\Delta\xi}, \tag{4}$$
where NR and NB are the noise values in the red and blue color channels, respectively, ξR and ξB are the corresponding defocus parameters, and Δξ = ξR − ξB. The derivative measurement becomes unstable at small defocus due to amplification of noise by (Δξ)⁻¹. Increasing Δξ provides a better signal to noise ratio (SNR) in the derivative estimate; however, the linearity assumption inherent to the finite difference approximation is compromised when defocus is large [19], causing nonlinearity error. An example plot of the root mean squared (RMS) error in the recovered phase for a pure-phase test object is shown in Fig. 4, where the algorithm used intensity images having simulated defocus due to varying values of wavelength and distance. Since the error is object-dependent, we have attempted to create a general test object by using uniformly-distributed random phase values within a square boundary (see Fig. 4, inset). The error is due to the nonlinearity in the finite derivative approximation. As expected, error increases with increasing defocus Δξ = ΔλΔz and approaches zero for small defocus in this noise-free case. Thus, the noise floor will limit the accuracy of the system in a nonlinear way. To minimize nonlinearity error, we require Δξ ≪ x², where x is the characteristic size of a feature to be reconstructed.
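This tradeoff is easy to reproduce numerically. The sketch below generates a random-phase square similar in spirit to the Fig. 4 inset, simulates defocus with the paraxial transfer function used in the Appendix, and reports the RMS reconstruction error as Δξ grows; the grid size, phase range and defocus values are arbitrary choices in pixel units, not the parameters behind Fig. 4, and the tie_phase_fft routine from Section 2 is reused.

```python
import numpy as np

def defocus_field(psi, xi, dx):
    """Propagate a field by defocus xi = lambda*z using the paraxial transfer
    function H = exp(-i*pi*xi*(u^2 + v^2)) (constant phase factor omitted)."""
    u = np.fft.fftfreq(psi.shape[1], d=dx)
    v = np.fft.fftfreq(psi.shape[0], d=dx)
    uu, vv = np.meshgrid(u, v)
    H = np.exp(-1j * np.pi * xi * (uu**2 + vv**2))
    return np.fft.ifft2(np.fft.fft2(psi) * H)

rng = np.random.default_rng(0)
N, dx = 256, 1.0                            # grid size and pixel pitch (pixel units)
phi = np.zeros((N, N))
phi[96:160, 96:160] = rng.uniform(0.0, 1.0, (64, 64))   # random phase inside a square
psi = np.exp(1j * phi)                      # pure-phase object, I0 = 1

for dxi in (0.5, 2.0, 8.0, 32.0):           # increasing total defocus (noise-free)
    I_plus  = np.abs(defocus_field(psi, +dxi / 2, dx))**2
    I_minus = np.abs(defocus_field(psi, -dxi / 2, dx))**2
    dI_dxi  = (I_plus - I_minus) / dxi      # centred finite difference of Eq. (4)
    phi_rec = tie_phase_fft(dI_dxi, I0=1.0, dx=dx)
    err = phi_rec - phi
    err -= err.mean()                       # remove the undetermined constant offset
    print(f"Delta xi = {dxi:5.1f} -> RMS phase error = {np.sqrt(np.mean(err**2)):.4f} rad")
```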

Fig. 4 Nonlinearity error of a test phase object (inset) for varying values of wavelength and distance-dependent defocus. In the absence of noise, the error goes asymptotically to zero with decreasing defocus. Thus, the noise floor will determine the accuracy of the system.

To verify the accuracy of our technique, we compare our results to those obtained by a commercial interferometer in Fig. 5. The results are in close agreement, and we find that with very steep phase gradients, the commercial interferometer tends to acquire phase unwrapping artifacts that our technique does not.

Fig. 5 Comparison of results from our technique with profilometer data from a commercial interferometer. (a) Height map from Zygo interferometer compared to (b) height map from our technique (µm). (c) Cross-section along one actuator of the DM array, showing the ‘influence function’ of the DM using both techniques.

4.2 Imaging with achromats

Most microscope objectives are achromatic or apochromatic, meaning that they correct for chromatic defocus at two or more wavelengths. Achromatic objectives, which focus red and blue (but not green) wavelengths to the same position, are suitable when ∂I/∂ξ is measured as follows:

$$\frac{\partial I}{\partial \xi} \approx \frac{(I_R + I_B) - 2 I_G}{2\,\Delta\xi}, \tag{5}$$
where R, G and B denote the red, green and blue color channels and Δξ is the amount of defocus between the red/blue channels and the green channel (see Fig. 6). Conveniently, the green color channel, which occupies twice the sensor area of either the red or blue channel in a Bayer pattern, will have half the noise variance of each of the red and blue channels. Therefore, by averaging the red and blue channels, we achieve balanced noise performance. Again, the defocus should correspond to a centered derivative measure, with the red/blue channels over-focused and the green channel under-focused by the same amount.
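In code, the only change relative to the earlier sketch is the derivative estimate of Eq. (5) and the choice of in-focus intensity; the snippet below reuses the (placeholder) variable names defined in the Section 3 sketch.

```python
# Eq. (5): red and blue share a common focus, green is defocused by delta_xi
# (placeholder value, depends on the objective).
dI_dxi = ((I_R + I_B) - 2.0 * I_G) / (2.0 * delta_xi)

# The in-focus intensity is now the average of the focused red and blue channels.
phi = tie_phase_fft(dI_dxi, I0=0.5 * (I_R + I_B).mean(), dx=dx)
```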

Fig. 6 Phase imaging with achromats. (a) Typical defocus-wavelength plot for red-blue achromat, (b) phase result using Eq. (1) (very poor contrast) compared to (c) Eq. (5) with achromat.

4.3 Limitations

A limitation of our technique is that material dispersion within the object will cause wavelength-dependent phase shifts through the object, which could introduce artifacts in the phase result. Reflective objects such as the DM are immune to this effect, and we have not encountered the problem in any samples in practice. For example, for the PMMA test object of Fig. 3, we can roughly calculate the expected erroneous phase shift due to material dispersion in PMMA, which has a refractive index n at the mean color channel wavelengths of nR = 1.48858, nG = 1.49423 and nB = 1.50019. Thus, the phase shift errors will be approximately 0.4% of the total phase shift, or less than 1 nm for the object in Fig. 3.

For the biological samples, we verify negligible material dispersion by comparing our result with traditional TIE imaging (Fig. 7). The traditional TIE result was obtained from two oppositely focused images, taken by moving the sample between captures by a distance giving approximately the same defocus as that which occurs between the color channels, but using data from the green color channel only. The difference map for the two phase solutions (Fig. 7c) shows no obvious systematic differences where the cells are and is mainly a result of the differential noise between the images, giving a cloudy effect. Some small differences at the edges of the cells could be a result of imperfect image registration between the two intensity images in the traditional TIE reconstruction, or of movement of the cells between the two captures. Thus, material dispersion is not a problem in these results.

Fig. 7 Comparison of phase results from our technique with traditional TIE. (a) Phase from traditional TIE using two images (radians), (b) phase from our technique using a single color image (radians) and (c) the difference between the two results (radians).

Finally, we have assumed throughout this work that the wavefronts associated with the three colors can be approximated as coming from a single point source, so as not to induce significant errors due to multiple coherent modes [32].

5. Conclusion

We have proposed and demonstrated a new method for high-resolution quantitative phase imaging that is inexpensive, fast and accurate. Through a mathematical manipulation of the original TIE, the mechanical axial defocus is replaced by chromatic aberration. In this way, no physical movement is needed and only a standard color camera is required. Since chromatic aberration is inherent in any refractive imaging system with a broadband source, our technique turns a detrimental element of the image formation process into a direct means of phase retrieval. Furthermore, quantitative phase can be captured and computed at camera-limited speeds in a conventional brightfield microscope, which has great potential for imaging of dynamic and interactive biological processes, as well as for fast, high-resolution wavefront sensing in adaptive optics and large-scale density measurements. In developing this technique, we have opened the door to a new perspective on chromatic dispersion in imaging systems and described how to use it to gain quantitative information. The proposed treatment turns any conventional optical microscope into a real-time phase imaging device with low cost and no hardware modification.

Appendix: Derivation of Eq. (1)

Start with a complex object $\psi(x,y) = A(x,y)\,e^{i\varphi(x,y)}$, where A(x,y) is amplitude, φ(x,y) is phase and x,y denote the lateral dimensions. Assuming plane wave illumination, the field after the object is ψ(x,y) and propagates along the optical axis z according to the Fresnel propagation kernel [21],

$$h(x,y;\xi) = \frac{e^{i2\pi z/\lambda}}{i\xi}\,\exp\!\left\{\frac{i\pi}{\xi}\left(x^2 + y^2\right)\right\}.$$

At a defocus of ξ, the recorded intensity is,

$$I(x,y;\xi) = \left|\psi(x,y) \ast h(x,y;\xi)\right|^2 = \left|\mathcal{F}^{-1}\!\left\{\Psi(u,v)\,H(u,v;\xi)\right\}\right|^2,$$
where $\mathcal{F}^{-1}$ denotes the 2D inverse Fourier transform, u,v are the spatial frequency variables in Fourier space, and Ψ(u,v) and H(u,v;ξ) are the 2D Fourier transforms of ψ(x,y) and h(x,y;ξ), respectively. We Taylor expand H(u,v;ξ) and linearize with respect to ξ:

$$H(u,v;\xi) = 1 - i\pi\xi\left(u^2+v^2\right) - \frac{(\pi\xi)^2\left(u^2+v^2\right)^2}{2!} - \cdots \;\approx\; 1 - i\pi\xi\left(u^2+v^2\right).$$

Then, using the Fourier-space correspondence ∇ ↔ i2π(u,v), we follow a derivation parallel to that of Beleggia et al. [20] and find, after some algebra,

$$I(x,y;\xi) = \left|\psi(x,y) + \frac{i\xi\,\nabla^2\psi(x,y)}{4\pi}\right|^2 = I_0(x,y) - \frac{\xi}{2\pi}\,\nabla\cdot\!\left(I_0(x,y)\,\nabla\varphi(x,y)\right),$$
where I0(x,y) is the in-focus intensity. In the limit of small defocus (ξ → 0), this leads to:

$$\frac{I(x,y;\xi) - I_0(x,y)}{\xi} \;\longrightarrow\; \frac{\partial I}{\partial \xi} = -\frac{\nabla\cdot\left[\,I_0(x,y)\,\nabla\varphi(x,y)\right]}{2\pi}.$$
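As a numerical sanity check on this linearization, the short sketch below compares the two sides of the defocused-intensity expression above for a smooth, weakly absorbing test field; the test parameters are arbitrary choices in pixel units, and the residual should be of the order of the neglected O(ξ²) term.

```python
import numpy as np

def spectral_grad(f, dx):
    """Spectral x- and y-derivatives (periodic boundaries)."""
    u = np.fft.fftfreq(f.shape[1], d=dx)
    v = np.fft.fftfreq(f.shape[0], d=dx)
    uu, vv = np.meshgrid(u, v)
    F = np.fft.fft2(f)
    return np.fft.ifft2(2j * np.pi * uu * F), np.fft.ifft2(2j * np.pi * vv * F)

def spectral_lap(f, dx):
    """Spectral Laplacian, consistent with the Fourier convention above."""
    u = np.fft.fftfreq(f.shape[1], d=dx)
    v = np.fft.fftfreq(f.shape[0], d=dx)
    uu, vv = np.meshgrid(u, v)
    return np.fft.ifft2(-4.0 * np.pi**2 * (uu**2 + vv**2) * np.fft.fft2(f))

N, dx, xi = 256, 1.0, 5.0                                # grid, pixel pitch, small defocus
x = (np.arange(N) - N // 2) * dx
X, Y = np.meshgrid(x, x)
phi = 0.3 * np.exp(-(X**2 + Y**2) / (2 * 30.0**2))       # smooth test phase
A   = 1.0 + 0.05 * np.exp(-(X**2 + Y**2) / (2 * 50.0**2))  # weak amplitude variation
psi = A * np.exp(1j * phi)
I0  = np.abs(psi)**2

# Left-hand side: intensity of the linearized defocused field.
lhs = np.abs(psi + 1j * xi * spectral_lap(psi, dx) / (4.0 * np.pi))**2

# Right-hand side: I0 - (xi / 2*pi) * div( I0 * grad(phi) ).
px, py = spectral_grad(phi, dx)
dvx, _ = spectral_grad(I0 * np.real(px), dx)
_, dvy = spectral_grad(I0 * np.real(py), dx)
rhs = I0 - xi / (2.0 * np.pi) * np.real(dvx + dvy)

print("max |LHS - RHS| =", np.max(np.abs(lhs - rhs)))    # small, of order xi**2
```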

Acknowledgements

The authors thank Boston Micromachines, Inc. for use of the DM array, S. Yang for fabrication of the test object, N. Loomis for GPU code, Prof. Roger D. Kamm’s Lab and the Spectroscopy Lab at MIT for cell samples, the Boston University Photonics Centre for use of the commercial interferometer, and the anonymous reviewers for constructive criticism on the original manuscript. We also gratefully acknowledge the Singapore-MIT Alliance for Research and Technology (SMART) Centre for financial support.

References and links

1. F. Zernike, “How I discovered phase contrast,” Science 121(3141), 345–349 (1955). [CrossRef]   [PubMed]  

2. C. J. R. Sheppard, “Defocused transfer function for a partially coherent microscope and application to phase retrieval,” J. Opt. Soc. Am. A 21(5), 828–831 (2004). [CrossRef]  

3. H. Hopkins, “The frequency response of a defocused optical system,” Proc. Royal Soc. London, Ser. A 231(1184), 91–103 (1955). [CrossRef]  

4. F. Zernike, “Phase contrast, a new method for the microscopic observation of transparent objects,” Physica 9(7), 686–698 (1942). [CrossRef]  

5. P. Marquet, B. Rappaz, P. J. Magistretti, E. Cuche, Y. Emery, T. Colomb, and C. Depeursinge, “Digital holographic microscopy: a noninvasive contrast imaging technique allowing quantitative visualization of living cells with subwavelength axial accuracy,” Opt. Lett. 30(5), 468–470 (2005). [CrossRef]   [PubMed]  

6. G. Popescu, L. P. Deflores, J. C. Vaughan, K. Badizadegan, H. Iwai, R. R. Dasari, and M. S. Feld, “Fourier phase microscopy for investigation of biological structures and dynamics,” Opt. Lett. 29(21), 2503–2505 (2004). [CrossRef]   [PubMed]  

7. J. Millerd, N. Brock, J. Hayes, M. North-Morris, B. Kimbrough, and J. Wyant, Fringe 2005: Pixelated Phase-mask Dynamic Interferometers (Springer, 2003).

8. B. C. Platt and R. Shack, “History and principles of Shack–Hartmann wavefront sensing,” J. Refract. Surg. 17(5), S573–S577 (2001). [PubMed]  

9. X. Cui, J. Ren, G. J. Tearney, and C. Yang, “Wavefront image sensor chip,” Opt. Express 18(16), 16685–16701 (2010). [CrossRef]   [PubMed]  

10. L. Waller, and G. Barbastathis, “Phase from Defocused Color Images,” in Frontiers in Optics, OSA Technical Digest (CD) (Optical Society of America, 2009), paper FThR3.

11. M. Born, and E. Wolf, Principles of Optics (Cambridge Univ. Press, 1999).

12. H. King, The History of the Telescope (Dover, 2003).

13. M. Teague, “Deterministic phase retrieval: a Green’s function solution,” J. Opt. Soc. Am. 73(11), 1434–1441 (1983). [CrossRef]  

14. D. Paganin and K. Nugent, “Noninterferometric phase imaging with partially coherent light,” Phys. Rev. Lett. 80(12), 2586–2589 (1998). [CrossRef]  

15. E. D. Barone-Nugent, A. Barty, and K. A. Nugent, “Quantitative phase-amplitude microscopy I: optical microscopy,” J. Microsc. 206(Pt 3), 194–203 (2002). [CrossRef]   [PubMed]  

16. N. Streibl, “Phase imaging by the transport equation of intensity,” Opt. Commun. 49(1), 6–10 (1984). [CrossRef]  

17. L. Waller, Y. Luo, S.-Y. Yang, and G. Barbastathis, “Transport of intensity phase imaging in a volume holographic microscope,” Opt. Lett. 35(17), 2961–2963 (2010). [CrossRef]   [PubMed]  

18. S. S. Kou, L. Waller, G. Barbastathis, and C. J. Sheppard, “Transport-of-intensity approach to differential interference contrast (TI-DIC) microscopy for quantitative phase imaging,” Opt. Lett. 35(3), 447–449 (2010). [CrossRef]   [PubMed]  

19. L. Waller, L. Tian, and G. Barbastathis, “Transport of Intensity phase-amplitude imaging with higher order intensity derivatives,” Opt. Express 18(12), 12552–12561 (2010). [CrossRef]   [PubMed]  

20. M. Beleggia, M. Schofield, V. Volkov, and Y. Zhu, “On the transport of intensity technique for phase retrieval,” Ultramicroscopy 102, 37–49 (2004). [CrossRef]   [PubMed]  

21. J. Goodman, Introduction to Fourier Optics, (McGraw-Hill, 1996).

22. B. Saleh, and M. Teich, Fundamentals of Photonics (John Wiley & Sons, 2010).

23. T. Gureyev and S. Wilkins, “On X-ray phase retrieval from polychromatic images,” Opt. Commun. 147, 229–232 (1998) (Erratum: Opt. Commun. 154, 391). [CrossRef]  

24. T. E. Gureyev, S. Mayo, S. W. Wilkins, D. Paganin, and A. W. Stevenson, “Quantitative in-line phase-contrast imaging with multienergy X rays,” Phys. Rev. Lett. 86(25), 5827–5830 (2001). [CrossRef]   [PubMed]  

25. M. A. Anastasio, Q. Xu, and D. Shi, “Multispectral intensity diffraction tomography: single material objects with variable densities,” J. Opt. Soc. Am. A 26(2), 403–412 (2009). [CrossRef]  

26. G. Strang, Computational Science and Engineering (Wellesley-Cambridge Press, 2010).

27. L. Allen and M. Oxley, “Phase retrieval from series of images obtained by defocus variation,” Opt. Commun. 199(1-4), 65–75 (2001). [CrossRef]  

28. N. Loomis, L. Waller, G. Barbastathis, “High-speed phase recovery using chromatic transport of intensity computation in graphics processing units,” Proc. Digital Holography meeting of the OSA: JMA7 (2010).

29. G. Molesini and F. Quercioli, “Pseudocolor effects of longitudinal chromatic aberration,” J. Opt. (Paris) 17, 279–282 (1986).

30. J. S. Courtney-Pratt and R. L. Gregory, “Microscope with enhanced depth of field and 3-D capability,” Appl. Opt. 12(10), 2509–2519 (1973). [CrossRef]   [PubMed]  

31. T. Bifano, R. K. Mali, J. K. Dorton, J. A. Perreault, N. Vandelli, M. N. Horenstein, and D. A. Castanon, “Continuous-membrane silicon deformable mirror,” Opt. Eng. 36, 1354–1360 (1997). [CrossRef]  

32. A. M. Zysk, R. W. Schoonover, P. S. Carney, and M. A. Anastasio, “Transport of intensity and spectrum for partially coherent fields,” Opt. Lett. 35(13), 2239–2241 (2010). [CrossRef]   [PubMed]  

Supplementary Material (2)

Media 1: MPEG (1863 KB)     
Media 2: MPEG (1076 KB)     
