
Projected fringe profilometry using a liquid-crystal spatial light modulator to extend the depth measuring range

Open Access

Abstract

An approach using a liquid-crystal spatial light modulator (LC-SLM) to enlarge the depth measuring range of projected fringe profilometry is presented. This approach is especially suitable for inspecting dynamic objects with micro-scale sizes. Compared with a typical 2D imaging system, the LC-SLM provides better performance in a 3D shape sensing system. The main advantages include (1) a much larger achievable depth measuring range, (2) ease of compensating perspective and geometric distortion, (3) very high accuracy (in the micron range), and (4) the need for only one phase measurement per acquisition.

©2011 Optical Society of America

1. Introduction

Wave-front coding with a cubic phase mask is a well-known technique for extending the depth of field (DOF) of an incoherent optical system [1–4]. With this cubic phase mask, the point-spread function is insensitive to defocus. By deconvolving the observed out-of-focus image, an unblurred image can be retrieved. With a proper strength of the phase mask, a tenfold increase in the DOF can be achieved. Compared with other techniques [5–7], it extends the depth measuring range without seriously reducing the optical power or the image resolution.

The DOF increases as the strength of the phase mask is increased. Unfortunately, increasing the mask strength reduces the point-spread function at high frequencies, resulting in a lower spatial resolution of the retrieved image. There is therefore a trade-off between the extended DOF and the image resolution, and optimizing the extended DOF by choosing a proper mask strength is somewhat time-consuming. Recently, Hong et al. [8] presented a flexible DOF system using a liquid-crystal spatial light modulator (LC-SLM). With a proper choice of the polarizer and analyzer angles in front of and behind the LC-SLM, the device can serve as a cubic phase mask whose strength is tunable by the external video signal, providing a convenient way to identify the appropriate mask strength. However, the accuracy was limited by the pixel resolution of the LC-SLM, and additional modulation errors occurred because of the nonlinear signal response and the inconstant intensity output.

The above techniques were developed for 2D imaging problems. For a 3D shape sensing system, extending the DOF is also necessary. For example, projected fringe profilometry [9] is a powerful tool for describing the 3D shape of an inspected object. Its depth measuring range is confined by the depth of field of the image acquisition system and the depth of focus of the fringe projection system. To inspect micro-scale objects, microscopes are usually adopted in both the image acquisition system and the fringe projection system [10, 11], so the depth measuring range is limited by the DOF of the microscopes. Our previous work has shown that supercontinuum illumination can be employed to extend the DOF of the fringe projection system [12]. However, that approach can only be applied to objects with large-scale sizes. When a microscope is employed for detection, an approach to enlarge the DOF of the image acquisition system is still required.

In this paper, we investigate an approach based on the LC-SLM to enlarge the depth measuring range of projected fringe profilometry. A microscope combined with a wide-angle eyepiece lens is employed to project a fringe pattern onto the inspected surface. From a different viewpoint, a CCD camera observes the projected fringes through another microscope and a cubic phase mask. The phase of the fringes is extracted by the Fourier transform method [9]. With the presented calibration schemes [13, 14], depth information can be determined from the phase of the fringes. The cubic phase mask enlarges the depth of field of the image acquisition system, while the wide-angle eyepiece lens increases the depth of focus of the fringe projection system. The depth measuring range could be extended up to 1600 μm, even though the DOF of the microscope was only 80 μm.

Compared with 2D imaging problems, the depth measuring range is greatly extended. For example, the LC-SLM extended the DOF of the microscope from 80 μm to 600 μm for 2D imaging, but the details of the retrieved image were damaged. In our 3D shape measurements, even though the depth measuring range was extended up to 1600 μm, the systematic accuracy remained very high, in the micron range. In addition, the limitations of the LC-SLM (i.e., low pixel resolution, nonlinear signal response, and inconstant intensity output) are not significant in 3D shape measurements: noise caused by the LC-SLM can be eliminated by a band-pass filter. Moreover, the cubic phase mask increases the system tolerance to focus-related aberrations, such as field curvature, spherical aberration, and chromatic aberration, but it is difficult to simultaneously reduce perspective distortion and geometric distortion. In our 3D shape measurements, such errors could be compensated well by the calibration scheme. Thus, a high-accuracy, non-scanning, full-field 3D shape sensing system with a very large depth measuring range is realized.

2. Theory

2.1 Extending the DOF using the cubic phase mask

Figure 1 shows an incoherent optical system. All imaging elements are lumped into a single black box whose effective focal length is denoted as f. An exit pupil is located at the x-y plane. A planar object with intensity Io(xo, yo) is placed a distance do in front of the entrance pupil, and an image Ii(xi, yi) appears at a distance di behind the exit pupil.

Fig. 1 Configuration of an incoherent system.

The intensity distribution on the image plane (xi, yi) can be expressed as [15]

$$I_i(x_i,y_i,\psi)=\iint\left|h(x_i,y_i,\psi;x_o,y_o)\right|^{2}I_o(x_o,y_o)\,dx_o\,dy_o,\tag{1}$$
where |h(xi, yi, ψ; xo, yo)|² is the point-spread function and ψ is the defocus parameter, given by
$$\psi=\frac{\pi}{\lambda}\left(\frac{1}{d_o}+\frac{1}{d_i}-\frac{1}{f}\right).\tag{2}$$
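
As a quick numerical illustration of Eq. (2) (the wavelength and distances below are arbitrary example values, not taken from the paper), the defocus parameter can be evaluated as in the following Python sketch:

```python
import numpy as np

def defocus_parameter(d_o, d_i, f, wavelength):
    """Defocus parameter psi of Eq. (2); all lengths in metres."""
    return (np.pi / wavelength) * (1.0 / d_o + 1.0 / d_i - 1.0 / f)

# Hypothetical example: a 50 mm lens imaged at unit magnification.
psi_focused = defocus_parameter(d_o=0.100, d_i=0.100, f=0.050, wavelength=632.8e-9)
psi_defocus = defocus_parameter(d_o=0.102, d_i=0.100, f=0.050, wavelength=632.8e-9)
# psi_focused is zero (the imaging condition is met); psi_defocus is not.
```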

For a space-invariant system, the point spread function can be represented as

$$h(x_i,y_i,\psi;x_o,y_o)=h(x_i-Mx_o,\,y_i-My_o,\,\psi)=\frac{A}{\lambda^{2}d_i d_o}\iint P(x,y)\exp\!\left[j\psi\left(x^{2}+y^{2}\right)\right]\exp\!\left\{-j\frac{2\pi}{\lambda d_i}\left[(x_i-Mx_o)x+(y_i-My_o)y\right]\right\}dx\,dy,\tag{3}$$
where M is the magnification of the system, λ is the wavelength of the light, and P is the square pupil function. The optical transfer function (OTF) of the incoherent system is the Fourier transform of the point-spread function,
$$H(f_x,f_y,\psi)=\iint P\!\left(\tau_x+\frac{f_x}{2},\,\tau_y+\frac{f_y}{2}\right)e^{\,j\frac{k}{2}\left[\left(\tau_x+\frac{f_x}{2}\right)^{2}+\left(\tau_y+\frac{f_y}{2}\right)^{2}\right]\psi}\times P^{*}\!\left(\tau_x-\frac{f_x}{2},\,\tau_y-\frac{f_y}{2}\right)e^{-j\frac{k}{2}\left[\left(\tau_x-\frac{f_x}{2}\right)^{2}+\left(\tau_y-\frac{f_y}{2}\right)^{2}\right]\psi}\,d\tau_x\,d\tau_y,\tag{4}$$
where "*" denotes the complex conjugate operation, and fx and fy are the spatial frequencies. The image intensity can therefore be obtained through the convolution theorem,
$$\mathcal{F}\{I_i(x_i,y_i,\psi)\}=H(f_x,f_y,\psi)\,\mathcal{F}\{I_o(x_o,y_o)\},\tag{5}$$
where ℱ{·} denotes the Fourier transform operation. Our purpose is to design a specific pupil function on the square aperture so that H is not sensitive to defocus, i.e.,

$$H(f_x,f_y,\psi)\approx H(f_x,f_y,0).\tag{6}$$

In such a case, the object function Io(xo, yo) can be retrieved by deconvolving the defocused image Ii(xi, yi, ψ), as expressed mathematically by

$$I_o(x_o,y_o)=\mathcal{F}^{-1}\!\left\{\frac{\mathcal{F}\{I_i(x_i,y_i,\psi)\}}{H(f_x,f_y,\psi)}\right\}\approx\mathcal{F}^{-1}\!\left\{\frac{\mathcal{F}\{I_i(x_i,y_i,\psi)\}}{H(f_x,f_y,0)}\right\},\tag{7}$$
where ℱ⁻¹{·} denotes the inverse Fourier transform operation. It has been shown that H can be insensitive to defocus if the pupil function is given by [1]
$$P(x,y)=\begin{cases}\dfrac{1}{2}\exp\!\left[j\alpha\left(x^{3}+y^{3}\right)\right], & |x|\le L/2,\ |y|\le L/2\\[4pt] 0, & \text{otherwise},\end{cases}\tag{8}$$
where α is a constant value, and L is the width of the square aperture. The OTF becomes

$$H(f_x,f_y,\psi)\approx\begin{cases}\dfrac{\pi}{12\alpha}\dfrac{1}{\sqrt{\left|f_x f_y\right|}}\exp\!\left[j\dfrac{\alpha}{4}\left(f_x^{3}+f_y^{3}\right)\right]\exp\!\left[j\dfrac{\psi^{2}}{3\alpha}\left(f_x+f_y\right)\right], & \text{if } f_x f_y\neq 0\\[6pt] 1, & \text{if } f_x f_y=0.\end{cases}\tag{9}$$

For a large value of α, both ψ²fx and ψ²fy are much smaller than α, and Eq. (9) can then be represented as

$$H(f_x,f_y,\psi)\approx H(f_x,f_y,0)=\begin{cases}\dfrac{\pi}{12\alpha}\dfrac{1}{\sqrt{\left|f_x f_y\right|}}\exp\!\left[j\dfrac{\alpha}{4}\left(f_x^{3}+f_y^{3}\right)\right], & \text{if } f_x f_y\neq 0\\[6pt] 1, & \text{if } f_x f_y=0.\end{cases}\tag{10}$$

Consequently, a large value of α minimizes the sensitivity to defocus. An example of the pupil function is depicted in Fig. 2, in which α = 40.
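
For concreteness, the cubic pupil phase of Eq. (8) can be sampled on a grid. This is only a sketch, with coordinates normalized to the aperture half-width so that α acts as a dimensionless mask strength (an assumption, since the normalization is not stated explicitly here):

```python
import numpy as np

def cubic_pupil(alpha, n=512):
    """Sampled pupil function of Eq. (8) over the square aperture,
    with x and y normalized to [-1, 1]."""
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    return 0.5 * np.exp(1j * alpha * (X**3 + Y**3))

P = cubic_pupil(alpha=40)      # alpha = 40, as in Fig. 2
phase = np.angle(P)            # wrapped phase profile, cf. Fig. 2
```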

Fig. 2 Phase profile of the pupil function.

A simulation was performed to illustrate how the pupil function extends the DOF. Distributions of the OTFs are shown in Fig. 3, in which α = 90. The shape of the OTF does not vary much as ψ changes, so the DOF is increased. However, the point-spread function at high frequencies is reduced, and noise might be introduced into the retrieved image.
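
A simulation of this kind can be sketched as follows (normalized pupil coordinates; the defocus enters as the quadratic phase of Eq. (3), and the OTF is obtained as the Fourier transform of the point-spread function, which is equivalent to the autocorrelation form of Eq. (4)). The α and ψ values are those quoted in the text and figure captions; everything else is an illustrative assumption:

```python
import numpy as np

def simulated_otf(alpha, psi, n=256):
    """Incoherent OTF of a square aperture with cubic mask strength alpha and
    defocus psi: OTF = FT{ |FT{generalized pupil}|^2 }, normalized at DC."""
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    pupil = np.exp(1j * alpha * (X**3 + Y**3)) * np.exp(1j * psi * (X**2 + Y**2))
    pupil = np.pad(pupil, n)                              # zero-pad to resolve the PSF
    psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil))))**2
    H = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf)))
    return H / H[H.shape[0] // 2, H.shape[1] // 2]

# With the mask (alpha = 90) the OTF magnitude changes little over psi = 10, 15, 40;
# without it (alpha = 0) the same defocus collapses the high frequencies.
otfs_masked = [simulated_otf(alpha=90, psi=p) for p in (10, 15, 40)]
otfs_clear  = [simulated_otf(alpha=0,  psi=p) for p in (10, 15, 40)]
```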

Fig. 3 Magnitude of the optical transfer functions: (a) ψ = 10, (b) ψ = 15, and (c) ψ = 40.

2.2 Using the liquid crystal spatial light modulator (LC-SLM) as the phase mask

A liquid-crystal spatial light modulator (LC-SLM) combined with an analyzer and a polarizer can be employed as a phase modulator [16]. An example is shown in Fig. 4. We used a HOLOEYE LC 2002 as the LC-SLM, with a pixel resolution of 800 × 600 and a pixel pitch of 32 μm × 32 μm. With a proper choice of the polarization angles θp and θa, the phase is modulated by the input video signal.

Fig. 4 Phase modulation using the LC-SLM.

However, the phase modulation was not a linear function of the input signal, and the light intensity also varied with the input signal. Figure 5 shows the measured correspondence between the modulations and the input signals, with a He-Ne laser used as the light source. In our experiments, the phase modulation was approximately linear when the input gray level ranged from 50 to 190, and the intensity variation was less than 2.5% over the same range.
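
In practice, a measured response like Fig. 5(a) can be inverted numerically to find the gray level that produces a desired phase. The sketch below assumes a hypothetical tabulated response over the approximately linear range reported in the text (gray levels 50 to 190); the numbers are placeholders, not measured data:

```python
import numpy as np

# Placeholder calibration table standing in for the measured curve of Fig. 5(a).
gray_levels    = np.array([50, 78, 106, 134, 162, 190])
measured_phase = np.array([0.00, 0.32, 0.64, 0.96, 1.28, 1.60]) * np.pi

def gray_for_phase(target_phase):
    """Invert the measured phase response by interpolation, returning the
    8-bit gray level that produces the requested phase."""
    return np.round(np.interp(target_phase, measured_phase, gray_levels)).astype(np.uint8)
```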

Fig. 5 Response of the LC-SLM when θp = 170° and θa = −150°: (a) phase modulation, and (b) intensity variation.

The LC-SLM modulated the phase linearly between 0 and 1.6π. Thus, the phase of the pupil function should be wrapped with a 1.6π modulo operation. The residue of the pupil phase in Fig. 2 after the modulo operation is shown in Fig. 6. This pattern is then used as the input video signal.
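
The wrapping step can be sketched as follows. The 1.6π range and the 50–190 gray-level window come from the text; the linear mapping of the residue onto that window is an assumed idealization of the measured response:

```python
import numpy as np

PHASE_RANGE = 1.6 * np.pi        # linear modulation range of the LC-SLM

def slm_input_signal(pupil_phase):
    """Wrap the cubic pupil phase with the 1.6*pi modulo operation and map the
    residue onto gray levels 50-190 for the input video signal."""
    residue = np.mod(pupil_phase, PHASE_RANGE)
    return np.round(50 + 140 * residue / PHASE_RANGE).astype(np.uint8)

# Example: signal = slm_input_signal(90 * (X**3 + Y**3)),
# with X, Y sampled as in the pupil-function sketch above.
```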

Fig. 6 Distribution of the input signals ranging from level 50 to 190: brighter regions represent stronger input signals.

The LC-SLM was combined with a microscope to form an extended DOF system, as shown in Fig. 7(a). A square aperture 15 mm × 15 mm in size was placed on the LC-SLM, so that Eq. (8) was applicable to this system. A simple way to identify a suitable α value for this system is illustrated in Fig. 7(b). A halogen lamp was selected as the incoherent light source, and the light was launched through a pinhole to generate a point source. For a suitable value of α, the OTF obtained from the CCD camera should be similar to those shown in Fig. 3 and should not be sensitive to defocus. Once the α value was identified, this extended DOF system could be used as the image acquisition system of the projected fringe profilometry.
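
One way to make this check quantitative is sketched below; the similarity metric (correlation of OTF magnitudes derived from the recorded point-source images) and the idea of scoring candidate α values are our own assumptions, not a procedure stated in the paper:

```python
import numpy as np

def otf_magnitude(psf_image):
    """Normalized OTF magnitude computed from a recorded point-source image."""
    H = np.abs(np.fft.fftshift(np.fft.fft2(psf_image)))
    return H / H.max()

def defocus_insensitivity(psf_images):
    """Mean correlation between each defocused OTF and the in-focus one
    (psf_images[0] is the in-focus recording). Values near 1 indicate an
    OTF that barely changes with defocus, i.e. a suitable alpha."""
    H0 = otf_magnitude(psf_images[0]).ravel()
    return float(np.mean([np.corrcoef(H0, otf_magnitude(p).ravel())[0, 1]
                          for p in psf_images[1:]]))
```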

Fig. 7 (a) Extended DOF system using an LC-SLM as the phase mask. (b) Optical configuration for observing the impulse response of the extended DOF system.

3. Principle of the projected fringe profilometry

3.1 Configuration

A schematic diagram for 3D shape sensing is shown in Fig. 8. A sinusoidal fringe pattern is illuminated by the incoherent light source and projected by the microscope onto the inspected object. A wide-angle lens is employed as the eyepiece lens, so the depth of focus of the fringe projection system is enlarged. The reflected wavefront of the projected fringes is collected by another microscope and encoded by the cubic phase mask, which is formed by an LC-SLM combined with an analyzer and a polarizer. The encoded wavefront is observed by a monochrome CCD camera. The phases of the projected fringes are evaluated by the Fourier transform method [9]. Unwrapping is then performed to reconstitute the continuous phase distribution; we used Goldstein’s algorithm [17] to restore the absolute phases.
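
A minimal sketch of the Fourier-transform phase extraction [9] is given below. The carrier-lobe location and filter width are placeholders that would be set from the projected fringe frequency; unwrapping (e.g., Goldstein’s algorithm [17]) is left to a separate routine:

```python
import numpy as np

def extract_wrapped_phase(fringe_image, carrier_col, half_width):
    """Fourier-transform method: isolate the +1st-order carrier lobe with a
    band-pass window, shift it to zero frequency, and take the phase angle."""
    F = np.fft.fftshift(np.fft.fft2(fringe_image))
    window = np.zeros_like(F)
    c0 = F.shape[1] // 2 + carrier_col                 # column of the +1st-order lobe
    window[:, c0 - half_width:c0 + half_width] = 1.0   # simple rectangular band-pass
    lobe = np.roll(F * window, -carrier_col, axis=1)   # remove the carrier frequency
    field = np.fft.ifft2(np.fft.ifftshift(lobe))
    return np.angle(field)                             # wrapped phase, to be unwrapped
```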

Fig. 8 Schematic setup of a projected fringe profilometric system using the phase mask.

3.2 Calibration

Our previous work [13] has shown that the correspondence between the unwrapped phase φ and the depth position z can be determined by

$$z(\varphi)=\sum_{n=0}^{N}c_n\varphi^{\,n},\tag{11}$$
where cn are coefficients of the polynomial. To identify these coefficients, a flat plate was selected as the inspected sample. As shown in Fig. 9(a), this plate was mounted on a translation stage and could be moved along the z-axis. The CCD camera recorded the projected fringes on the flat surface. Phase measurements were repeated as the plate was successively translated to different depth positions (at planes z = z1, z2, …, zi). A series of unwrapped phases and the associated depths was thus obtained at each pixel location. With a curve fitting algorithm, the coefficients in Eq. (11) could be determined.
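
An illustrative per-pixel implementation of this fit is sketched below (the array shapes and polynomial order are assumptions; the paper does not specify N):

```python
import numpy as np

def fit_depth_calibration(phases, depths, order=3):
    """Fit Eq. (11), z(phi) = sum_n c_n * phi^n, independently at every pixel.
    phases: (K, H, W) unwrapped phase maps at K known plate positions;
    depths: (K,) stage positions z_1 ... z_K.
    Returns coefficients of shape (order + 1, H, W), highest power first."""
    K, H, W = phases.shape
    coeffs = np.empty((order + 1, H, W))
    for i in range(H):
        for j in range(W):
            coeffs[:, i, j] = np.polyfit(phases[:, i, j], depths, order)
    return coeffs

def phase_to_depth(coeffs, phase_map):
    """Evaluate the fitted polynomial at each pixel (Horner, highest power first)."""
    z = np.zeros_like(phase_map)
    for c in coeffs:
        z = z * phase_map + c
    return z
```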

Fig. 9 Systematic calibration: (a) depth calibration, and (b) lateral calibration.

Note that the LC-SLM increased the DOF of the image acquisition system. Thus, the calibrated depth range (i.e., the distance between z1 and zi) was enlarged.

The transverse positions x and y can also be expressed as functions of z, as given by

$$\begin{cases}x=a_1 z+a_o\\[2pt] y=b_1 z+b_o,\end{cases}\tag{12}$$
where a1, ao, b1, and bo are undetermined coefficients. To identify these coefficients, two 1D sinusoidal patterns, with fringes parallel to the x and y directions respectively, were sequentially placed at planes z = z1, z2, …, zi. The setup is depicted in Fig. 9(b), in which only the pattern with fringes parallel to the x direction is shown. Again, phase measurements were performed at various depth positions. Since the period of the fringes was known, the absolute phases could be converted to transverse positions. With a subsequent curve fitting process, the coefficients at each pixel could be determined.

For an extended DOF system, lateral distortion due to the perspective projection of the imaging geometry might appear in the retrieved image. With the presented calibration scheme, such perspective distortion could be compensated.

Once the coefficients in Eq. (11) and Eq. (12) were identified, the depth position of an inspected object could be determined from the unwrapped phase, and the corresponding transverse position could then be computed from the depth value.
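
Continuing the sketch above, the lateral coefficients of Eq. (12) can be fitted per pixel in the same manner, after which one unwrapped phase map yields a full set of (x, y, z) coordinates. Shapes and helper names are again our own assumptions:

```python
import numpy as np

def fit_lateral_calibration(lateral_positions, depths):
    """Fit x = a1*z + a0 (or y = b1*z + b0) at every pixel.
    lateral_positions: (K, H, W) transverse positions recovered from the known
    fringe period at K depth planes; depths: (K,) stage positions.
    Returns (slope, offset), each of shape (H, W)."""
    K, H, W = lateral_positions.shape
    A = np.vstack([depths, np.ones(K)]).T                       # (K, 2) design matrix
    sol, *_ = np.linalg.lstsq(A, lateral_positions.reshape(K, -1), rcond=None)
    return sol[0].reshape(H, W), sol[1].reshape(H, W)

def reconstruct(z_coeffs, a1, a0, b1, b0, phase_map):
    """Unwrapped phase map -> (x, y, z) via Eq. (11) and then Eq. (12)."""
    z = phase_to_depth(z_coeffs, phase_map)   # from the depth-calibration sketch
    return a1 * z + a0, b1 * z + b0, z
```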

4. Experiments

4.1 3D profile measurements

To identify the depth measuring range and the systematic accuracy, a flat surface with a roughness of 1 μm was inspected. Figure 10 depicts the inspection configuration. The x-z plane was located in the figure plane, and the y-axis was normal to the figure plane. The flat surface was mounted on a translation stage and could be moved along the z-axis.

Fig. 10 Schematic setup for the inspected surface.

In our setup, θo = 20° and θn = 30°. The configuration of the image acquisition system is shown in Fig. 7(a). The magnification of the microscopes was 10×. In the in-focus area, the field of view of the image acquisition system was approximately 400 μm × 400 μm, mainly constrained by the square aperture on the LC-SLM. In the fringe projection system, a wide-angle lens was utilized as the eyepiece lens, so the depth of focus of the fringe projection system was enlarged. Figure 11 shows the appearance of the flat surface at various depth positions with the phase mask applied. A CCD camera with 1024 × 1024 pixels and a 12-bit dynamic range was used to record the images. The DOF of the image acquisition system was approximately 80 μm; thus, the fringes were severely blurred at z = −750 μm and z = 700 μm. The fringe contrast also decreased with the defocus introduced by the tilt, as shown in Fig. 11(b).

Fig. 11 Appearance of the recorded images at (a) z = −750 μm, (b) z = 30 μm, and (c) z = 700 μm.

The images shown in Fig. 11 were processed with Eq. (7) to retrieve the unblurred data sets; the results are shown in Fig. 12. Figure 13 shows the wrapped phases extracted by the Fourier transform method. The phase fluctuated in the in-focus area because the inspected surface was not diffusive enough: the observed image in that area was not uniformly bright, so the signal-to-noise ratio there was low and errors occurred.
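
The restoration of Eq. (7) applied to a recorded image can be sketched as below. The in-focus OTF H0 would come from the chosen α (e.g., the simulated OTF above) or from a calibration measurement; the small constant eps is our own addition to keep the division stable where H0 is nearly zero:

```python
import numpy as np

def restore_image(defocused_image, H0, eps=1e-3):
    """Deconvolve a defocused image with the in-focus OTF H0 (same shape,
    DC-centered with fftshift), following Eq. (7)."""
    F = np.fft.fftshift(np.fft.fft2(defocused_image))
    restored = np.fft.ifft2(np.fft.ifftshift(F / (H0 + eps)))
    return np.real(restored)
```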

Fig. 12 Retrieved images at (a) z = −750 μm, (b) z = 30 μm, and (c) z = 700 μm.

Fig. 13 Phase distributions at (a) z = −750 μm, (b) z = 30 μm, and (c) z = 700 μm.

The measured 3D surface profiles at various depth positions are illustrated in Fig. 14. The experimental results demonstrate that the measurement accuracy was very high: even though the signal-to-noise ratio was relatively low in the in-focus area, an accuracy better than 5 μm was achievable.

Fig. 14 Retrieved 3D profiles at (a) z = −750 μm, (b) z = 30 μm, and (c) z = 700 μm.

An object with a size of 2000 μm × 2000 μm was selected as another inspected sample. This object was diffusive enough that errors caused by phase extraction were minimized. The appearance of the fringes projected onto the tilted sample, focused on the central region, is shown in Fig. 15(a). In this setup, θo = 0° and θn = 50°. The fringe contrast decreases quickly with the defocus introduced by the tilt. The restored image processed with Eq. (7) is shown in Fig. 15(b), and Fig. 16 shows the retrieved 3D surface profile. This object was also inspected by a standard scanning stylus profiler, and the two measurements agreed very well, with micron-range accuracy.

Fig. 15 (a) Appearance of the projected fringes on the tilted object. (b) Retrieved image.

Fig. 16 Retrieved 3D profile of the tilted object.

4.2 Some comments on this technique

Compared with 2D imaging problems, the depth measuring range has been greatly enlarged. An example is shown in Fig. 17. The microscope used in our 3D profile measurements was employed to observe a 2D pattern, and its DOF was extended with the LC-SLM; the coefficient α of the phase mask was the same as that used in our 3D profile measurements. The in-focus image is shown in Fig. 17(a). When the pattern was shifted 300 μm behind the object plane, the detail of the retrieved image was damaged, as shown in Fig. 17(b). The fine details were damaged mainly by the low pixel resolution of the LC-SLM and the low signal-to-noise ratio of the point-spread function at high frequencies. The nonlinear signal response and inconstant intensity output of the LC-SLM also introduced errors into the retrieved image.

Fig. 17 (a) Appearance of an in-focus image through a microscope. (b) Retrieved image with a proper mask strength when the object was shifted 300 μm behind the object plane.

In our 3D shape measurements, even though the depth measuring range was extended up to 1600 μm, the systematic accuracy could still be very high, in the micron range, for the following reasons:

  • 1. Figure 3 illustrates that the mask strength reduces the point-spread function at high frequencies but not much at low frequencies. The proposed approach is therefore especially suitable for 3D shape sensing problems. In our 3D shape measurements, accurately obtaining the phase information is more important than restoring the object’s fine details. In general, the frequency of the projected fringes should be relatively low, since aliasing might occur if it is too high [18]. Therefore, even though the mask strength reduces the point-spread function at high frequencies, the fringe frequency is not reduced much, allowing the depth measuring range to be extended further (to 1600 μm).
  • 2. The cubic phase mask increases the system tolerance to focus-related aberrations, such as field curvature, spherical aberration, and chromatic aberration, but it is difficult to reduce perspective distortion and geometric distortion. The calibration scheme presented in Section 3 characterizes the system distortions for each image pixel individually, so the perspective distortion and geometric distortion could be effectively compensated.
  • 3. The main advantage of using the LC-SLM as a phase mask is the convenience of identifying the associated mask strength. However, the LC-SLM introduces additional errors into the retrieved image, caused mainly by its low pixel resolution, nonlinear signal response, and inconstant intensity output. Fortunately, these drawbacks are not significant in 3D shape measurements. The Fourier transform method extracts the frequency of the projected fringes with a proper band-pass filter, so the noise introduced by the LC-SLM is simultaneously eliminated by this filter.

5. Conclusion

In summary, we investigated a unique projected fringe profilometry using an LC-SLM to enlarge the depth measuring range. We kept the main advantage of the LC-SLM (i.e., the convenience of identifying the associated mask strength) and eliminated its limitations (i.e., the low pixel resolution, nonlinear signal response, and inconstant intensity output) to perform high-accuracy 3D shape measurements. We also showed that the proposed approach is especially suited to 3D shape sensing problems rather than 2D imaging problems. For example, the LC-SLM extended the DOF of the microscope from 80 μm to 600 μm for 2D imaging, but the details of the retrieved image were damaged. In our 3D profile measurements, the depth measuring range could be extended up to 1600 μm, with accuracy in the micron range. In addition, perspective distortion and geometric distortion could be effectively compensated by the presented calibration scheme. Thus, a non-scanning, full-field 3D shape sensing system with a very large depth measuring range is realized.

References and links

1. E. R. Dowski, Jr. and W. T. Cathey, “Extended depth of field through wave-front coding,” Appl. Opt. 34(11), 1859–1866 (1995).

2. S. Bradburn, W. T. Cathey, and E. R. Dowski, “Realizations of focus invariance in optical-digital systems with wave-front coding,” Appl. Opt. 36(35), 9157–9166 (1997).

3. S. C. Tucker, W. T. Cathey, and E. R. Dowski, Jr., “Extended depth of field and aberration control for inexpensive digital microscope systems,” Opt. Express 4(11), 467–474 (1999).

4. K. Kubala, E. Dowski, and W. T. Cathey, “Reducing complexity in computational imaging systems,” Opt. Express 11(18), 2102–2108 (2003).

5. M. Mino and Y. Okano, “Improvement in the OTF of a defocused optical system through the use of shaded apertures,” Appl. Opt. 10(10), 2219–2225 (1971).

6. G. Häusler, “A method to increase the depth of focus by two step image processing,” Opt. Commun. 6(1), 38–42 (1972).

7. J. Ojeda-Castaneda and L. R. Berriel-Valdos, “Zone plate for arbitrarily high focal depth,” Appl. Opt. 29(7), 994–997 (1990).

8. D. Hong, K. Park, H. Cho, and M. Kim, “Flexible depth-of-field imaging system using a spatial light modulator,” Appl. Opt. 46(36), 8591–8599 (2007).

9. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. 22(24), 3977–3982 (1983).

10. K. Körner, R. Windecker, M. Fleischer, and H. J. Tiziani, “One-grating projection for absolute three-dimensional profiling,” Opt. Eng. 40(8), 1653–1660 (2001).

11. R. Windecker, M. Fleischer, K. Körner, and H. J. Tiziani, “Testing micro devices with fringe projection and white-light interferometry,” Opt. Lasers Eng. 36(2), 141–154 (2001).

12. W. H. Su, K. Shi, Z. Liu, B. Wang, K. Reichard, and S. Yin, “A large-depth-of-field projected fringe profilometry using supercontinuum light illumination,” Opt. Express 13(3), 1025–1032 (2005).

13. H. Liu, W. H. Su, K. Reichard, and S. Yin, “Calibration-based phase-shifting projected fringe profilometry for accurate absolute 3D surface profile measurement,” Opt. Commun. 216(1-3), 65–80 (2003).

14. W. H. Su, C. Y. Kuo, C. C. Wang, and C. F. Tu, “Projected fringe profilometry with multiple measurements to form an entire shape,” Opt. Express 16(6), 4069–4077 (2008).

15. J. W. Goodman, Introduction to Fourier Optics (Roberts & Company, Englewood, Colorado, 2005), Chap. 5.

16. B. E. A. Saleh and K. Lu, “Theory and design of the liquid crystal TV as an optical spatial phase modulator,” Opt. Eng. 29(3), 240–245 (1990).

17. E. Zappa and G. Busca, “Comparison of eight unwrapping algorithms applied to Fourier-transform profilometry,” Opt. Lasers Eng. 46(2), 106–116 (2008).

18. L. B. Jackson, Digital Filters and Signal Processing (Toppan, 1996), Chap. 6.
