
Projected fringe profilometry with multiple measurements to form an entire shape


Abstract

A 3D sensing method to retrieve an entire shape from many segmented profiles is described. Image registration is not required in this method. Advantages of this method include (1) very high integration accuracy, (2) improved robustness, (3) reduced computational time, (4) very low computation cost for the data fusion, and (5) the capability of compensating distortions of the optical system at every pixel location.

©2008 Optical Society of America

1. Introduction

3D sensing measurements using fringe projection schemes [1–5] are well known because of their non-scanning and full-field nature. A fringe pattern is projected onto the inspected object, and the phase distribution is evaluated from another viewpoint. The distorted fringes contain depth information and can therefore be used to retrieve the inspected shape. However, when the object of interest is too complicated, shading and obstruction appear, and data points in those areas are lost. It is difficult to inspect the entire surface from one viewpoint. Segmented measurements from different partial views are generally required to form an entire shape.

The entire inspected shape is divided into different regions, which overlap with each other. Each segmented region is then inspected by a 3D sensor system from a partial view. The overlapping regions provide useful information for establishing the correspondence between two segmented coordinate systems. Once the correspondence in the overlapping region is known, the results obtained at different perspective angles can be transformed to a common coordinate system for the correct fusion of the partial data sets.

Searching for the matched points between the associated data sets in the overlapping regions is called image registration. It plays a key role in identifying the correspondence between two coordinate systems. There is extensive research on registration schemes; most can be categorized as area-based algorithms [6] or feature-based algorithms [7]. Another well-known scheme, the so-called active stereo vision [8, 9], utilizes structured light projected onto the inspected surface to perform the image registration. Searching for two matched surface points is then simplified to the task of identifying the matched features created by the projection of structured light.

Even though all the above registration schemes perform well and can be employed to integrate the data sets, limitations remain. All of these registration methods require subpixel accuracy to ensure the integration accuracy, which makes the computation cost relatively high. Large errors also arise when the overlapping region is not large enough. In addition, an inaccurate transformation from the segmented coordinate systems to a reference system degrades the integration performance. The overall measurement becomes time consuming and impractical for real-time operation.

In this paper, an integration method to reconstruct the 3D shape of a complicated object is presented. Image registration is not required in our setup, so errors from registration are eliminated. Since there is no need to identify the correspondence between two segmented systems, the computation cost is reduced as well. Each segmented 3D shape sensor is calibrated with the same calibration tools; thus, the systematic accuracy is determined only by the accuracy of each segmented measurement. In addition, with this calibration scheme, it is possible to compensate perspective errors and lens distortions at every pixel location.

2. Principle

2.1 Photogrammetry of a projected fringe system

Figure 1 shows the optical setup of the 3D shape sensor, which is used for the segmented 3D measurements. A one-dimensional fringe pattern lying in the x_g–y_g plane, with fringes normal to the x_g axis, is projected onto the inspected object. The phase distribution can be mathematically expressed as

$$\begin{bmatrix} x_g \\ y_g \end{bmatrix} = \begin{bmatrix} \dfrac{\varphi \cdot d}{2\pi} \\ 0 \end{bmatrix}, \tag{1}$$

where d is the period of the fringes.

The shape of the tested object is described in the world coordinate system (x, y, z). The relationship between the fringe coordinates and the world coordinates is given by

$$\begin{bmatrix} x+\Delta x \\ y+\Delta y \\ z+\Delta z \end{bmatrix} = \begin{bmatrix} r_{11}^{(p)} & r_{12}^{(p)} \\ r_{21}^{(p)} & r_{22}^{(p)} \\ r_{31}^{(p)} & r_{32}^{(p)} \end{bmatrix} \begin{bmatrix} \dfrac{\varphi \cdot d}{2\pi} \\ 0 \end{bmatrix} + \begin{bmatrix} t_1^{(p)} \\ t_2^{(p)} \\ t_3^{(p)} \end{bmatrix}, \tag{2}$$

where the r_ij are coefficients accounting for magnification and coordinate rotation, and the t_i are translation parameters. The superscript (p) denotes a projector parameter. Distortion from the projection lens is accounted for by Δx, Δy, and Δz.

Fig. 1. Schematic setup of projected fringe profilometry.

The deformed fringes on the object are then imaged onto the detection plane, which is located in the coordinate system (x_d, y_d). A similar relationship between the detector coordinates and the world coordinates is

$$\begin{bmatrix} x_d+\Delta x_d \\ y_d+\Delta y_d \end{bmatrix} = \begin{bmatrix} r_{11}^{(c)} & r_{12}^{(c)} & r_{13}^{(c)} \\ r_{21}^{(c)} & r_{22}^{(c)} & r_{23}^{(c)} \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} + \begin{bmatrix} t_1^{(c)} \\ t_2^{(c)} \end{bmatrix}. \tag{3}$$

The superscript (c) denotes a camera parameter. For a fixed detection location, the terms on the left side of Eq. (3) are two constants. Equation (3) therefore simplifies to the following expression:

$$\begin{cases} x = a_0 + a_1 z \\ y = b_0 + b_1 z \end{cases}, \tag{4}$$

where a_i and b_i are constants that can be determined with a proper calibration scheme. Substituting Eq. (4) into Eq. (2), the variable z can be represented as a function of the phase. Even when distortions of the projection lens are taken into account, z can still be approximately expressed as

$$z = \sum_{n=0}^{N} c_n \varphi^n. \tag{5}$$

The parameters c_n can be determined with a proper calibration scheme. A calibration procedure is described in Section 2.2.

2.2 Parameter identification

As shown in Fig. 2(a), a standard diffusive flat plate is used to identify the parameters c_n of a 3D sensing system. A sinusoidal pattern is projected onto the flat surface, which is perpendicular to the z direction. Fringes on the flat surface are then captured by a CCD detector. The phase measurement is repeated as the flat is successively translated to different z positions. With the Fourier transform method [1] or the phase-shifting technique [2], wrapped phases can be obtained. Unwrapping is then performed to recover the true phase from the modulo-2π data. Thus, a set of absolute phases and the associated z positions is obtained at each pixel location. A subsequent curve-fitting process determines the φ-to-z relation at each pixel, and the parameters c_n are obtained from the fit.
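As an illustration, the per-pixel curve fit could be implemented as in the following sketch. The array names, the data layout, and the use of numpy.polyfit are our own assumptions; the paper does not prescribe a particular implementation.

```python
import numpy as np

def fit_phase_to_z(phase_stack, z_positions, degree=4):
    """Fit z = sum_n c_n * phi**n independently at every pixel.

    phase_stack : (K, H, W) unwrapped absolute phases, one map per flat position
    z_positions : (K,)      known z positions of the calibration flat
    Returns a (degree + 1, H, W) array of coefficients, highest order first
    (the ordering used by numpy.polyfit).
    """
    K, H, W = phase_stack.shape
    coeffs = np.empty((degree + 1, H, W))
    for i in range(H):
        for j in range(W):
            # One least-squares polynomial fit per pixel: phase -> z
            coeffs[:, i, j] = np.polyfit(phase_stack[:, i, j], z_positions, degree)
    return coeffs
```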

To identify the parameters a_i and b_i, a 2D fringe pattern is used as a calibration tool. The reflectivity of this pattern is sinusoidal and is expressed as

$$R(x,y) = \frac{1}{2} + \frac{1}{4}\cos\!\left(\frac{2\pi}{d_x}x\right) + \frac{1}{4}\cos\!\left(\frac{2\pi}{d_y}y\right), \tag{6}$$

where d_x and d_y are the known periods in the x and y directions, respectively.

This pattern is placed at known planes z = z_i in front of the CCD camera, as shown in Fig. 2(b). Images of the fringe pattern at the various z positions are sequentially captured by the CCD camera. Thus, a series of 2D absolute phases and the corresponding z values are obtained at each pixel location. Since the periods of the 2D fringes are known, the absolute phases can be converted to the associated transverse positions. The relationship between x and z (or y and z) can therefore be determined, and with simple algebraic manipulation the parameters a_i and b_i can be evaluated.
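A minimal sketch of this step is given below, assuming the unwrapped phases of the 2D pattern are already available as arrays. The conversion from absolute phase to transverse position via φ·d/(2π) follows Eq. (1), and the closed-form per-pixel linear least-squares fit is one possible way to evaluate a_i and b_i.

```python
import numpy as np

def fit_z_to_xy(phase_x_stack, phase_y_stack, z_positions, dx, dy):
    """Fit x = a0 + a1*z and y = b0 + b1*z independently at every pixel.

    phase_x_stack, phase_y_stack : (K, H, W) unwrapped absolute phases of the
        2D calibration pattern in the x and y directions
    z_positions : (K,) known plate positions z_i
    dx, dy      : fringe periods of the 2D pattern
    """
    # Convert absolute phases to transverse positions, as in Eq. (1)
    x = phase_x_stack * dx / (2 * np.pi)                        # (K, H, W)
    y = phase_y_stack * dy / (2 * np.pi)
    z = np.asarray(z_positions, dtype=float)[:, None, None]     # broadcast to (K, 1, 1)

    # Closed-form linear least squares per pixel
    zm = z.mean(axis=0)
    var_z = ((z - zm) ** 2).sum(axis=0)
    a1 = ((z - zm) * (x - x.mean(axis=0))).sum(axis=0) / var_z
    a0 = x.mean(axis=0) - a1 * zm
    b1 = ((z - zm) * (y - y.mean(axis=0))).sum(axis=0) / var_z
    b0 = y.mean(axis=0) - b1 * zm
    return (a0, a1), (b0, b1)
```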

Fig. 2. Configuration to identify parameters of (a) the φ-to-z relationship, and (b) the z-to-x relationship or the z-to-y relationship.

Note that the detection plane does not need to be perpendicular to the optical axis of the imaging lens. This makes the scheme more flexible for retrieving shapes from arbitrary viewpoints.

2.3 Calibration of multiple measurements

Figure 3 illustrates a setup for the entire-shape measurement. Each sensing system performs a segmented measurement from its own partial view. As described in Section 2.1, the retrieved shape from a partial view can be expressed as

Fig. 3. Segmented measurements performed by projected fringe profilometries.

$$\begin{cases} z = \displaystyle\sum_{n=0}^{N} c_n^{(l)} \left(\varphi^{(l)}\right)^n \\ x = a_0^{(l)} + a_1^{(l)} z \\ y = b_0^{(l)} + b_1^{(l)} z \end{cases}, \tag{7}$$

where the superscript (l) denotes a parameter of the l-th sensing system.
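The following sketch shows how Eq. (7) could be applied to turn one unwrapped phase map into a point cloud, assuming coefficient maps in the layout produced by the calibration sketches above. The masking of invalid pixels is an illustrative detail, not something specified by the paper.

```python
import numpy as np

def reconstruct_point_cloud(phase_map, c_coeffs, a0, a1, b0, b1, valid_mask=None):
    """Apply Eq. (7): phase -> z, then z -> x and y, at every pixel.

    phase_map : (H, W) unwrapped absolute phase from one sensing system
    c_coeffs  : (N + 1, H, W) polynomial coefficient maps, highest order first
    a0, a1, b0, b1 : (H, W) coefficient maps of the z-to-x and z-to-y relations
    valid_mask : optional (H, W) boolean mask of usable pixels
                 (e.g. excluding shaded or badly unwrapped regions)
    Returns an (M, 3) array of [x, y, z] points.
    """
    # Evaluate z = sum_n c_n * phi**n by Horner's scheme
    z = np.zeros_like(phase_map)
    for c in c_coeffs:
        z = z * phase_map + c

    x = a0 + a1 * z
    y = b0 + b1 * z

    if valid_mask is None:
        valid_mask = np.ones(phase_map.shape, dtype=bool)
    return np.stack([x[valid_mask], y[valid_mask], z[valid_mask]], axis=1)
```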

Fig. 4. Configuration to identify parameters of the φ-to-z relationship.

Fig. 5. Configuration to identify parameters of the z-to-x relationship and the z-to-y relationship.

The calibration procedures described in Section 2.2 are employed to identify the parameters of each sensing system. Setups to determine the φ-to-z conversion and the z-to-x (or z-to-y) conversion are shown in Fig. 4 and Fig. 5, respectively. The calibration tool (either the flat plate or the 2D pattern) is placed at the known plane z = z_i. The distribution of fringes on the calibration tool is recorded by all the CCD cameras at the same time. The phase measurement is repeated at different z positions and stops when enough measurements have been obtained to execute the curve-fitting algorithm. Thus, for each sensing system, a series of absolute phases and the associated depths is obtained at each pixel location. The relationships between φ^(l) and z, z and x, and z and y in each sensing system are determined, and the parameters a_i^(l), b_i^(l), and c_n^(l) are then identified by the curve-fitting algorithm.

All the sensing systems are calibrated with the same calibration tools, which are placed in the world coordinates (x, y, z). Thus, all the segmented profiles are retrieved in the same coordinate system, and integration becomes the simple task of displaying all the point clouds in that coordinate system. Once all the sensing systems are calibrated, all the optical elements must remain rigid to ensure the repeatability of the data fusion.
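Because no registration or coordinate transformation is needed, the data fusion reduces to taking the union of the point sets, as the following sketch illustrates (the variable names are hypothetical):

```python
import numpy as np

def fuse_point_clouds(clouds):
    """Merge the (M_l, 3) point clouds retrieved by the individual sensors.

    Every sensor was calibrated against the same tools in the same world
    coordinates, so no registration or coordinate transformation is applied;
    fusion is simply the concatenation of the point sets.
    """
    return np.concatenate(clouds, axis=0)

# Hypothetical usage with two partial views:
# full_shape = fuse_point_clouds([cloud_left, cloud_right])
```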

3. Experiments

In our setup, a bowl approximately 120 mm in diameter and 40 mm in depth was chosen as the inspected sample. Two 3D sensing systems located at different viewpoints were used to perform the segmented measurements. In each system, the projected fringes were observed by a CCD camera with 1024×1024 pixels and 12-bit resolution. The recorded images from the two viewpoints are shown in Figs. 6(a) and 6(b), respectively. Data on the edge of the bowl were lost because of shading.

Phases were evaluated using the phase-shifting technique. Figures 7(a) and 7(b) show the computed phases, which lie within the interval between −π and π. Phase unwrapping was then performed to eliminate the discontinuities. In our experiment, Goldstein's algorithm [10] was used to restore the absolute phases. Errors occur during the unwrapping procedure because of phase noise, surface anomalies, and insufficient sampling density. These errors may propagate over a large area and spoil many pixels. Shown in Figs. 8(a) and 8(b) are the unwrapped results. Unwrapping errors appear on the beveled edge around the shaded area.
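For reference, a standard four-step computation of the wrapped phase is sketched below. The paper does not state the number of phase steps used, so the four-step algorithm with π/2 shifts is an assumption.

```python
import numpy as np

def wrapped_phase_four_step(I0, I1, I2, I3):
    """Wrapped phase from four fringe images with pi/2 phase shifts.

    I_k is the intensity image recorded with the projected fringes shifted by
    k * pi/2.  The result lies in (-pi, pi] and still has to be unwrapped.
    """
    return np.arctan2(I3 - I1, I0 - I2)
```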

Fig. 6. Fringes on the inspected object observed by a sensing system at (a) the left point of view, and (b) the right point of view.

Fig. 7. Phase-extraction using the phase-shifting technique for the sensing system at (a) the left point of view, and (b) the right point of view.

Fig. 8. Appearance of unwrapped phases obtained by the sensing system at (a) the left point of view, and (b) the right point of view.

With Eq. (7), the absolute phase at each pixel can be used to determine the z position, and the z value is then used to compute the x and y locations. The relationship between φ^(l) and z becomes more accurate as the order of the polynomial, N, is increased. Our experiments showed that the coefficients could reach an accuracy on the order of microns with N = 4. Thus, at least 5 measurements at different z positions were required to identify the parameters. In our setup, 20 measurements were employed to perform the curve-fitting algorithm. The calibration range along the z axis was 50 mm, which ensured that all the data points were obtained within the calibrated volume. With the phase-shifting technique, the depth accuracy of each segmented system was approximately 5 µm. However, the transverse accuracy was only 127 µm, which was mainly determined by the sampling density of the CCD camera.

The retrieved profiles from the two partial views, in rectangular coordinates, are shown in Figs. 9(a) and 9(b). Note that Figs. 9(a) and 9(b) are expressed in the same coordinate system with the same dimensions. Thus, there was no need to identify any transformation between the segmented systems, and the systematic accuracy was determined only by the accuracy of each segmented measurement. Figure 9(c) shows the merged data sets, in which the erroneous points had been discarded.

Fig. 9. (a) Retrieved profile from the left partial view. (b) Retrieved profile from the right partial view. (c) Integrated profile from (a) and (b).

4. Conclusion

We have presented a method to integrate segmented measurements performed by calibration-based projected fringe profilometry. Unlike conventional algorithms, image registration is not required in this setup. All the segmented sensing systems are calibrated with the same calibration tools, so all the segmented profiles are retrieved in the same coordinate system. Integration becomes the simple task of displaying all the point clouds in this coordinate system. Compared with other integration schemes, this method offers several major advantages, including (1) very low computation cost for the data fusion, (2) reduced computational time, and (3) very high integration accuracy.

References and links

1. M. Takeda and K. Mutoh, "Fourier transform profilometry for the automatic measurement of 3-D object shapes," Appl. Opt. 22, 3977–3982 (1983).

2. V. Srinivasan, H. C. Liu, and M. Halioua, "Automated phase-measuring profilometry of 3-D diffuse objects," Appl. Opt. 23, 3105–3108 (1984).

3. D. R. Burton and M. J. Lalor, "Multichannel Fourier fringe analysis as an aid to automatic phase unwrapping," Appl. Opt. 33, 2939–2948 (1994).

4. W. H. Su and H. Liu, "Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities," Opt. Express 14, 9178–9187 (2006).

5. W. H. Su, "Color-encoded fringe projection for 3D shape measurements," Opt. Express 15, 13167–13181 (2007).

6. A. W. Gruen, "Geometrically constrained multiphoto matching," Photogramm. Eng. Remote Sens. 54, 633–641 (1988).

7. L. G. Brown, "A survey of image registration techniques," ACM Comput. Surv. 24, 325–376 (1992).

8. C. Reich, R. Ritter, and J. Thesing, "3-D shape measurement of complex objects by combining photogrammetry and fringe projection," Opt. Eng. 39, 224–231 (2000).

9. A. Dipanda, S. Woo, F. Marzani, and J. M. Bilbault, "3-D shape reconstruction in an active stereo vision system using genetic algorithms," Pattern Recogn. 36, 2143–2159 (2003).

10. E. Zappa and G. Busca, "Comparison of eight unwrapping algorithms applied to Fourier-transform profilometry," Opt. Lasers Eng. 46, 106–116 (2008).
