
High-speed and dense three-dimensional surface acquisition using defocused binary patterns for spatially isolated objects

Open Access

Abstract

Three-dimensional (3-D) shape measurement using a defocused Ronchi grating is advantageous for the high contrast of its fringes. This paper presents a method for measuring spatially isolated objects using defocused binary patterns. Two Ronchi gratings with a horizontal position difference of one-third of a period and an encoded pattern are adopted. The phase distribution of the fringe pattern is obtained by a Fourier analysis method. The measurement depth and range are enlarged because the third harmonic component and the background illumination are eliminated with the proposed method. The fringe order is identified by the encoded pattern. Three gray levels are used, and the pattern is converted to a binary image with an error diffusion algorithm. The tolerance of the encoded pattern is large, so it is well suited to a defocused optical system. We also present a measurement system with a modified DLP projector and a high-speed camera. 3-D surface acquisition speeds of 60 frames per second (fps) with a resolution of 640 × 480 points and of 120 fps with a resolution of 320 × 240 points are achieved. If the control logic of the DMD were modified and a camera with higher speed were employed, the measurement speed would reach thousands of fps. This makes it possible to analyze dynamic objects.

©2010 Optical Society of America

1. Introduction

Optical 3-D shape measurement is becoming increasingly important, with applications in industrial inspection, quality control, machine vision, entertainment, biomedicine, etc. 3-D profilometry based on sinusoidal fringe pattern projection is one of the important methods of acquiring the 3-D surface of an object [1–3]. It is advantageous because of its non-scanning nature and full-field performance. However, it is difficult to retrieve absolute phases for spatially isolated surfaces. In phase unwrapping [4,5], the fringe order becomes ambiguous, and the depth difference between spatially isolated surfaces is then indiscernible.

Several projection algorithms have been proposed to unwrap the phases [6–10], but they may slow the measurement because too many patterns are employed. An alternative approach for spatially isolated objects is to use a structured encoded pattern [11–13] instead of a fringe pattern. The structured pattern can be either a set of square patches or a set of parallel grids. These patterns are spatially encoded with gray scales or colors, so each patch or grid is distinguishable by the encoding scheme. However, the system accuracy is limited by the size of the patches or the width of the grids. To enhance the accuracy, we propose methods that combine encoded patterns with sinusoidal fringe patterns.

There are primarily two approaches to high-speed 3-D shape measurement using sinusoidal fringes. In the first, a single pattern is used, and the phase of the deformed fringes is obtained by Fourier, wavelet or other analysis methods [14–18]. The resolution is relatively low because the information content of the measurement system is limited. The other approach is to use multiple patterns. When they are switched rapidly, the patterns required to reconstruct one 3-D shape can be captured in a short period of time. Huang et al. proposed a method [19] in which three phase-shifted sinusoidal fringe patterns are projected. Zhang et al. improved this method [20] by adding a tiny marker to the projected fringe pattern; absolute 3-D coordinates are obtained from the position of the marker and the calibrated system parameters. Zhang et al. proposed a superfast phase-shifting method for 3-D measurement using a DMD projector and a defocused Ronchi grating [21]. The projector can switch binary images at up to 32,550 frames per second with a resolution of 1,024 × 768. It remains a challenge to measure spatially isolated objects. We have previously proposed a method to realize high-speed, high-resolution 3-D measurement of spatially isolated objects [22], in which two sinusoidal fringes with a phase difference of π and an encoded pattern are employed. Patterns with 256 gray levels are used, which is not ideal for a defocused system and a fast projector based on a DMD. In this research, we improve our previous method to make it suitable for a defocused fast projector. Two Ronchi grating patterns with a horizontal position difference of one-third of a period and an encoded pattern with three gray levels are adopted. The encoded pattern is converted to a binary image with an error diffusion algorithm. The phase of the fringe pattern is obtained with an improved Fourier transform profilometry (FTP). We have also constructed a measurement system with a high-speed camera and a specially designed DLP projector, which can project images at up to 360 fps and can provide a synchronization signal for the camera. This paper describes the principle of this approach and presents some preliminary experimental results.

2. Principles

2.1 Improved FTP algorithm

The intensity of a Ronchi grating can be written as

g(x) = \mathrm{rect}\!\left(\frac{x}{P/2}\right) \ast \mathrm{comb}\!\left(\frac{x}{P}\right),   (1)
where P is the period of the grating and the asterisk denotes convolution. The Fourier series expansion of Eq. (1) is
g(x) = \frac{1}{2} + \frac{2}{\pi}\sum_{n=1}^{\infty}\frac{(-1)^{n-1}}{2n-1}\cos\left[2\pi(2n-1)f_0 x\right],   (2)
where f_0 = 1/P is the fundamental frequency of the Ronchi grating. The defocused optical system acts as a low-pass filter [23,24], so the higher harmonic components can be filtered out. By comparison, a sine-wave grating of the same mean intensity would have g(x) = 1/2 + (1/2)cos(2πf_0 x), so the fundamental component of the Ronchi grating is larger by a factor of (2/π)/(1/2) = 4/π. This means that the fringe contrast is higher than that of a focused sinusoidal fringe pattern, so it is possible to achieve higher measurement accuracy and higher light usage efficiency. This is important for high-speed 3-D measurement.
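This contrast argument can be checked with a short numerical sketch. It is our own illustration rather than part of the paper's procedure: defocus is approximated by a Gaussian blur, and the period, blur width and array sizes are arbitrary assumptions.

```python
# Sketch: fundamental amplitude of a mildly defocused Ronchi grating vs. a
# focused sinusoid. Defocus is modeled as a Gaussian blur (an assumption);
# the square wave's fundamental is 2/pi vs. 1/2 for the sinusoid (Eq. (2)).
import numpy as np
from scipy.ndimage import gaussian_filter1d

P = 32                                             # grating period in pixels (assumed)
x = np.arange(1024)
ronchi = (np.mod(x, P) < P / 2).astype(float)      # binary Ronchi grating, Eq. (1)
sine = 0.5 + 0.5 * np.cos(2 * np.pi * x / P)       # focused sinusoidal pattern

blurred = gaussian_filter1d(ronchi, sigma=P / 10)  # mild defocus (low-pass filter)

def fundamental_amplitude(signal, period):
    """Amplitude of the f0 = 1/period component, read from the DFT."""
    spectrum = np.fft.rfft(signal - signal.mean())
    return 2 * np.abs(spectrum[len(signal) // period]) / len(signal)

print(fundamental_amplitude(blurred, P))   # ~0.52, still above the sinusoid's 0.5
print(fundamental_amplitude(sine, P))      # 0.5
# Mild defocus keeps the fundamental strong but leaves a residual third
# harmonic, which is exactly what the subtraction scheme of Section 2.1 removes.
```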

The existence of higher harmonic components will cause phase measurement errors. One remedy is to defocus the projector enough to filter out all of the higher harmonic components, but the contrast of the fundamental frequency component decreases as the degree of defocus increases. In this research, we propose a method to eliminate the third harmonic component (the n = 2 term in Eq. (2), at frequency 3f_0). The measurement depth and the fringe contrast are increased compared with the conventional defocused method. When the fringe pattern containing the third harmonic component is projected onto objects, the intensity of the captured image is written as

I_0(x,y) = I_a(x,y) + a(x,y)\cos[2\pi f_0 x + \phi_1(x,y)] + b(x,y)\cos[2\pi \cdot 3 f_0 x + \phi_2(x,y)],   (3)
where I_a(x,y) is the background illumination, a(x,y) and b(x,y) are related to the amplitudes of the first and third harmonic components, respectively, and \phi_1(x,y) and \phi_2(x,y) are the phase modulations caused by the height distribution h(x,y). The π phase-shifting algorithm [25] is not suited to this condition because the effect of the third harmonic component cannot be eliminated. When the Ronchi grating is shifted to the left or right by one-third of a period, the intensity of the defocused fringe pattern is written as
I_1(x,y) = I_a(x,y) + a(x,y)\cos[2\pi f_0 x + \phi_1(x,y) + 2\pi/3] + b(x,y)\cos[2\pi \cdot 3 f_0 x + \phi_2(x,y) + 2\pi]
         = I_a(x,y) + a(x,y)\cos[2\pi f_0 x + \phi_1(x,y) + 2\pi/3] + b(x,y)\cos[2\pi \cdot 3 f_0 x + \phi_2(x,y)].   (4)
Subtracting Eq. (4) from Eq. (3), we get
I(x,y) = a(x,y)\{\cos[2\pi f_0 x + \phi_1(x,y)] - \cos[2\pi f_0 x + \phi_1(x,y) + 2\pi/3]\}
       = \sqrt{3}\,a(x,y)\{\cos(\pi/6)\cos[2\pi f_0 x + \phi_1(x,y)] + \sin(\pi/6)\sin[2\pi f_0 x + \phi_1(x,y)]\}
       = \sqrt{3}\,a(x,y)\cos[2\pi f_0 x + \phi_1(x,y) - \pi/6].   (5)
It is found from Eq. (5) that the third harmonic component and the background illumination are eliminated. The phase distribution is then obtained with an algorithm similar to π phase-shifting FTP. Applying a Fourier transform, band-pass filtering around the fundamental frequency, and an inverse Fourier transform sequentially to Eq. (5), we get
\hat{I}_0(x,y) = \frac{\sqrt{3}}{2} a(x,y)\exp\{i[2\pi f_0 x + \phi_1(x,y) - \pi/6]\}.   (6)
The wrapped phase of the fringe pattern is obtained with
\phi_w(x,y) = \arg[\hat{I}_0(x,y)\exp(i\pi/6)],   (7)
where arg[•] denotes taking the argument of a complex number.
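A compact sketch of this wrapped-phase computation is given below. It is our own illustration under stated assumptions: the fringes run along image rows, the band-pass window half-width is an arbitrary choice, and the array names are hypothetical.

```python
# Sketch of the improved FTP of Eqs. (5)-(7): subtract the two captured fringe
# images (patterns shifted by one third of a period), isolate the +f0 lobe in
# the Fourier domain, and compensate the -pi/6 offset when taking the argument.
import numpy as np

def ftp_analysis(I0, I1, f0, half_width=0.5):
    """Return the complex field of Eq. (6) and the wrapped phase of Eq. (7).
    I0, I1: captured fringe images (rows x cols); f0: fringes per pixel column."""
    D = I0 - I1                                    # Eq. (5): only the f0 carrier survives
    spectrum = np.fft.fft(D, axis=1)               # 1-D FTP along the fringe direction
    fx = np.fft.fftfreq(D.shape[1])                # spatial frequency of each bin
    window = (np.abs(fx - f0) < half_width * f0).astype(float)   # keep the +f0 lobe only
    analytic = np.fft.ifft(spectrum * window, axis=1)            # Eq. (6)
    phi_w = np.angle(analytic * np.exp(1j * np.pi / 6))          # Eq. (7)
    return analytic, phi_w

# Synthetic check: fringes of period 32 pixels with a weak third harmonic.
x = np.arange(256)
carrier = 2 * np.pi * x / 32
I0 = 0.5 + 0.4 * np.cos(carrier) + 0.1 * np.cos(3 * carrier)
I1 = 0.5 + 0.4 * np.cos(carrier + 2 * np.pi / 3) + 0.1 * np.cos(3 * carrier)
_, phw = ftp_analysis(np.tile(I0, (4, 1)), np.tile(I1, (4, 1)), f0=1 / 32)
```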

2.2 Codification strategy of encoded pattern

In order to obtain the absolute phase distribution of spatially isolated objects, we propose an encoded pattern to identify the order of the sinusoidal fringes. Figure 1 shows an example. The pattern consists of a number of vertical slits whose width equals the period of the sinusoidal fringe. Three gray levels (0 denotes black, 1 denotes gray and 2 denotes white) are used to form the slits. Every slit is filled with one or two gray levels; a slit with two gray levels alternates them periodically in the vertical direction, for example "020202…". So there are six kinds of slits: three with one gray level (0, 1 or 2) and three with two gray levels (0 and 1, 0 and 2, or 1 and 2). Six symbols (from 'A' to 'F') are assigned to these six kinds of slits. The slits are arranged to form the encoded pattern according to a pseudo-random sequence with the following properties: (1) any subsequence of a given length (window size) appears only once in the whole sequence, and (2) no symbol is repeated within any such subsequence. For example, any subsequence of four characters appears only once in the character sequence "ABDECFADBEFDBECDABFECBDEFBDCEABCDAECFBDE", and no character is repeated within these subsequences. We obtain the pseudo-random sequence by searching for a Hamiltonian circuit over a directed graph [26]. According to the theory of permutations, the number of permutations L generated by selecting M elements from K elements can be expressed as

L = P_K^M = \frac{K!}{(K-M)!}.   (8)
So, the length of the pseudo-random sequence is L + M − 1.
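One way to generate such a sequence is a backtracking search over the symbol alphabet. The sketch below is our own illustration, not the Hamiltonian-circuit construction of Ref. [26]; it simply enforces the two properties stated above.

```python
# Sketch: build a pseudo-random sequence in which every window of M consecutive
# symbols is unique and contains no repeated symbols. For K = 6 and M = 3 the
# target length is L + M - 1 = 122, with L = K!/(K-M)! from Eq. (8).
from math import factorial

def build_sequence(symbols="ABCDEF", M=3):
    K = len(symbols)
    target = factorial(K) // factorial(K - M) + M - 1   # L + M - 1
    used = set()                                        # windows already placed

    def extend(seq):
        if len(seq) == target:
            return seq
        for s in symbols:
            window = seq[-(M - 1):] + s if len(seq) >= M - 1 else None
            if window is not None and (len(set(window)) < M or window in used):
                continue                                # repeated symbol or reused window
            if window is not None:
                used.add(window)
            result = extend(seq + s)
            if result is not None:
                return result
            if window is not None:
                used.remove(window)                     # backtrack
        return None

    # Backtracking is fine at this size; larger alphabets call for the graph
    # construction of Ref. [26].
    return extend("")

print(build_sequence())   # one valid 122-symbol sequence (not necessarily the paper's)
```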

Fig. 1 Example of encoded pattern.

The fringe order k is identified by the position of the subsequence within the whole sequence. Figure 2 illustrates the relationship between the fringe intensity I, the wrapped phase φ_w, the fringe order k and the absolute phase φ_a. In the figure, k′ is the global position in the whole sequence of a symbol belonging to the subsequence. For example, the position of the subsequence "DECF" in the above pseudo-random sequence is 2; the global position of 'D' is 2, that of 'E' is 3 and that of 'F' is 5. It can also be seen in Fig. 2 that the phase wraps within the slits. So the fringe order of the area to the left of the point where the phase jumps from π to −π (the jump point) is equal to k′, and that of the area to the right of the jump point is equal to k′ + 1. The advantage of this encoding method is that the fringe order is not in error even if the edges of the slits drift; such edge drift is natural in a defocused optical system, especially when measuring dynamic objects. Once the fringe order k is determined, the absolute phase φ_a(x, y) of the fringe can be expressed as

\phi_a(x,y) = \phi_w(x,y) + 2k\pi,   (9)
where \phi_w(x, y) is the wrapped phase.
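As an illustration of this rule, the sketch below assigns fringe orders along one image row, assuming the slit boundaries and the decoded positions k′ are already known; the data layout is a hypothetical choice of ours.

```python
# Sketch: within each decoded slit the wrapped phase jumps once from +pi to
# -pi; pixels left of the jump get order k', pixels to the right k' + 1, and
# Eq. (9) then yields the absolute phase.
import numpy as np

def absolute_phase_row(phi_w, k_prime, slit_bounds):
    """phi_w: wrapped phase of one row; k_prime: decoded position of each slit;
    slit_bounds: (start, end) pixel columns of each detected slit."""
    phi_a = np.array(phi_w, dtype=float)
    for (start, end), kp in zip(slit_bounds, k_prime):
        segment = phi_w[start:end]
        drops = np.where(np.diff(segment) < -np.pi)[0]     # the pi -> -pi jump
        first_right = drops[0] + 1 if len(drops) else end - start
        order = np.full(end - start, kp)
        order[first_right:] = kp + 1                        # right side of the jump
        phi_a[start:end] = segment + 2 * np.pi * order      # Eq. (9)
    return phi_a
```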

Fig. 2 Relationship of fringe intensity, wrapped phase, fringe order and absolute phase.

There are three gray levels in the encoded pattern, but it must be converted to a binary image because only two gray levels are used in our method. Error diffusion is the standard approach to converting a multi-level image into a binary image: the quantization error is pushed toward the high-frequency part of the spectrum, where it appears as high-frequency noise. Most of this high-frequency noise is filtered out by the defocused optical system, so a high-quality multi-level image is obtained even though it is represented with only two gray levels. We therefore convert the encoded pattern to a binary image with this algorithm.
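A minimal sketch of this conversion is given below, using the Floyd–Steinberg error-diffusion kernel; the paper does not specify which variant it employs, so this particular kernel is our assumption.

```python
# Sketch: binarize the three-level encoded pattern (levels 0, 0.5, 1) by error
# diffusion with the Floyd-Steinberg kernel. The defocused projection then acts
# as a low-pass filter and restores an approximation of the three gray levels.
import numpy as np

def error_diffusion_binarize(pattern):
    img = np.asarray(pattern, dtype=float).copy()
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            new = 1.0 if img[y, x] >= 0.5 else 0.0
            err = img[y, x] - new                   # quantization error
            out[y, x] = new
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16       # diffuse the error to neighbors
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out
```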

2.3 Decoding algorithm

In the captured image, the sum of the ambient and projector illumination is multiplied by the object reflectivity. The image should therefore be preprocessed to eliminate this effect before the symbols are recovered. The intensity of the captured encoded pattern can be expressed as

I_3(x,y) = I_a(x,y) + I_p(x,y),   (10)
where I_p(x, y) is related to the intensity of the encoded pattern. From Eqs. (3), (6), (7) and (10), we can obtain the normalized intensity
I_c(x,y) = \frac{\sqrt{3}\,[I_3(x,y) - I_0(x,y)] + 2|\hat{I}_0(x,y)|\cos[\phi_w(x,y)]}{2|\hat{I}_0(x,y)|}.   (11)
The ambient illumination and the object reflectivity are eliminated; I_c(x, y) is related to the intensity of the encoded pattern and the third harmonic component. It is then quantized with three gray levels (0, 1 and 2). The third harmonic component does not affect the result of quantization because of the large tolerance of the encoded pattern. The fringe order is identified with the method proposed in Ref. [22].
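A sketch of this normalization and quantization step follows. It reuses the complex field and wrapped phase returned by the FTP sketch in Section 2.1; the quantization thresholds of 1/3 and 2/3 are our assumption, not values given in the paper.

```python
# Sketch: apply the normalization of Eq. (11) and quantize to the three gray
# levels 0, 1, 2 used by the encoded pattern.
import numpy as np

def decode_levels(I3, I0, analytic, phi_w):
    """I3: captured encoded pattern; I0: captured fringe image of Eq. (3);
    analytic, phi_w: outputs of the FTP step (Eqs. (6) and (7))."""
    amp = np.abs(analytic) + 1e-12                            # avoid division by zero
    Ic = (np.sqrt(3) * (I3 - I0) + 2 * amp * np.cos(phi_w)) / (2 * amp)   # Eq. (11)
    return np.digitize(Ic, bins=[1 / 3, 2 / 3])               # three-level quantization
```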

We can also obtain the 2-D texture of the objects. From Eqs. (3), (6), (7) and (10), we get

T(x,y) = \sqrt{3}\,[I_3(x,y) - I_0(x,y)] + 2|\hat{I}_0(x,y)|\cos[\phi_w(x,y)].   (12)
A band-stop filter is then applied to Eq. (12); the third harmonic component is filtered out and the texture of the objects is obtained.
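The texture recovery can be sketched in the same style; the notch width around 3f_0 is an illustrative assumption.

```python
# Sketch: texture of Eq. (12), followed by a band-stop (notch) filter at +-3*f0
# along the fringe direction to remove the residual third harmonic.
import numpy as np

def texture(I3, I0, analytic, phi_w, f0, notch_half_width=0.3):
    T = np.sqrt(3) * (I3 - I0) + 2 * np.abs(analytic) * np.cos(phi_w)   # Eq. (12)
    spectrum = np.fft.fft(T, axis=1)
    fx = np.fft.fftfreq(T.shape[1])
    keep = (np.abs(np.abs(fx) - 3 * f0) > notch_half_width * f0).astype(float)
    return np.real(np.fft.ifft(spectrum * keep, axis=1))
```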

3. Experimental results

3.1 Experimental setup

We modified a development kit of a DLP projector. It can project images at up to 360 fps with a resolution of 1024 × 768 pixels, and it provides a pulse signal each time a new image is projected. A white-light LED with a power of 10 W was selected as its light source. A high-speed camera (Model RM6740GE), synchronized with the projector, was used to capture the deformed images. Figure 3 shows a photograph of the measurement system. The pseudo-random sequence we adopted is "ABDECFACDCEFBEADFEBCAEDAFDBFCBADEFCABEDCFBA", with a window size of three symbols. The 3-D reconstruction procedure is described in detail as follows, and a code sketch of the overall pipeline is given after the list.

Fig. 3 Measurement system.

  • (1) Calculate the wrapped phase with the improved phase-shifting FTP.
  • (2) Find the curves in the wrapped phase map where the phase jump is greater than π (jump curves).
  • (3) Normalize and quantize the intensity of the corresponding curves in the captured encoded pattern.
  • (4) Recover the symbols along the jump curves and obtain the fringe orders by subsequence matching.
  • (5) Reconstruct the 3-D coordinates of the object.
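The sketch below strings steps (1)–(5) together using the illustrative helpers from Section 2; the symbol recovery and the phase-to-height calibration are only indicated as comments, since they depend on details (slit segmentation, system calibration) that this sketch does not reproduce.

```python
# Sketch of the per-frame pipeline, reusing ftp_analysis, decode_levels and
# absolute_phase_row from the earlier sketches (hypothetical helpers, not the
# authors' code).
import numpy as np

def reconstruct_frame(I0, I1, I3, f0, sequence):
    analytic, phi_w = ftp_analysis(I0, I1, f0)             # step (1)
    jump_curves = np.diff(phi_w, axis=1) < -np.pi          # step (2): pi -> -pi drops
    levels = decode_levels(I3, I0, analytic, phi_w)        # step (3)
    # Step (4): along each jump curve, map the quantized levels of the adjacent
    # slit to one of the six symbols and locate the window of three consecutive
    # symbols in `sequence` to obtain k' (and hence k) for that slit.
    # Step (5): Eq. (9) gives the absolute phase; the calibrated phase-to-height
    # relation of the system (not modeled here) converts it to 3-D coordinates.
    return phi_w, jump_curves, levels
```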

3.2 Experimental results

Two experimental results are demonstrated. First, we recorded a scene consisting of spatially isolated, dynamic objects. The 3-D shape was recorded at 120 fps with a resolution of 320 × 240 points. Figure 4 shows a selected photograph of the scene with fringes and the movie of the reconstructed 3-D images. The toy on the right side of the board vibrates left and right periodically. Depth is represented with color. This video shows that the shape and position of spatially isolated, dynamic objects can be captured with the proposed approach as long as enough fringes are projected onto them.

Fig. 4 Experimental result of spatially isolated dynamic objects. (a) One selected photograph of scene with fringe. (b) Movie of reconstructed 3-D image (Media 1).

Second, we recorded the changing facial expressions of a human subject. A woman was asked to speak and smile. The 3-D shape was recorded at 60 fps with a resolution of 640 × 480 points. Figure 5 shows six selected frames; the 3-D shape is represented with a 3-D mesh and grayscale texture. A movie is also included. This video shows that the details of the changing facial expression are recorded well with the proposed system.

Fig. 5 Measurement result of facial expressions (Media 2).

4. Conclusions

A method for measuring spatially isolated objects at high, even superfast, speed and with high resolution is proposed. Three binary patterns are employed; they are projected with a high-speed defocused projector, and high-quality grayscale images are obtained. The method is well suited to high-speed projectors that can only project binary images. The 2-D texture of the objects is captured simultaneously with the proposed method. A system consisting of a high-speed camera and a specially designed DLP projector is also presented. 3-D measurement speeds of 60 fps with a resolution of 640 × 480 points and 120 fps with a resolution of 320 × 240 points are achieved. We successfully demonstrate that the 3-D shape and 2-D texture of low-speed dynamic and/or spatially isolated objects can be recorded.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under grant No. 60702078.

References and links

1. F. Chen, G. M. Brown, and M. Song, “Overview of three-dimensional shape measurement using optical methods,” Opt. Eng. 39(1), 10–22 (2000). [CrossRef]  

2. F. Blais, “Review of 20 Years of Range Sensor Development,” J. Elect. Imag. 13(1), 231–240 (2004). [CrossRef]  

3. J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010). [CrossRef]  

4. T. R. Judge and P. J. Bryanston-Cross, “Review of phase unwrapping techniques in fringe analysis,” Opt. Lasers Eng. 21(4), 199–239 (1994). [CrossRef]  

5. X. Su and W. Chen, “Reliability-guided phase unwrapping algorithm: a review,” Opt. Lasers Eng. 42(3), 245–261 (2004). [CrossRef]  

6. H. Zhao, W. Chen, and Y. Tan, “Phase-unwrapping algorithm for the measurement of three-dimensional object shapes,” Appl. Opt. 33(20), 4497–4500 (1994). [CrossRef]   [PubMed]  

7. W. Nadeborn, P. Andrä, and W. Osten, “A robust procedure for absolute phase measurement,” Opt. Lasers Eng. 24(2-3), 245–260 (1996). [CrossRef]  

8. H. O. Saldner and J. M. Huntley, “Temporal phase unwrapping: application to surface profiling of discontinuous objects,” Appl. Opt. 36(13), 2770–2775 (1997). [CrossRef]   [PubMed]  

9. Y. Hao, Y. Zhao, and D. Li, “Multifrequency grating projection profilometry based on the nonlinear excess fraction method,” Appl. Opt. 38(19), 4106–4110 (1999). [CrossRef]  

10. E. B. Li, X. Peng, J. Xi, J. F. Chicharo, J. Q. Yao, and D. W. Zhang, “Multi-frequency and multiple phase-shift sinusoidal fringe projection for 3D profilometry,” Opt. Express 13(5), 1561–1569 (2005), http://www.opticsinfobase.org/oe/abstract.cfm?URI=OPEX-13-5-1561. [CrossRef]   [PubMed]  

11. P. Vuylsteke and A. Oosterlinck, “Range image acquisition with a single binary-encoded light pattern,” IEEE Trans. Pattern Anal. Mach. Intell. 12(2), 148–164 (1990). [CrossRef]  

12. W. Liu, Z. Wang, G. Mu, and Z. Fang, “Color-coded projection grating method for shape measurement with a single exposure,” Appl. Opt. 39(20), 3504–3508 (2000). [CrossRef]  

13. S. Y. Chen and Y. F. Li, “Self-recalibration of a colour-encoded light system for automated three-dimensional measurements,” Meas. Sci. Technol. 14(1), 33–40 (2003). [CrossRef]  

14. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. 22(24), 3977–3982 (1983). [CrossRef]   [PubMed]  

15. P. S. Huang, Q. Hu, F. Jin, and F. P. Chiang, “Color-Encoded Digital Fringe Projection Technique for High-Speed Three-Dimensional Surface Contouring,” Opt. Eng. 38(6), 1065–1071 (1999). [CrossRef]  

16. C. Guan, L. G. Hassebrook, and D. L. Lau, “Composite structured light pattern for three-dimensional video,” Opt. Express 11(5), 406–417 (2003), http://www.opticsinfobase.org/abstract.cfm?URI=oe-11-5-406. [CrossRef]   [PubMed]  

17. Q. C. Zhang and X. Y. Su, “High-speed optical measurement for the drumhead vibration,” Opt. Express 13(8), 3110–3116 (2005), http://www.opticsinfobase.org/abstract.cfm?URI=oe-13-8-3110. [CrossRef]   [PubMed]  

18. W. H. Su, “Projected fringe profilometry using the area encoded algorithm for spatially isolated and dynamic objects,” Opt. Express 16(4), 2590–2596 (2008), http://www.opticsinfobase.org/abstract.cfm?URI=oe-16-4-2590.

19. P. S. Huang, C. Zhang, and F. P. Chiang, “High-Speed 3D Shape Measurement Based on Digital Fringe Projection,” Opt. Eng. 42(1), 163–168 (2003). [CrossRef]  

20. S. Zhang and S.-T. Yau, “High-resolution, real-time 3D absolute coordinate measurement based on a phase-shifting method,” Opt. Express 14(7), 2644–2649 (2006), http://www.opticsinfobase.org/abstract.cfm?URI=oe-14-7-2644. [CrossRef]   [PubMed]  

21. S. Zhang, D. Van Der Weide, and J. Oliver, “Superfast phase-shifting method for 3-D shape measurement,” Opt. Express 18(9), 9684–9689 (2010), http://www.opticsinfobase.org/abstract.cfm?uri=oe-18-9-9684. [CrossRef]   [PubMed]  

22. Y. Li, K. Y. Jin, H. Z. Jin, and H. Wang, “High-resolution, High-speed 3D Measurement Based on Absolute Phase Measurement,” in Proceeding of International Conference on Advanced Phase Measurement Methods in Optics and Imaging, 2010, pp.389–394.

23. X. Y. Su, W. S. Zhou, G. Vonbally, and D. Vukicevic, “Automated phase-measuring profilometry using defocused projection of a Ronchi Grating,” Opt. Commun. 94(6), 561–573 (1992). [CrossRef]  

24. S. Lei and S. Zhang, “Digital Sinusoidal Fringe Pattern Generation: Defocusing Binary Patterns VS Focusing Sinusoidal Patterns,” Opt. Lasers Eng. 48(5), 561–569 (2010). [CrossRef]  

25. J. Li, X. Y. Su, and L. R. Guo, “Improved Fourier transform profilometry for the automatic measurement of 3D object shapes,” Opt. Eng. 29(12), 1439 (1990). [CrossRef]  

26. Y. Li, H. Z. Jin, and H. Wang, “Three-dimensional shape measurement using binary spatio-temporal encoded illumination,” J. Opt. A, Pure Appl. Opt. 11(7), 075502 (2009). [CrossRef]  

Supplementary Material (2)

Media 1: MOV (747 KB)     
Media 2: MOV (2209 KB)     
