
Accurate and fast 3D surface measurement with temporal-spatial binary encoding structured illumination

Open Access

Abstract

Balancing accuracy and speed in the 3D surface measurement of objects is crucial in many important applications. Binary encoded patterns exploiting the high-speed image switching rate of digital micromirror device (DMD)-based projectors are a promising route to fast, even high-speed, 3D measurement, but most current schemes favor measurement speed alone, which limits their application scope. In this paper, we present a binary encoding method and develop an experimental system to address this limitation. Our approach encodes one computer-generated standard 8-bit sinusoidal fringe pattern into multiple binary patterns (a sequence) with a designed temporal-spatial binary encoding tactic. The binary pattern sequence is projected in focus at high speed onto the surface of the tested object and captured by temporal-integration imaging to form one sinusoidal fringe image. The combination of the phase-shifting technique and a temporal phase unwrapping algorithm then leads to fast and accurate 3D measurement. A systematic accuracy better than 0.08 mm is achievable. Measurement results on a mask and a palm confirm the feasibility.

© 2016 Optical Society of America

1. Introduction

Three-dimensional surface measurement based on structured light illumination, especially the most popular approach of digital sinusoidal fringe projection, has found many applications [1,2]. Unfortunately, sinusoidal fringe projection faces some stubborn issues when performed with off-the-shelf digital projectors: because a typical sinusoidal fringe pattern is 8-bit, the measurement speed is limited by the maximum image switching rate (typically 120 Hz) of the projector [3]. On the other hand, most video projectors introduce nonlinearity for good visual perception, so it is difficult to generate high-quality sinusoidal fringes without careful nonlinearity correction beforehand [3]. To this end, researchers have developed nonlinearity correction techniques [4,5]. However, Li found that the nonlinearity of a projector is not constant over time [6], so these correction techniques are less than ideal and labor-intensive.

Recently, many researchers have devoted themselves to binary encoding strategies that simultaneously pursue high-speed and high-accuracy 3D measurement [6]. Binary encoding techniques, such as pulse width modulation (PWM) [7], area modulation [8], dithering [9] and their optimized variants [10–13], first circumvent the nonlinearity (since only two intensity values are used) and then achieve high-speed projection by virtue of the inherently high image switching rate of currently available digital light processing (DLP) technology. With defocused projection, approximately sinusoidal fringe patterns can be generated, making fast 3D measurement possible.

Su proposed generating nearly ideal sinusoidal fringe patterns by substantially defocusing square-wave fringes [14]. The accuracy can be further improved with well-chosen phase-shifting steps or a larger degree of defocus. However, a smaller measurement depth results from the accompanying loss of image contrast.

Modulating square binary patterns with sinusoidal PWM, as presented by Ayubi et al. [15], is a good scheme when the fringe period is confined to a certain range. The authors cleverly shift the harmonics far away from the fundamental frequency so that slight defocus (near focus) yields high-quality sinusoidal fringe patterns. An optimized variant of this technique by Wang [10] selectively eliminates the error-causing harmonics of a square binary pattern by inserting notches. However, the improvement in measurement accuracy is significant only for wide fringe stripes; for narrow stripes the influence is not evident.

To a large extent, PWM techniques offer limited improvement because of their one-dimensional modulation nature. Naturally, some researchers have turned to two-dimensional modulation schemes. Lohry and Zhang [8] proposed an area-modulation approach that locally modulates pixels to emulate a triangular pattern and thereby reduce the influence of high-frequency harmonics. Although it permits nearly focused projection, the improvement is limited for large-period fringes, similar in this regard to PWM techniques.

Halftoning (dithering) techniques inspired by the printing industry, such as error diffusion and Bayer dithering [9], have been exploited to represent grayscale sinusoidal fringe patterns. Unfortunately, when dithering is applied directly in a 3D measurement system, the improvement to the measurement result is limited for small fringe periods, leaving the result unsatisfactory. Lately, instead of directly applying a fixed matrix to binarize the original intensity, Zhang et al. developed several optimized algorithms [10–12] built on existing dithering techniques that take the characteristics of sinusoidal fringes into account to improve measurement accuracy. Although some improvement has been attained, the optimization process is time-consuming, which makes it inflexible in practice, and the improvement remains limited for small-period fringe patterns.

To extend the depth range relative to existing defocused projection techniques, Ekstrand et al. [16] built a 3D measurement system that projects the encoded patterns in a nearly focused state so that the depth range approaches that of conventional sinusoidal fringe projection. In their work, fringe periods of 36–60 pixels were selected to demonstrate the results, and they concluded that the scheme can rapidly perform high-quality measurement with a nine-step phase-shifting algorithm and a fringe period of 36 pixels. For larger or smaller fringe periods, no theoretical or experimental results were provided. As a matter of fact, the contrast of high-frequency fringes is very low because of projector defocus and the optical transfer characteristics of the system.

Also noticeable is the bit-decomposition-based binary encoding method of Ayubi et al. [17,18], in which an 8-bit grayscale sinusoidal pattern is decomposed into 8 binary fringe patterns that are then projected with the projector completely in focus. The 8 captured images are binarized and synthesized by weighted summation to form one 8-bit grayscale sinusoidal fringe image. It is a good approach for high-quality 3D measurement owing to its high projection rate and freedom from projector nonlinearity, and the synthesized image enjoys good sinusoidality.

In a word, most existing binary encoded fringe projection techniques remain rather limited compared with conventional sinusoidal fringe projection in terms of measurement capability [6]: (1) measurement depth and 3D reconstruction accuracy are at odds: the sinusoidality on which measurement accuracy depends requires some degree of defocused projection at the expense of measurement depth; (2) the measurement accuracy is inevitably degraded by residual high-frequency harmonics; (3) system calibration is more involved because most existing calibration techniques require the projector to be in focus; and (4) it is difficult to simultaneously achieve high-quality sinusoidal fringe patterns for different periods at the same defocus level.

In [19], the authors described the basic principle of the temporal-spatial binary fringe encoding (TSBE) method, intended to circumvent the issues mentioned above, and showed by simulation and experiment that its good phase accuracy holds across different fringe periods. They also demonstrated its simplicity and effectiveness, implementing the encoding without time-consuming optimization such as [12]. However, real 3D measurement performance, as well as a practical projection and imaging strategy for TSBE patterns, was not addressed.

In this research, the encoding is implemented with a carefully selected diffusion matrix to achieve more uniform error diffusion. We establish a synchronization control strategy between the cameras and the projector, a temporal-integration imaging strategy, and a binocular 3D measurement system that projects the binary encoded fringe patterns in focus and at high speed onto the surface of the tested object and yields one sinusoidal fringe image per pattern group, making the proposed method and the developed 3D measurement system practical. Our aim is a feasible solution for high-quality 3D reconstruction with a balanced measurement accuracy and speed, while maintaining a measurement volume comparable to the conventional sinusoidal fringe projection scheme. The measurement accuracy and reliability are systematically evaluated, yielding an accuracy better than ~0.08 mm within the calibrated volume.

2. Principle

The sinusoidal fringe pattern is widely applied in 3D surface measurement techniques because of its high accuracy and reliability. However, the nonlinearity of off-the-shelf projectors inevitably produces undesired non-sinusoidality, which deteriorates the measurement accuracy at its source; typically, periodic texture appears on the reconstructed surface. The nonlinear (gamma, γ) response of the projector applied to the sinusoidal fringe pattern can be described as:

$$I(u,v) = \{C(u,v) + D(u,v)\cos(2\pi u/P)\}^{\gamma} \quad (1)$$
where C(u,v), D(u,v), and P respectively represent the average intensity, the intensity modulation, and the fringe period. It is easily seen that the gamma makes the fringe pattern contain multiple harmonic frequencies, while the phase retrieval (see Sec. 2.3) still assumes a single frequency, thereby causing additional measurement error. Reducing or removing such error usually requires fussy nonlinearity correction or more phase-shifting steps, which undoubtedly increases the measurement complexity.
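To make this concrete, the following minimal sketch (Python/NumPy; the period and γ value here are illustrative, not taken from this work) synthesizes a gamma-distorted fringe per Eq. (1) and inspects its spectrum, where the higher harmonics that break the single-frequency assumption are plainly visible:

```python
import numpy as np

# Illustrative sketch: a gamma response turns a single-frequency fringe
# into one containing higher harmonics. P and gamma are example values.
P, gamma = 76, 2.2
u = np.arange(912)                                    # one projector row
I = (0.5 + 0.5 * np.cos(2 * np.pi * u / P)) ** gamma  # Eq. (1) with C = D = 0.5

spectrum = np.abs(np.fft.rfft(I - I.mean()))
k0 = len(u) // P                                      # fundamental bin (912/76 = 12)
print("2nd/1st harmonic ratio:", spectrum[2 * k0] / spectrum[k0])
```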

2.1 Encoding strategy

The TSBE method inherits the concepts of temporal binary encoding and spatial binary encoding. Temporal binary encoding refers to encoding along the time axis, while spatial binary encoding is carried out within an image, where the pixel being processed is associated only with its adjacent pixels [1]. In terms of decoding and measurement accuracy, temporal encoding generally performs better than spatial encoding [1].

In this paper, one computer-generated standard 8-bit grayscale sinusoidal fringe pattern is encoded into more than two binary fringe patterns (we call these binary fringe patterns a sequence) using the proposed temporal-spatial binary encoding method. By projecting the sequence in focus and exploiting the inherent low-pass filtering of the optical system, one experimental sinusoidal fringe image is generated by temporal-integration imaging (that is, the temporal-spatial binary fringe patterns are successively imaged onto the sensor within one exposure); the flowchart is detailed in Fig. 1(a). The intensity-normalized sinusoidal fringe pattern to be encoded, free of nonlinearity disturbance, can be represented as:

Fig. 1 (a) Flowchart of the generation of the sinusoidal fringe image; (b) the binary fringe pattern sequence with period 76 pixels, K = 4, and phase-shifting step N = 4 (only three fringe periods are shown for illustration); (c) the synthesized sinusoidal fringe image.

$$I(u,v) = C(u,v) + D(u,v)\cos(2\pi u/P) \quad (2)$$

First, the normalized intensity range is uniformly divided into K (an integer, K ≥ 2) intervals. For example, when K = 2, the intensity intervals are [0, 0.5] and (0.5, 1]. Each intensity interval corresponds to one binary fringe pattern. In general, we obtain K binary fringe patterns {Bk} (k = 1, 2, …, K), corresponding to the intensity intervals [(k−1)/K, k/K).

The quantization threshold Tk for each interval is the average of the interval's maximum intensity k/K and minimum intensity (k−1)/K:

$$T_k = \frac{2k-1}{2K}, \quad k = 1, 2, \ldots, K \quad (3)$$

To allocate the quantization error linearly and uniformly to neighboring pixels and avoid local accumulation, a symmetric error diffusion kernel H is employed, whose coefficients determine the proportion of the quantization error diffused to each unprocessed pixel in the neighborhood L:

$$H = \frac{1}{30}\begin{bmatrix} - & - & * & 4 & 3 \\ 2 & 3 & 4 & 3 & 2 \\ 1 & 2 & 3 & 2 & 1 \end{bmatrix} \quad (4)$$
where “−” and “*” respectively indicate previously processed pixels and the pixel being processed. The quantization error produced by the encoding is the difference between the modified intensity (which already includes error diffused from previously processed pixels) and the quantized result Bk(u, v) (see the description below Eq. (6)). The quantization error, multiplied by the corresponding coefficient in Eq. (4), is then spatially diffused to the unprocessed neighboring pixels. The whole process can be formulated as:

$$IM_k(u,v) = I_k(u,v) + \sum_{(m,n)\in L} H(m,n)\, E_k(u-m, v-n) \quad (5)$$

Concerning the temporal encoding: when the spatial encoding of a given pixel is performed within intensity interval k, the same pixel in the binary patterns corresponding to the other intensity intervals is encoded according to the specially designed rules below.

In general, the quantization error for the pixel being processed can be described as

$$E_k(u,v) = \begin{cases} IM_k(u,v) - k/K, & \text{if } IM_k(u,v) > T_k \\ IM_k(u,v) - (k-1)/K, & \text{if } IM_k(u,v) \le T_k \end{cases} \quad (6)$$
Meanwhile, the maximal quantization error as described above is Qmax = 1/(2K) (consistent with the K = 4 example below). Since the intensity intervals are [(k−1)/K, k/K), the first and last intervals are accordingly widened to [0 − 1/(2K), 1/K) and [1 − 1/K, 1 + 1/(2K)] so that error-diffused intensities slightly outside [0, 1] remain assignable.

An example with K = 4 and phase-shifting step N = 4 demonstrates the encoding process; the encoded binary fringe patterns, marked B1, B2, B3, and B4, are illustrated in Fig. 1(b). The normalized intensity I(u,v) is evenly divided into four intervals: [0 − 1/8, 1/4), [1/4, 1/2), [1/2, 3/4), and [3/4, 1 + 1/8] (maximal quantization error Qmax = 1/8), corresponding to four binary fringe patterns B1–B4 (all initialized to zero) and quantization thresholds 1/8, 3/8, 5/8, and 7/8 according to Eq. (3).

Fig. 2 (a) Framework of the binocular-vision-based 3D measurement system using TSBE structured illumination; (b) experimental system; (c) calibration target. Visualization 1 shows the in-focus projection at ~10 Hz; Visualization 2 shows fast scanning at 480 Hz.

Interval 1. For the first interval [0 − 1/8, 1/4): when the modified intensity of the current pixel IM1(u,v) is greater than the threshold 1/8, the corresponding pixel B1(u, v) is assigned 1, otherwise 0. Meanwhile, the corresponding pixels of the remaining three binary fringe patterns B2(u, v)–B4(u, v) are set to 0 (we call this the temporal encoding). The quantization error is determined by

$$E_1(u,v) = \begin{cases} IM_1(u,v) - 1/4, & \text{if } IM_1(u,v) > 1/8 \\ IM_1(u,v) - 0, & \text{if } IM_1(u,v) \le 1/8 \end{cases} \quad (7)$$

Interval 2. For the second interval [1/4, 1/2): if the modified intensity IM2(u,v) is greater than 3/8, the pixel B2(u, v) equals 1, otherwise 0. The quantization error is:

$$E_2(u,v) = \begin{cases} IM_2(u,v) - 1/2, & \text{if } IM_2(u,v) > 3/8 \\ IM_2(u,v) - 1/4, & \text{if } IM_2(u,v) \le 3/8 \end{cases} \quad (8)$$

Accordingly, the corresponding pixel in one of the remaining three binary fringe patterns B1(u, v), B3(u, v), and B4(u, v) is set to 1, regardless of whether B2(u, v) is 0 or 1.

Interval 3. For the third interval [1/2, 3/4): when IM3(u,v) is greater than 5/8, the corresponding pixel B3(u, v) is assigned 1, otherwise 0. The quantization error is

$$E_3(u,v) = \begin{cases} IM_3(u,v) - 3/4, & \text{if } IM_3(u,v) > 5/8 \\ IM_3(u,v) - 1/2, & \text{if } IM_3(u,v) \le 5/8 \end{cases} \quad (9)$$

Meanwhile, the corresponding pixels in two of the remaining binary fringe patterns B1(u, v), B2(u, v), and B4(u, v) are set to 1.

Interval 4. For the final interval [3/4, 1 + 1/8]: if IM4(u,v) is greater than 7/8, B4(u, v) is assigned 1, otherwise 0, and the corresponding pixels of the remaining three binary fringe patterns B1(u, v)–B3(u, v) are all set to 1. In this case, the quantization error is computed according to

$$E_4(u,v) = \begin{cases} IM_4(u,v) - 1, & \text{if } IM_4(u,v) > 7/8 \\ IM_4(u,v) - 3/4, & \text{if } IM_4(u,v) \le 7/8 \end{cases} \quad (10)$$
After the encoding is finished, the binary fringe pattern sequence {Bk} (k = 1, 2, …, K) is obtained. Since the encoded sinusoidal fringe pattern is generated with a width spanning the projector resolution plus 3/4 of a fringe period, the phase-shifting binary fringe patterns can be cut from the encoded binary patterns by laterally shifting the cropping window, similar to spatial phase shifting. Figure 1(c) shows the "captured" sinusoidal fringe image (actually computer-generated to simulate the imaging process) formed by temporal-integration imaging.
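For concreteness, the following sketch (Python/NumPy) implements the encoding loop described above under two assumptions the text does not fully pin down: pixels are processed in raster order, and the "remaining" patterns set to 1 for interval k are taken to be the lower-indexed ones B1…B(k−1), so that the temporal integration of the K bits reproduces the interval endpoints. It is a simplified illustration of the scheme, not the authors' exact implementation:

```python
import numpy as np

def tsbe_encode(I, K=4):
    """Sketch of the temporal-spatial binary encoding loop (Sec. 2.1).

    I : 2D array, normalized fringe intensity in [0, 1].
    Returns K binary patterns whose temporal integration (sum / K)
    approximates I. Raster-order processing is assumed.
    """
    rows, cols = I.shape
    B = np.zeros((K, rows, cols), dtype=np.uint8)
    IM = I.astype(np.float64).copy()          # error-diffused ("modified") image
    # Diffusion kernel of Eq. (4): (row offset, col offset, weight); weights sum to 1.
    kernel = [(0, 1, 4/30), (0, 2, 3/30),
              (1, -2, 2/30), (1, -1, 3/30), (1, 0, 4/30), (1, 1, 3/30), (1, 2, 2/30),
              (2, -2, 1/30), (2, -1, 2/30), (2, 0, 3/30), (2, 1, 2/30), (2, 2, 1/30)]
    for u in range(rows):
        for v in range(cols):
            x = IM[u, v]
            # Interval index k = 1..K; clipping realizes the widened end intervals.
            k = min(K, max(1, int(np.floor(x * K)) + 1))
            Tk = (2 * k - 1) / (2 * K)        # Eq. (3)
            bit = 1 if x > Tk else 0          # spatial quantization
            B[k - 1, u, v] = bit
            B[:k - 1, u, v] = 1               # temporal rule (assumption: the
                                              # lower-indexed patterns get the 1s)
            E = x - (k / K if bit else (k - 1) / K)   # Eq. (6)
            for du, dv, w in kernel:          # Eq. (5): diffuse to unprocessed pixels
                uu, vv = u + du, v + dv
                if 0 <= uu < rows and 0 <= vv < cols:
                    IM[uu, vv] += w * E
    return B
```

Temporal-integration imaging can then be emulated by `B.sum(axis=0) / K`, which the in-focus optics further smooth toward the ideal sinusoid of Eq. (2).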

2.2 System calibration

Several factors affect the calibration accuracy, such as the accuracy of the world coordinates of the calibration target, the extraction accuracy of the feature points, and the number of view orientations captured. Besides, the calibration model is also vital for acquiring the intrinsic and extrinsic parameters of an optical 3D measurement system. The calibration technique proposed by Zhang [20] is widely adopted and accepted for its easy operation and high accuracy with an easily available planar calibration target, and it is the method implemented in our experimental system. The MATLAB calibration toolbox [21] is used to extract the system parameters.

2.3 3D reconstruction

The well-known phase measurement profilometry based on the phase-shifting technique has been widely applied in many fields because of its high accuracy and reliability. In general, the more steps, the higher the achievable accuracy. For an N-step phase-shifting algorithm, the mth frame of the captured sinusoidal fringe image can be mathematically described as

$$I_m^c(u,v) = C(u,v) + D(u,v)\cos[\varphi(u,v) + 2m\pi/N], \quad m = 0, 1, 2, \ldots, N-1 \quad (11)$$
The corresponding wrapped phase map φw(u,v) can be retrieved by the phase-shifting algorithm:
$$\varphi_w(u,v) = \tan^{-1}\left[\frac{\sum_{m=0}^{N-1} I_m^c(u,v)\sin(2m\pi/N)}{\sum_{m=0}^{N-1} I_m^c(u,v)\cos(2m\pi/N)}\right] \quad (12)$$
Then, based on the analysis and conclusion in [22] that multi-frequency temporal phase unwrapping provides the best unwrapping reliability, and balancing resolution against the number of phase-shifting steps, we select a three-frequency temporal phase unwrapping algorithm with fringe numbers 1, n2, n3, in which N frames of phase-shifting patterns are generated for each of the three frequencies. The absolute phase maps are calculated as:
$$\begin{cases} \varphi_1(u,v) = \varphi_{w1}(u,v), \\ \varphi_2(u,v) = \varphi_{w2}(u,v) + 2\pi \,\mathrm{round}\!\left[\dfrac{n_2\,\varphi_1(u,v) - \varphi_{w2}(u,v)}{2\pi}\right], \\ \varphi_3(u,v) = \varphi_{w3}(u,v) + 2\pi \,\mathrm{round}\!\left[\dfrac{(n_3/n_2)\,\varphi_2(u,v) - \varphi_{w3}(u,v)}{2\pi}\right]. \end{cases} \quad (13)$$
The absolute phase map φ3(u,v) associated with the highest frequency (fringe number n3) is used for matching homologous points in the 3D reconstruction. In the next section we use the TSBE method and the 3D experimental framework to verify the feasibility experimentally.
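As a compact illustration of Eqs. (12) and (13), the following sketch (Python/NumPy; variable names are ours, and the arctangent is realized with the four-quadrant `arctan2`) computes the wrapped phase from N phase-shifted images and then unwraps the three frequencies hierarchically:

```python
import numpy as np

def wrapped_phase(frames):
    """Eq. (12): N-step phase-shifting algorithm; frames has shape (N, H, W)."""
    N = frames.shape[0]
    m = np.arange(N).reshape(-1, 1, 1)
    num = np.sum(frames * np.sin(2 * np.pi * m / N), axis=0)
    den = np.sum(frames * np.cos(2 * np.pi * m / N), axis=0)
    return np.arctan2(num, den)               # wrapped into (-pi, pi]

def unwrap_three_freq(pw1, pw2, pw3, n2=12, n3=76):
    """Eq. (13): three-frequency temporal unwrapping, fringe numbers 1, n2, n3
    (defaults are the fringe numbers used in the experiment of Sec. 3)."""
    p1 = pw1                                  # single-fringe phase is already absolute
    p2 = pw2 + 2 * np.pi * np.round((n2 * p1 - pw2) / (2 * np.pi))
    p3 = pw3 + 2 * np.pi * np.round(((n3 / n2) * p2 - pw3) / (2 * np.pi))
    return p3
```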

3. Experiment

The 3D experimental framework based on TSBE structured illumination is presented in Fig. 2(a), from which we developed an experimental system (Fig. 2(b)) based on the binocular vision principle to demonstrate the proposed encoding approach and synchronization control strategy. The system mainly consists of: a DMD-based digital projector with resolution 912 × 1140 pixels that projects the binary encoded fringe pattern sequences in focus at 480 Hz (Visualization 2; Visualization 1 shows the in-focus projection at a rate of ~10 Hz; the optimum working distance is about 750 mm from the projector, where the projector and both cameras are all fully in focus); two color CMOS cameras (model: IDS 3060; resolution: 1936 × 1216 pixels; nominal frame rate up to 126 Hz) that synchronously capture the deformed sinusoidal fringe images at 120 Hz by temporal-integration imaging, coordinated by a self-developed synchronization control circuit; and a computer for 3D reconstruction. The focal length of the imaging lenses (model: RICOH FL-CC1614-2M) is 16 mm. To retrieve the absolute phase map, three groups of computer-generated standard sinusoidal fringe patterns with fringe numbers 1, 12, and 76 are encoded with the TSBE method of Sec. 2.1, and the phase-shifting fringe patterns are obtained by transversely cutting the encoded binary fringe patterns, the fringe periods being exactly divisible by the phase-shifting step. Each sinusoidal fringe pattern is decomposed into four (K = 4) binary fringe patterns, so 48 binary patterns in total must be projected.
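As a sanity check on how the rates quoted above fit together (simple arithmetic on the figures given in the text, not system code):

```python
K, steps, freqs = 4, 4, 3
patterns = K * steps * freqs               # 48 binary patterns per 3D frame
projection_rate = 480                      # Hz: binary pattern switching
image_rate = projection_rate / K           # 120 Hz: one fringe image integrates K patterns
recon_rate = image_rate / (steps * freqs)  # 10 Hz: 12 fringe images per 3D frame
print(patterns, image_rate, recon_rate)    # 48 120.0 10.0
```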

The experimental system is first calibrated using Zhang's method with a circle-dot calibration target, as shown in Fig. 2(c). The calibration depth ranges from 600 to 900 mm, and the calibration target is captured simultaneously by the two cameras at 18 spatial poses to parameterize the system geometry. To evaluate the accuracy [23], a circular ceramic plate (diameter: 120 mm; flatness better than 0.01 mm) and a dumbbell gauge (sphere radii R1 = 25.393 mm, R2 = 25.390 mm; sphere distance 100.018 mm) serve as standard test components, followed by 3D measurement experiments on a mask and a palm to assess practical performance.

3.1 Accuracy evaluation with standard components: ceramic plate and dumbbell gauge

The temporal-spatial binary fringe patterns are projected onto the ceramic plate, which is randomly placed at seven different positions and poses to assess the local plane measurement accuracy [23]. At each position, the captured sinusoidal fringe images are analyzed by the four-step phase-shifting and three-frequency temporal phase unwrapping algorithms to retrieve the absolute phase map (see Sec. 2.3). Combined with the calibration result, the 3D surface of the ceramic plate is reconstructed. A plane is then fitted to the reconstruction, and the discrepancy at each point is taken as the difference between the fitted and measured results. The maximum error and standard deviation over all positions are calculated, and Table 1 summarizes the statistics of all discrepancies. These results demonstrate that the proposed method and experimental scheme achieve a satisfactory accuracy, with standard deviation less than 0.055 mm and maximum error 0.310 mm within the calibrated volume (600–900 mm).
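The flatness evaluation described here amounts to a least-squares plane fit followed by residual statistics; a minimal sketch (Python/NumPy, assuming the reconstruction is an N × 3 point array) is:

```python
import numpy as np

def plane_residuals(points):
    """Fit a least-squares plane to an (N, 3) point cloud and return the
    signed point-to-plane distances used for statistics like Table 1's."""
    centroid = points.mean(axis=0)
    # The plane normal is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    return (points - centroid) @ normal

# r = plane_residuals(reconstructed_plate)
# std_dev, max_err = r.std(), np.abs(r).max()
```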

Table 1. Evaluation results of the ceramic plate (unit: mm)

Another experiment, using a second standard component, the dumbbell gauge, further evaluates the accuracy of the proposed encoding method and experimental scheme. Similar to the ceramic plate test, the binary encoded pattern sequences are projected onto the dumbbell gauge at ten different positions and poses [23], and least-squares sphere radii are calculated from the 3D reconstruction results. Table 2 lists the sphere radius errors (R1m − R1), (R2m − R2) and the sphere distance error (dm − d) between the measured results and the nominal values, together with their statistics. These results show that the experimental system achieves sphere radius errors less than 0.07 mm and sphere distance errors less than ~0.08 mm.
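The least-squares sphere radii and the center distance follow from a linear fit; a sketch under the same assumptions as the plane-fit sketch above:

```python
import numpy as np

def sphere_fit(points):
    """Linear least-squares sphere fit to an (N, 3) point cloud.
    From |p - c|^2 = R^2:  2 p.c + (R^2 - |c|^2) = |p|^2, linear in (c, d)."""
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    c, d = sol[:3], sol[3]
    return c, np.sqrt(d + c @ c)               # center, radius

# c1, R1m = sphere_fit(pts_sphere1); c2, R2m = sphere_fit(pts_sphere2)
# dm = np.linalg.norm(c1 - c2)                 # measured sphere distance
```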

Table 2. Evaluation results of the dumbbell gauge (unit: mm)

Two points should be noted: (1) at the optimum measurement position (~750 mm), the measurement area is about 400 mm × 300 mm; (2) when the tested dumbbell gauge gets too close to the projector (position 620 mm in Table 2), slight local overexposure on its surface (the ceramic plate is unaffected owing to its diffusely reflecting surface) causes some unavoidable measurement error, whose influence depends on the spatial position. Still, the statistics of Table 2 confirm that the system can meet most application requirements.

3.2 Object test

Given the working mode of the developed system, a test object in a static state is generally easy to measure, for example, a mask mounted on a holder. To scan an object that is not easily held static and to show the fast scanning ability, a real human palm is also tested. The captured deformed fringe images of the two test objects (one phase-shifted fringe image from each of the three frequency groups) are shown in Figs. 3 and 4, and their 3D reconstruction results from different perspectives are displayed in Figs. 5 and 6. For better observation, the 3D reconstruction results (point clouds) are rendered. We emphasize that the 3D results are not subject to any post-processing, such as interpolation or smoothing.

Fig. 3 (a)-(c) Captured deformed fringe images from the left camera and (a')-(c') from the right camera. The tested object is a mask placed on a holder; the working distance is about 810 mm from the projector.

Fig. 4 Captured deformed fringe images: (a)-(c) from the left camera and (a')-(c') from the right camera. The tested object is the author's palm; the working distance is 700 mm from the projector.

Fig. 5 3D reconstruction results (point cloud without any post-processing) of the tested mask observed from different perspectives.

Fig. 6 3D reconstruction results (point cloud without any post-processing) of the author's palm from different perspectives.

Comparing the 3D models shown in Fig. 5 and Fig. 6, it is not difficult to observe that the surface of the former is slightly smoother than that of the latter. The reason is that the mask has a uniform white surface while the real palm is a natural object. Nonetheless, the "Ω"-shaped detail on the mask (see Fig. 7(a)) and the palm print (see Fig. 7(b)) remain visible. Moreover, no periodic texture is present on the 3D models, indicating that the projector's nonlinearity has no influence on the measurement results.

Fig. 7 Details of the 3D reconstruction results: (a) mask and (b) palm. The "Ω"-shaped detail on the mask and the palm print remain visible.

4. Discussion

Several aspects of the implementation of the proposed method and experimental procedure warrant explanation and discussion:

  • (1) Measurement accuracy. Prior to system calibration, we assume that the feature intervals (world coordinates) of the employed calibration target are exact, but in fact certain deviations exist, which may yield unsatisfactory system geometry parameters and imprecise measurement results. If an advanced bundle adjustment (BA) strategy [24] were adopted to optimize the calibration parameters, better performance might be acquired. We will work on further improving the calibration accuracy, and thereby the measurement results, in future work.
  • (2) Synchronous control between projector and cameras. In the current control strategy, the frame rate of the projector is four times the capture rate of the cameras; any loss of synchronization leads to failure of the 3D reconstruction. An alternative solution is to insert one or two black patterns between every four (K) binary encoded fringe patterns.
  • (3) Temporal-integration imaging. It should be emphasized that our encoding tactic and experimental scheme differ from [17]. In [17], a total of 96 binary fringe patterns is required for one 3D measurement under four-step phase shifting and temporal phase unwrapping, which consumes much projection time. Besides, generating the sinusoidal fringe image by weighted summation hinders temporal-integration imaging, because the projected patterns would need different exposure times; this in turn requires an expensive high-speed camera to capture the projected binary pattern sequence frame by frame if the scanning time is to be improved. Although the authors subsequently combined this encoding scheme with color encoding to reduce the projection time and the number of captured images [18], the experiment appears not to have performed as well as expected because of color crosstalk.

    Certainly, in our scheme, higher-quality 3D data can be acquired if one sinusoidal fringe pattern is encoded into more temporal-spatial binary patterns; in other words, the larger K is, the better the attainable sinusoidality. In our previous publication [19], the sinusoidal fringe image was computer-generated from images captured frame by frame while projecting the patterns, which indeed creates some smoothing (for example, reducing the influence of noise) and improves the measurement. Instead of that strategy, in this paper one sinusoidal fringe image is directly generated by temporal-integration imaging of the projected sequence. Weighing measurement time against measurement accuracy, K = 4 is selected. The system's inherent low-pass filtering directly contributes to good sinusoidality.

  • (4) 3D reconstruction performance. Theoretically, the projector can switch binary patterns at a frame rate of ~4000 Hz, which would enable very efficient 3D reconstruction. However, owing to the limited frame rate of the cameras used, only 120 Hz is usable, leading to a 3D reconstruction rate of around 10 Hz (off-line). Compared with the defocused projection scheme of [13], which achieves 556 Hz high-speed 3D reconstruction, the measurement efficiency of the present system is much lower. The reasons are: first, the projection rate in [13] reaches 5000 Hz with a camera offering a nominal 1000 Hz at 1632 × 1200 pixels (5000 Hz at 576 × 576 pixels in their experiments), whereas the nominal frame rate of our cameras is only 126 Hz at 1936 × 1216 pixels (120 Hz at full resolution is used experimentally), which limits the acquisition speed of the sinusoidal fringe images; second, the encoding principles and phase-shifting steps differ. The encoding in [13] is spatial binary encoding, which needs defocused projection to yield sinusoidal fringe patterns within a limited measurement depth (or working distance), while the proposed method is temporal-spatial binary encoding under in-focus projection; the number of projected patterns required by the former (9 frames, three-step phase shifting) is much smaller than by the latter (48 frames, four-step phase shifting). Therefore, with improved experimental conditions, the proposed approach can be expected to strike a better balance between accuracy and speed.

5. Conclusion

In summary, aiming to avoid the nonlinearity of off-the-shelf digital projectors while balancing measurement accuracy and speed, a temporal-spatial binary encoding method is presented to encode computer-generated standard sinusoidal fringe patterns; it is simple and feasible in principle and involves no time-consuming optimization. On the basis of the constructed experimental framework and the designed temporal-integration imaging strategy, high-quality sinusoidal fringe images can be formed. The measurement results for the ceramic plate and dumbbell gauge indicate that the proposed encoding method and experimental scheme provide good accuracy within the calibrated volume. Finally, the 3D reconstruction performance of the system at 10 Hz (off-line) is demonstrated with a mask and the author's palm.

Funding

National Major Instrument Special Fund (2013YQ490879); China Post-doctoral Special Funds (2016T90847); China Post-doctoral Science Fund (2015M572472); the research project of Chengdu University of Information Technology (KYTZ201619); National Natural Science Foundation of China (61402307).

References and links

1. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128–160 (2011). [CrossRef]  

2. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Lasers Eng. 48(2), 133–140 (2010). [CrossRef]  

3. S. Zhang, “Recent progresses on real time 3D shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48(2), 149–158 (2010). [CrossRef]  

4. H. Guo, H. He, and M. Chen, “Gamma correction for digital fringe projection profilometry,” Appl. Opt. 43(14), 2906–2914 (2004). [CrossRef]   [PubMed]  

5. T. Hoang, B. Pan, D. Nguyen, and Z. Wang, “Generic gamma correction for accuracy enhancement in fringe-projection profilometry,” Opt. Lett. 35(12), 1992–1994 (2010). [CrossRef]   [PubMed]  

6. B. W. Li, Y. J. Wang, J. F. Dai, W. Lohry, and S. Zhang, “Some recent advances on superfast 3D shape measurement with digital binary defocusing techniques,” Opt. Lasers Eng. 54(1), 236–246 (2014). [CrossRef]  

7. C. Zuo, Q. Chen, G. H. Gu, S. J. Feng, F. X. Feng, R. B. Li, and G. C. Shen, “High-speed three-dimensional shape measurement for dynamic scenes using bi-frequency tripolar pulse width modulation fringe projection,” Opt. Lasers Eng. 51(8), 953–960 (2013). [CrossRef]  

8. W. Lohry and S. Zhang, “3D shape measurement with 2D area modulated binary patterns,” Opt. Lasers Eng. 50(7), 917–921 (2012). [CrossRef]  

9. Y. Wang and S. Zhang, “Three-dimensional shape measurement with binary dithered patterns,” Appl. Opt. 51(27), 6631–6636 (2012). [CrossRef]   [PubMed]  

10. Y. Wang and S. Zhang, “Optimal pulse width modulation for sinusoidal fringe generation with projector defocusing,” Opt. Lett. 35(24), 4121–4123 (2010). [CrossRef]   [PubMed]  

11. J. Dai and S. Zhang, “Phase-optimized dithering technique for high-quality 3D shape measurement,” Opt. Lasers Eng. 51(6), 790–795 (2013). [CrossRef]  

12. W. Lohry and S. Zhang, “Genetic method to optimize binary dithering technique for high-quality fringe generation,” Opt. Lett. 38(4), 540–542 (2013). [CrossRef]   [PubMed]  

13. Y. Wang and S. Zhang, “Superfast multifrequency phase-shifting technique with optimal pulse width modulation,” Opt. Express 19(6), 5149–5155 (2011). [CrossRef]   [PubMed]  

14. X.-Y. Su, W.-S. Zhou, G. von Bally, and D. Vukicevic, “Automated phase-measuring profilometry using defocused projection of a Ronchi grating,” Opt. Commun. 94(6), 561–573 (1992). [CrossRef]  

15. G. A. Ayubi, J. A. Ayubi, J. M. D. Martino, and J. A. Ferrari, “Pulse-width modulation in defocused 3D fringe projection,” Opt. Lett. 35(21), 3682–3684 (2010). [CrossRef]   [PubMed]  

16. L. Ekstrand and S. Zhang, “Three-dimensional profilometry with nearly focused binary phase-shifting algorithms,” Opt. Lett. 36(23), 4518–4520 (2011). [CrossRef]   [PubMed]  

17. G. A. Ayubi, J. M. Di Martino, J. R. Alonso, A. Fernández, C. D. Perciante, and J. A. Ferrari, “Three-dimensional profiling with binary fringes using phase-shifting interferometry algorithms,” Appl. Opt. 50(2), 147–154 (2011). [CrossRef]   [PubMed]  

18. G. A. Ayubi, J. M. Di Martino, J. R. Alonso, A. Fernández, J. L. Flores, and J. A. Ferrari, “Color encoding of binary fringes for gamma correction in 3-D profiling,” Opt. Lett. 37(8), 1325–1327 (2012). [CrossRef]   [PubMed]  

19. J. P. Zhu, X. Y. Su, Z. S. You, and Y. K. Liu, “Temporal-spatial encoding binary fringes toward three-dimensional shape measurement without projector nonlinearity,” Opt. Eng. 54(5), 054108 (2015). [CrossRef]  

20. Z. Y. Zhang, “A Flexible New Technique for Camera Calibration,” IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000). [CrossRef]  

21. Camera Calibration Toolbox for Matlab, http://www.vision.caltech.edu/bouguetj/calib_doc/.

22. C. Zuo, L. Huang, M. L. Zhang, Q. Chen, and A. Asundi, “Temporal phase unwrapping algorithms for fringe projection profilometry: A comparative review,” Opt. Lasers Eng. 85, 84–103 (2016). [CrossRef]  

23. VDI/VDE 2634 Blatt 2:2002-08, Optische 3D-Messsysteme – Systeme mit flächenhafter Antastung (Beuth Verlag, Berlin, 2002).

24. Y. L. Xiao, X. Su, W. Chen, and Y. Liu, “Three-dimensional shape measurement of aspheric mirrors with fringe reflection photogrammetry,” Appl. Opt. 51(4), 457–464 (2012). [CrossRef]   [PubMed]  

Supplementary Material (2)

Visualization 1: MP4 (1040 KB) — in-focus projection of the binary encoded fringe pattern sequence at a rate of ~10 Hz.
Visualization 2: MP4 (1251 KB) — fast scanning at 480 Hz.
