Abstract
The monotonicity of depth in geometric-constraint-based absolute phase unwrapping is analyzed, and a monotonic discriminant Δ(uc,vc) is presented in this paper. The sign of the discriminant determines which depth distance should be selected for the virtual plane used to create the artificial absolute phase map in a given structured light system: when Δ(uc,vc) ≥ 0 at every point on the CCD pixel coordinates, the minimum depth distance is selected for the virtual plane, whereas the maximum depth distance is selected when Δ(uc,vc) ≤ 0. Two structured light systems with opposite signs of the monotonic discriminant are developed, and the validity of the theoretical analysis is demonstrated experimentally.
© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
1. Introduction
Recently, three-dimensional (3D) shape measurement has become one of the most active research areas in optical metrology [1]. Among 3D shape measurement techniques, the structured light method offers high speed, high accuracy and non-destructive operation, and has been applied in diverse fields [2–4], including reverse engineering, hyperspectral imaging, and microscopic measurement.
In structured light technology, phase recovery from the fringe patterns is required, and the most popular methods include the phase-shifting algorithm [5], the Fourier transform method [6], the windowed Fourier transform [7] and deep learning [8]. These fringe analysis methods yield only a wrapped phase ranging from $-{\pi}$ to ${\pi}$, so phase unwrapping is required to obtain a continuous phase distribution. Conventional phase unwrapping methods can be classified into two categories: spatial phase unwrapping (SPU) and temporal phase unwrapping (TPU) [9]. SPU provides a relative phase by integrating the phase gradient over a path that covers the domain of interest. Although SPU has the advantages of simplicity and convenience, it fails in the simultaneous measurement of multiple isolated objects. In contrast, TPU unwraps the wrapped phase by capturing additional images, achieving retrieval of the absolute phase. TPU is well suited to the measurement of multiple isolated objects and suppresses noise more strongly than SPU. In a review paper, Zuo et al. [10] thoroughly studied three types of TPU methods: multi-frequency, multi-wavelength and number-theoretical approaches.
Although TPU can retrieve the absolute unwrapped phase, it is not well suited to high-speed measurement applications because of the acquisition of additional images. To solve this problem, An et al. [11] proposed an absolute phase unwrapping method based on the geometric constraints of the structured light system. In this method, an artificial absolute phase map, ${\Phi _{\min }}$, at a given virtual depth plane $z = {z_{\min }}$, is first generated from the intrinsic and extrinsic parameters of the calibrated structured light system. Then, the fringe order of each pixel in the wrapped phase map is determined by computing the difference between the wrapped phase and the artificial absolute phase. Finally, with the determined fringe orders, the absolute phase is obtained from the wrapped phase. The geometric constraint method has the advantages of high speed, a simple system setup, simultaneous measurement of multiple objects and robustness to noise. It has also been combined with other phase-recovery methods to improve the accuracy and speed of 3D measurement. For example, Hyun and Zhang [12] used it to enhance two-frequency phase shifting. Li et al. [13] proposed a single-shot method based on geometric constraint to overcome the limitation of Fourier transform profilometry and to reduce the motion-induced error [14]. In recent years, researchers have also proposed many methods to extend its measurement depth range [9,15,16].
However, we found in our experiments that the selection of the closest depth distance for the virtual plane used in reference [11] to generate the artificial absolute phase map is not always correct; for some systems the maximum depth distance should be selected instead. Therefore, how to determine the correct depth distance for the virtual plane of a given structured light system is a key issue in the application of the geometric constraint.
To address this issue, in this paper, based on the theory of An et al. [11], the monotonicity of the depth $z({u^c},{v^c})$ with respect to the projector pixel coordinate ${u^p}$ under the geometric constraint is theoretically investigated, and a discriminant $\Delta ({u^c},{v^c})$ determined by the intrinsic and extrinsic parameters of the calibrated structured light system is presented. It is shown that when $\Delta ({u^c},{v^c}) \ge 0$ at every point on the CCD pixel coordinates, the minimum depth distance should be selected for the virtual plane to create the artificial absolute phase map, whereas the maximum depth distance must be selected when $\Delta ({u^c},{v^c}) \le 0$. Several experiments with different structured light systems are conducted to verify the correctness of the proposed monotonic discriminant $\Delta ({u^c},{v^c})$.
Section 2 explains principles of the proposed method. Section 3 presents several experimental results to validate the method. Section 4 gives the summary of this paper.
2. Principle
2.1. N-step phase-shifting algorithm
For the N-step phase-shifting algorithm with equal phase shifts, the nth fringe pattern can be described as
$${I_n}(x,y) = {I^{\prime}}(x,y) + {I^{\prime\prime}}(x,y)\cos [\phi (x,y) + 2\pi n/N],\quad n = 0,1, \ldots ,N - 1,$$
where ${I^{\prime}}(x,y)$ is the average intensity, ${I^{\prime\prime}}(x,y)$ is the intensity modulation, and $\phi (x,y)$ is the phase to be solved. Simultaneously solving the N equations by a least-squares method, $\phi (x,y)$ can be written as
$$\phi (x,y) ={-} \arctan \left[ {\frac{{\sum\nolimits_{n = 0}^{N - 1} {{I_n}(x,y)\sin (2\pi n/N)} }}{{\sum\nolimits_{n = 0}^{N - 1} {{I_n}(x,y)\cos (2\pi n/N)} }}} \right].$$
2.2. Structured light system model and calibration
In a structured light system, the well-known linear pinhole model is usually used to describe the imaging process of the camera. Its mathematical expression is as follows,
$${s^c}{[{u^c},{v^c},1]^T} = {A^c}[{R^c},{t^c}]{[{x^w},{y^w},{z^w},1]^T} = {P^c}{[{x^w},{y^w},{z^w},1]^T},$$
where ${s^c}$ is a scale factor, superscript c denotes the camera, superscript w denotes the world coordinates, ${A^c}$ is the intrinsic matrix of the camera, ${R^c}$ and ${t^c}$ are the camera's extrinsic matrices, and ${P^c}$ is the $3 \times 4$ projection matrix that maps a point $({x^w},{y^w},{z^w})$ of the world coordinates to the 2D image coordinates $({u^c},{v^c})$. The projector can be regarded as an inverse camera, so the pinhole model of the projector is, similarly,
$${s^p}{[{u^p},{v^p},1]^T} = {A^p}[{R^p},{t^p}]{[{x^w},{y^w},{z^w},1]^T} = {P^p}{[{x^w},{y^w},{z^w},1]^T},$$
where ${s^p}$ is a scale factor, superscript p denotes the projector, ${A^p}$ is the intrinsic matrix of the projector, ${R^p}$ and ${t^p}$ are the projector's extrinsic matrices, and ${P^p}$ is the $3 \times 4$ projection matrix that maps a point $({x^w},{y^w},{z^w})$ of the world coordinates to the DMD coordinates $({u^p},{v^p})$.

In the actual calibration process, in order to let the projector "capture" images, Zhang and Huang [18] proposed a phase-assisted method that establishes a one-to-one correspondence between camera and projector pixels using two sets of orthogonal sinusoidal fringe patterns. The calibration target images captured by the CCD are then mapped onto the DMD plane through this correspondence.
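As an illustration of the pinhole model above, the following Python sketch builds the $3 \times 4$ projection matrix from intrinsic and extrinsic parameters and projects a world point to camera pixel coordinates. The numerical values are placeholders for illustration, not the calibrated parameters of any system in this paper.

```python
import numpy as np

# Placeholder calibration values (illustrative only, not the authors' system).
A_c = np.array([[2600.0, 0.0, 640.0],
                [0.0, 2600.0, 512.0],
                [0.0, 0.0, 1.0]])        # camera intrinsic matrix A^c
R_c = np.eye(3)                          # camera rotation R^c
t_c = np.zeros((3, 1))                   # camera translation t^c

# P^c = A^c [R^c, t^c] is the 3x4 projection matrix.
P_c = A_c @ np.hstack([R_c, t_c])

# Project a homogeneous world point (x^w, y^w, z^w, 1) to the image plane.
X_w = np.array([10.0, -5.0, 550.0, 1.0])
s_uv = P_c @ X_w                         # equals s^c * (u^c, v^c, 1)
u_c, v_c = s_uv[0] / s_uv[2], s_uv[1] / s_uv[2]
```

The same code applies to the projector model with $A^p$, $R^p$, $t^p$ in place of the camera parameters.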
2.3. Absolute phase unwrapping using geometric constraint
Once the structured light system is calibrated by the method described in section 2.2, the projection matrices ${P^c}$ and ${P^p}$ can be estimated. Eqs. (4) and (6) then provide six equations with seven unknowns $({s^c},{s^p},{x^w},{y^w},{z^w},{u^p},{v^p})$. Generally, in order to acquire the 3D shape of an object, another constraint equation is provided by the absolute phase retrieved from the captured fringe patterns. The projector pixel coordinates ${u^p}$ and ${v^p}$ can be determined by the following equations,
$${u^p} = \frac{{{\Phi ^V}({u^c},{v^c})}}{{2\pi }}T,\qquad {v^p} = \frac{{{\Phi ^H}({u^c},{v^c})}}{{2\pi }}T,$$
where ${\Phi ^V}$ and ${\Phi ^H}$ are the absolute phase maps along the vertical and horizontal directions, respectively, and $T$ is the fringe period.

An et al. [11] proposed an alternative approach. It was suggested that if a virtual plane is placed at the closest distance ${z^w} = z_{\min }^w$ from the camera coordinate system, the remaining six parameters can be solved uniquely. With Eq. (4), the corresponding $x_{\min }^w$ and $y_{\min }^w$ for each camera pixel $({u^c},{v^c})$ can be expressed as,
$$\left[ {\begin{array}{c} {x_{\min }^w}\\ {y_{\min }^w} \end{array}} \right] = {\left[ {\begin{array}{cc} {p_{11}^c - {u^c}p_{31}^c}&{p_{12}^c - {u^c}p_{32}^c}\\ {p_{21}^c - {v^c}p_{31}^c}&{p_{22}^c - {v^c}p_{32}^c} \end{array}} \right]^{ - 1}}\left[ {\begin{array}{c} {{u^c}p_{34}^c - p_{14}^c + ({u^c}p_{33}^c - p_{13}^c)z_{\min }^w}\\ {{v^c}p_{34}^c - p_{24}^c + ({v^c}p_{33}^c - p_{23}^c)z_{\min }^w} \end{array}} \right],$$
where $p_{ij}^c$ denotes the $(i,j)$ entry of ${P^c}$.
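The generation of the artificial phase map ${\Phi _{\min }}$ described above can be sketched in Python. This is an illustrative implementation of the procedure of An et al. [11] under the pinhole model of section 2.2; the function name and loop structure are ours, chosen for clarity rather than speed.

```python
import numpy as np

def min_phase_map(P_c, P_p, z_min, T, height, width):
    """Artificial absolute phase map at the virtual plane z^w = z_min.

    P_c, P_p : 3x4 camera / projector projection matrices from calibration.
    z_min    : depth of the virtual plane, in the calibration units.
    T        : vertical fringe period on the DMD, in pixels.
    """
    Phi = np.empty((height, width))
    for v_c in range(height):
        for u_c in range(width):
            # Eq. (4) with z^w = z_min fixed: two linear equations in (x^w, y^w).
            M = np.array([
                [P_c[0, 0] - u_c * P_c[2, 0], P_c[0, 1] - u_c * P_c[2, 1]],
                [P_c[1, 0] - v_c * P_c[2, 0], P_c[1, 1] - v_c * P_c[2, 1]]])
            b = np.array([
                u_c * P_c[2, 3] - P_c[0, 3] + (u_c * P_c[2, 2] - P_c[0, 2]) * z_min,
                v_c * P_c[2, 3] - P_c[1, 3] + (v_c * P_c[2, 2] - P_c[1, 2]) * z_min])
            x, y = np.linalg.solve(M, b)
            # Project (x, y, z_min) into the projector with Eq. (6).
            s_uv = P_p @ np.array([x, y, z_min, 1.0])
            u_p = s_uv[0] / s_uv[2]
            # Phi^V = 2*pi*u^p / T, the inverse of the relation u^p = Phi^V T/(2*pi).
            Phi[v_c, u_c] = 2.0 * np.pi * u_p / T
    return Phi
```

A vectorized version over the whole pixel grid is straightforward and is what one would use in practice.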
2.4. Monotonicity analysis
In this part, we theoretically investigate the monotonicity of absolute phase unwrapping by geometric constraint in detail, and a monotonic discriminant is derived which is used to determine the correct depth distance of the virtual plane for a given structured light system.
Since ceil(·) returns the nearest upper integer, the unwrapped phase $\Phi ({u^c},{v^c})$ is always greater than the generated phase $\Phi _{\min }^V({u^c},{v^c})$ at an arbitrary point on the CCD pixel coordinates, i.e.,
$$\Phi ({u^c},{v^c}) \ge \Phi _{\min }^V({u^c},{v^c}).$$
Moreover, the depth of an arbitrary point on the tested object is always greater than this known closest distance, i.e.,
$$z({u^c},{v^c}) \ge z_{\min }^w.$$
Obviously, if the two conditions of Eqs. (14) and (15) hold simultaneously at every point on the CCD pixel coordinates, the measured depth $z({u^c},{v^c})$ must be a monotonically increasing function of the unwrapped phase $\Phi ({u^c},{v^c})$. As shown in Eq. (8), the absolute phase ${\Phi ^V}({u^c},{v^c})$ increases monotonically with the projector pixel coordinate ${u^p}$. Therefore, to determine the monotonicity of the depth $z({u^c},{v^c})$ with respect to $\Phi ({u^c},{v^c})$, we only need to investigate the variation of $z({u^c},{v^c})$ with respect to ${u^p}$, which is obtained by solving Eqs. (4) and (6) simultaneously.

The discriminant reveals that the depth $z({u^c},{v^c})$ may instead be a monotonically decreasing function of the projector pixel coordinate ${u^p}$. In that case, the maximum depth distance, rather than the minimum depth distance, should be selected to generate the absolute phase ${\Phi _{\max }}({u^c},{v^c})$ for phase unwrapping. In summary, when $\Delta ({u^c},{v^c}) \ge 0$ at every point on the CCD pixel coordinates, the minimum depth distance is selected for the virtual plane to create the artificial absolute phase map, and the maximum depth distance is selected when $\Delta ({u^c},{v^c}) \le 0$.
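Since the closed-form expression for $\Delta ({u^c},{v^c})$ involves all entries of ${P^c}$ and ${P^p}$, the sign of the monotonicity can also be checked numerically for a calibrated system. The following Python sketch (our illustrative helper, not part of reference [11]) solves Eqs. (4) and (6) for the depth at two neighboring values of ${u^p}$ and compares the results:

```python
import numpy as np

def depth_from_correspondence(P_c, P_p, u_c, v_c, u_p):
    # Solve Eqs. (4) and (6) for (x^w, y^w, z^w): two linear equations come
    # from the camera pixel (u^c, v^c) and one from the matched projector
    # column u^p (the row coordinate v^p is not needed for vertical fringes).
    M = np.array([
        P_c[0, :3] - u_c * P_c[2, :3],
        P_c[1, :3] - v_c * P_c[2, :3],
        P_p[0, :3] - u_p * P_p[2, :3]])
    b = np.array([
        u_c * P_c[2, 3] - P_c[0, 3],
        v_c * P_c[2, 3] - P_c[1, 3],
        u_p * P_p[2, 3] - P_p[0, 3]])
    x, y, z = np.linalg.solve(M, b)
    return z

def depth_increases_with_up(P_c, P_p, u_c, v_c, u_p=500.0, du=1.0):
    # Numerical stand-in for the sign of Delta(u^c, v^c) at one pixel:
    # True  -> z grows with u^p, so the *minimum* virtual plane is correct;
    # False -> z falls with u^p, so the *maximum* virtual plane is correct.
    z0 = depth_from_correspondence(P_c, P_p, u_c, v_c, u_p)
    z1 = depth_from_correspondence(P_c, P_p, u_c, v_c, u_p + du)
    return z1 > z0
```

In practice this check would be evaluated over the whole CCD pixel grid, mirroring the "at every point" condition of the discriminant.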
3. Experiment
To verify the performance of the proposed discriminant, we develop a structured light system, shown in Fig. 1. The hardware includes a CCD camera (DAHENG MER-131-210U3M-L) and a projector (DLP6500). The camera's resolution is $1024 \times 1280$ pixels and the projector's resolution is $1080 \times 1920$ pixels. The camera has a focal length of 12.5 mm and a pixel size of $4.8\;\mathrm{\mu m} \times 4.8\;\mathrm{\mu m}$; the projector has a focal length of 23 mm and a pixel size of $7.6\;\mathrm{\mu m} \times 7.6\;\mathrm{\mu m}$.
In this paper, the structured light system is calibrated by the method proposed in reference [19]. The calibration results of the system are as follows. The intrinsic parameter matrices of the camera and the projector are, respectively,
In the 3D measurement experiments, the sculpture shown in Fig. 3(a) is first measured by the three-frequency phase-shifting approach. We adopt nine-, five- and five-step phase shifting for ${T_1} = 80\; pixels$, ${T_2} = 120\; pixels$ and ${T_3} = 270\; pixels$, respectively. Figure 3(b) shows the wrapped phase obtained from the fringe patterns with ${T_1} = 80\; pixels$. The 3D measurement result of the sculpture is shown in Fig. 3(c), where the minimum and maximum depths of the sculpture are ${z_{\min }} = 546.7465\; mm$ and ${z_{\max }} = 595.6813\; mm$, respectively.
Then, the minimum and maximum distances of the object are separately adopted to generate the artificial phase map for absolute phase unwrapping by geometric constraint. Concretely, the artificial phase map ${\Phi _{\min }}$ is created at the virtual plane $z = 546\; mm$, as described in section 2.3. The wrapped phase shown in Fig. 3(b) is then unwrapped by substituting the minimum phase map into Eq. (13), and the reconstructed 3D shape of the sculpture is shown in Fig. 4(a), where the minimum and maximum depths are ${z_{\min }} = 546.7465\; mm$ and ${z_{\max }} = 595.6813\; mm$, respectively. This result is the same as that obtained by the three-frequency phase-shifting approach. In comparison, when the artificial phase map is created at the virtual plane $z = 596\; mm$ for absolute phase unwrapping, the reconstructed 3D shape is shown in Fig. 4(b), where the minimum and maximum depths are ${z_{\min }} = 629.7983\; mm$ and ${z_{\max }} = 697.2186\; mm$, respectively, which differ from the results of the three-frequency phase-shifting approach. These experimental results prove that when $\Delta ({u^c},{v^c}) \ge 0$ at every point on the CCD pixel coordinates, the minimum depth distance should be selected for the virtual plane to create the artificial absolute phase map in the geometric constraint approach.
To verify the above conclusion more intuitively, we randomly select a row on the CCD pixel coordinates and compare the depth distributions of that row in Figs. 3(c), 4(a) and 4(b). The results are shown in Fig. 5, where the red dashed line, the blue solid line and the black solid line represent the depth distributions of the row in Figs. 3(c), 4(a) and 4(b), respectively. The red dashed line and the blue solid line overlap well, whereas the black solid line deviates upwards from the red dashed line, further verifying that when $\Delta ({u^c},{v^c}) \ge 0$ at every point on the CCD pixel coordinates, the minimum depth distance should be selected for the virtual plane to create the artificial absolute phase map.
For the case of monotonic decrease of depth distance, we develop another structured light system using the same hardware equipment as shown in Fig. 6.
The calibration of the system is conducted and the result is as follows. The intrinsic parameter matrices for the camera and the projector are, respectively,
We then apply this structured light system to 3D shape measurements of the sculpture with different phase unwrapping approaches. Figure 8(a) shows the 3D shape reconstructed by the three-frequency phase-shifting approach, in which the minimum and maximum depths are ${z_{\min }} = 517.9246\; mm$ and ${z_{\max }} = 579.4308\; mm$, respectively. Figure 8(b) shows the 3D shape reconstructed by the geometric constraint approach with the artificial phase map ${\Phi _{\min }}$ created at the virtual plane $z = 517\; mm$. The minimum and maximum depths are ${z_{\min }} = 463.6201\; mm$ and ${z_{\max }} = 516.9674\; mm$, respectively, which clearly differ from those obtained by the three-frequency phase-shifting approach. On the contrary, when the artificial phase map ${\Phi _{\max }}$ is created at the virtual plane $z = 580\; mm$ for absolute phase unwrapping, the reconstructed 3D shape is shown in Fig. 8(c), where the minimum and maximum depths are ${z_{\min }} = 517.9246\; mm$ and ${z_{\max }} = 579.4308\; mm$, respectively, the same as those obtained by the three-frequency phase-shifting approach. These experimental results prove that when $\Delta ({u^c},{v^c}) \le 0$ at every point on the CCD pixel coordinates, the maximum depth distance should be selected to create the artificial absolute phase map in the geometric constraint approach.
We also randomly select a row on the CCD pixel coordinates and compare the depth distributions of that row in Figs. 8(a)–8(c). The results are shown in Fig. 9, where the red dashed line, the black solid line and the blue solid line represent the depth distributions of the row in Figs. 8(a)–8(c), respectively. It can be seen that the red dashed line and the blue solid line overlap well, whereas the black solid line deviates downwards from the red dashed line, further verifying that when $\Delta ({u^c},{v^c}) \le 0$ at every point on the CCD pixel coordinates, the maximum depth distance should be selected for the virtual plane to create the artificial absolute phase map.
4. Summary
We have theoretically investigated the monotonicity of the depth $z({u^c},{v^c})$ with respect to the projector pixel coordinate ${u^p}$ in absolute phase unwrapping by geometric constraint, and presented a monotonic discriminant for choosing the correct depth distance of the virtual plane used to create the artificial absolute phase map. The discriminant indicates that when $\Delta ({u^c},{v^c}) \ge 0$ at every point on the CCD pixel coordinates, the minimum depth distance should be selected for the virtual plane; conversely, when $\Delta ({u^c},{v^c}) \le 0$, the maximum depth distance must be selected. Experimental results with two different structured light systems demonstrate the validity of the theoretical analysis.
Funding
Scientific Instrument Developing Project of the Chinese Academy of Sciences (YJKYYQ20180067).
Disclosures
The authors declare no conflicts of interest.
References
1. S. Gorthi and P. Rastogi, “Fringe projection techniques: Whither we are?” Opt. Laser Eng. 48(2), 133–140 (2010). [CrossRef]
2. S. Heist, C. Zhang, K. Reichwald, P. Kuhmstedt, G. Notni, and A. Tunnermann, “5D hyperspectral imaging: fast and accurate measurement of surface shape and spectral characteristics using structured light,” Opt. Express 26(18), 23366–23379 (2018). [CrossRef]
3. Y. Yin, M. Wang, B. Z. Gao, X. Liu, and X. Peng, “Fringe projection 3D microscopy with the general imaging model,” Opt. Express 23(5), 6846–6857 (2015). [CrossRef]
4. S. Zhang, “Recent progresses on real-time 3D shape measurement using digital fringe projection techniques,” Opt. Laser Eng. 48(2), 149–158 (2010). [CrossRef]
5. D. Malacara, Optical Shop Testing, 3rd ed. (John Wiley & Sons, 2007).
6. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. 22(24), 3977–3982 (1983). [CrossRef]
7. Q. Kemao, “Windowed Fourier transform for fringe pattern analysis,” Appl. Opt. 43(13), 2695–2702 (2004). [CrossRef]
8. Y. Rivenson, Y. Zhang, H. Günaydın, D. Teng, and A. Ozcan, “Phase recovery and holographic image reconstruction using deep learning in neural networks,” Light: Sci. Appl. 7(2), 17141 (2018). [CrossRef]
9. J. Dai, Y. An, and S. Zhang, “Absolute three-dimensional shape measurement with a known object,” Opt. Express 25(9), 10384–10396 (2017). [CrossRef]
10. C. Zuo, L. Huang, M. Zhang, Q. Chen, and A. Asundi, “Temporal phase unwrapping algorithms for fringe projection profilometry: A comparative review,” Opt. Laser Eng. 85, 84–103 (2016). [CrossRef]
11. Y. An, J. S. Hyun, and S. Zhang, “Pixel-wise absolute phase unwrapping using geometric constraints of structured light system,” Opt. Express 24(16), 18445–18459 (2016). [CrossRef]
12. J. S. Hyun and S. Zhang, “Enhanced two-frequency phase-shifting method,” Appl. Opt. 55(16), 4395–4401 (2016). [CrossRef]
13. B. Li, Y. An, and S. Zhang, “Single-shot absolute 3D shape measurement with Fourier transform profilometry,” Appl. Opt. 55(19), 5219–5225 (2016). [CrossRef]
14. B. Li, Z. Liu, and S. Zhang, “Motion-induced error reduction by combining Fourier transform profilometry with phase-shifting profilometry,” Opt. Express 24(20), 23289–23303 (2016). [CrossRef]
15. C. Jiang, B. Li, and S. Zhang, “Pixel-by-pixel absolute phase retrieval using three phase-shifted fringe patterns without markers,” Opt. Laser Eng. 91, 232–241 (2017). [CrossRef]
16. Y. An and S. Zhang, “Pixel-by-pixel absolute phase retrieval assisted by an additional three-dimensional scanner,” Appl. Opt. 58(8), 2033–2041 (2019). [CrossRef]
17. S. Lv, Q. Sun, J. Yang, Y. Jiang, F. Qu, and J. Wang, “An improved phase-coding method for absolute phase retrieval based on the path-following algorithm,” Opt. Laser Eng. 122, 65–73 (2019). [CrossRef]
18. S. Zhang and P. S. Huang, “Novel method for structured light system calibration,” Opt. Eng. 45(8), 083601 (2006). [CrossRef]
19. B. Li, N. Karpinsky, and S. Zhang, “Novel calibration method for structured-light system with an out-of-focus projector,” Appl. Opt. 53(16), 3415–3426 (2014). [CrossRef]