Reference image based phase unwrapping framework for a structured light system

Abstract

A novel real-time full-field phase unwrapping framework is proposed for a structured light system with one projector and one camera. In this framework, only four patterns (three fringe patterns and one binary speckle pattern) are required to measure the absolute 3D shape of the targets. Before measurement, we use the structured light system to capture four images of a nearly planar target (e.g., a wall); the speckle image is taken as the reference image, and the corresponding absolute phase map is computed and stored. Each pixel in the reference image can thus be mapped to an absolute phase value. In this way, if we can establish correspondences between the current and reference speckle images during measurement using a matching algorithm, we can directly map absolute phase values to the pixels of the current image. The mapped absolute phases are then used to determine the period of the relative phases. The experimental results verify the effectiveness and efficiency of the proposed framework. On a consumer-grade GPU (Nvidia GTX 1060), our method runs at 187 fps.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Three-dimensional (3D) shape measurement has wide applications in the fields of product inspection, reverse engineering, 3D printing, human body measurement, human–machine interaction, animation production, and so on [1–3]. Among the existing 3D shape measurement methods, phase-shifting methods have advantages in measurement accuracy [4].

A phase-shifting based system usually requires one projector and either one or two cameras. For fast measurement, the number of projected patterns should be minimized. A dual-camera system can achieve fast absolute 3D shape measurement using only three patterns with embedded speckles [5,6]. However, such a system requires an extra camera, which increases hardware cost. Moreover, the dual-camera system suffers from more shadow-related problems because all three devices must see a point in order to measure it. Therefore, using a single projector and a single camera for fast shape measurement is desirable. Spatial phase unwrapping methods [7,8] recover absolute phases using only three phase-shifted fringe patterns. However, these methods only work if the surface geometry is smooth; it is very challenging to apply them to targets with abrupt depth changes or to the absolute shape of multiple isolated targets. Temporal phase unwrapping methods can overcome these shortcomings [9–11], but they require additional patterns to be projected and therefore come at the cost of measurement speed.

In order to improve the measurement speed, the number of additional patterns has to be reduced. Zhang [12] uses an additional stair image for phase unwrapping, where the stair changes according to the 2π discontinuities. Zuo et al. [13] use four patterns to retrieve the absolute phase in the presence of surface discontinuities and isolated objects. Two phase maps are obtained simultaneously from the four fringe images: one is the wrapped phase, and the other, called the base phase, is used to assist phase unwrapping. The now popular digital light processing (DLP) LightCrafter 4500 can switch binary patterns at 4225 Hz, or 8-bit grayscale patterns at 120 Hz. With this projector, Hyun et al. [14] use five binary patterns to perform 3D shape measurement at 667 Hz, where three dense binary dithered patterns are used to compute the wrapped phase and the average intensity is combined with two additional binary patterns to determine the fringe order.

Kinect is a well-known consumer-grade 3D sensor whose 3D imaging module consists of a NIR speckle projector and a CMOS camera. It outputs depth images at a frame rate of 30 fps. The core of its 3D imaging algorithm is reference image based depth mapping [15–17]. A similar technique is also applied in the Intel RealSense and the iPhone X. Inspired by the reference image based depth mapping technique, we developed a phase unwrapping framework for the structured light system based on phase shifting.

The main technical contribution of this work is a novel real-time full-field phase unwrapping framework for the structured light system using a single projector and a single camera. In our framework, only four patterns (three fringe patterns and one binary speckle pattern) are required to measure the absolute 3D shape of the targets. Before the measurement, we use the structured light system to capture four images of a nearly planar target (e.g., a wall); the speckle image is taken as the reference image, and the corresponding relative and absolute phase maps are computed and stored. Thus each pixel in the reference image can be mapped to an absolute phase value. In this way, if we can establish correspondences between the current and reference speckle images during measurement using a matching algorithm, we can directly map absolute phase values to the pixels of the current image. This idea is illustrated in Fig. 1. Since the accuracy of the speckle image matching is relatively low, we do not use the mapped absolute phases directly to compute the shape of the object, but instead use them for phase unwrapping, that is, to determine the period of the relative phase. In addition, we have publicly released the source code for this work (Code 1, Ref. [18]).

Fig. 1. Reference image based absolute phase mapping.

The work most closely related to ours is that of An and Zhang [19]. They first use an additional binary random pattern to establish correspondences between the camera image and the projector image; the resulting rough disparity map is then refined using cubic spline interpolation and phase constraints. However, our method differs substantially from theirs. As they state, a standard stereo system uses two cameras with the same sensor size, the same optics, and the same pixel size, whereas a structured light system consisting of a projector and a camera does not satisfy these assumptions. First, the sensor parameters (sensor sizes, physical pixel sizes, and lenses) are different. Second, the camera and the projector have different image generation mechanisms. These differences make it difficult to match the projector image and the camera image. The framework proposed in this work only needs to match images captured by the same camera, which is relatively easy to solve using existing techniques. Compared with [19], our method does not require extensive post-processing such as interpolation, so it is more direct and easier to implement.

In the rest of the paper, Section 2 reviews the basic principle of phase shifting method. Section 3 presents the proposed phase unwrapping framework in detail. Section 4 shows experimental results. Finally, Section 5 concludes the paper.

2. Three-step phase-shifting algorithm

Phase shifting is a well-known fringe projection method for 3D surface measurement. A group of phase-shifted patterns is projected onto the surface of the targets, and the phase is calculated at each pixel. Generally, the minimum number of patterns is three, but more patterns enhance the accuracy of the reconstructed phase. For high-speed applications, a three-step phase-shifting algorithm is desirable. For the classical three-step phase-shifting algorithm, the three fringe patterns can be described as [4]

$$\begin{aligned} I_1(x,y) &= I'(x,y) + I''(x,y)\cos\left(\phi(x,y) - 2\pi/3\right)\\ I_2(x,y) &= I'(x,y) + I''(x,y)\cos\left(\phi(x,y)\right)\\ I_3(x,y) &= I'(x,y) + I''(x,y)\cos\left(\phi(x,y) + 2\pi/3\right) \end{aligned} \tag{1}$$
where I′(x,y) denotes the average intensity, I″(x,y) is the intensity modulation, and ϕ(x,y) represents the phase. The phase ϕ corresponds to the projector coordinate and is computed as
$$\phi = \frac{x^p}{w}\,2\pi N \tag{2}$$
where x^p is the projector x coordinate, N is the number of fringes, and w is the horizontal resolution of the projection pattern. This means that if the phase ϕ is known, x^p can be calculated immediately. Subsequently, the depth d can be calculated by triangulating the camera ray against the projector plane defined by x^p.

Solving Eq. (1), we can obtain only the wrapped phase, or relative phase, ϕ′ ∈ (0, 2π), as follows:

$$\phi'(x,y) = \arctan\left(\frac{\sqrt{3}\,(I_1 - I_3)}{2I_2 - I_1 - I_3}\right) \tag{3}$$

Then, the absolute phase ϕ to be recovered in Eq. (1) can be written as

$$\phi(x,y) = \phi'(x,y) + 2k\pi \tag{4}$$
where k is the fringe order (k = 0, 1, 2, …, N − 1).

The fringe order k in Eq. (4) cannot be determined from the three fringe patterns alone. In the next section, we present our novel framework for determining k.
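
As an illustration of Eqs. (3) and (4), the following NumPy sketch (not the authors' released code; the function names are ours) computes the wrapped phase with the two-argument arctangent, which resolves the quadrant, and unwraps it once the fringe order k is known:

    import numpy as np

    def wrapped_phase(I1, I2, I3):
        # Eq. (3): arctan2 resolves the quadrant; shift into [0, 2*pi)
        phi_w = np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)
        return np.mod(phi_w, 2.0 * np.pi)

    def absolute_phase(phi_w, k):
        # Eq. (4): add 2*pi times the per-pixel fringe order
        return phi_w + 2.0 * np.pi * k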

3. Proposed framework

3.1 Framework pipeline

In this work, we propose a novel computation framework for a 3D shape measurement system using a single projector and a single camera. The three fringe patterns are used to compute the wrapped phase map, and the binary speckle pattern is used for unwrapping. The key feature of this framework is phase unwrapping based on the reference speckle image. The pipeline of the framework is shown in Fig. 2.

Fig. 2. Reference image based unwrapping framework.

Considering that the phase unwrapping of a planar target is relatively simple, we use the structured light system to scan a nearly planar target (e.g., a wall) to acquire the reference speckle image and the corresponding relative and absolute phase maps before the start of the measurement. Thus, each pixel in the reference speckle image is assigned a relative phase and an absolute phase. The obtained speckle image and the corresponding relative and absolute phase maps are stored and later used for phase unwrapping. Note that the reference image only needs to be obtained once for a calibrated structured light system.

To unwrap the relative phase computed from the three fringe images during measurement, the current speckle image is matched to the reference speckle image. Since corresponding points should have similar relative phases, the relative phase information is also used for speckle image matching. Once the dense correspondences are established, rough absolute phases ϕ_rough(x,y) can be mapped to the pixels of the current frame, as illustrated in Fig. 1. Because the accuracy of the speckle image matching is relatively low, we do not use ϕ_rough(x,y) to recover the shape of the objects but to determine the fringe order in Eq. (4):

$$k = \operatorname{round}\left(\frac{\phi_{\mathrm{rough}}(x,y) - \phi'(x,y)}{2\pi}\right) \tag{5}$$
where round(·) denotes rounding to the nearest integer.
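
A minimal sketch of Eq. (5) (again our own illustration, not the released implementation); its output can be fed directly to absolute_phase() above:

    import numpy as np

    def fringe_order(phi_rough, phi_w):
        # Eq. (5): the gap between the rough absolute phase and the
        # wrapped phase is rounded to the nearest multiple of 2*pi
        return np.round((phi_rough - phi_w) / (2.0 * np.pi))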

The rest of this section will detail each of these major components, including the preparation of the reference image and the speckle image matching algorithm.

3.2 Reference image preparation

As mentioned in subsection 3.1, we have to calculate the relative and absolute phases for each pixel in the reference speckle image. The relative phases can be obtained directly using Eq. (3). The key step of the reference image preparation is to obtain the absolute phases for the nearly planar target, which is relatively easy compared with general targets. We use the method of Zhang and Huang [2] to obtain the absolute phases: an additional pattern with a white line in the center is projected for phase unwrapping, and once the absolute phases of the pixels on the center line are determined, the other pixels can be determined accordingly.

We generated a binary speckle pattern starting from a black low-resolution pattern according to the following rules [20]:

  • (1) Speckle size: each white dot has a regular size of P × P pixels, where P is determined by the resolutions of the projector and camera used (in this work, P = 2). For the low-resolution pattern, P = 1.
  • (2) Speckle density: in each area equivalent to 3 × 3 dots, only one dot is white.
  • (3) Speckle distribution: the white dots are randomly scattered, and no two dots are adjacent in an 8-neighborhood.

After the low-resolution binary pattern is generated, we upsample it by a factor of P using nearest-neighbor interpolation to obtain the full-resolution speckle pattern. Part of the generated speckle pattern is shown in Fig. 3.
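
One plausible way to implement these rules is sketched below (the paper gives no pseudocode, so details such as the bounded retry loop are our assumptions):

    import numpy as np

    def speckle_pattern(height, width, P=2, seed=0):
        rng = np.random.default_rng(seed)
        low = np.zeros((height // P, width // P), dtype=np.uint8)
        # rule (2): one white dot per 3x3 block of the low-resolution grid
        for by in range(0, low.shape[0] - 2, 3):
            for bx in range(0, low.shape[1] - 2, 3):
                for _ in range(20):  # retry until rule (3) is satisfied
                    dy, dx = rng.integers(0, 3, size=2)
                    y, x = by + dy, bx + dx
                    y0, x0 = max(y - 1, 0), max(x - 1, 0)
                    if low[y0:y + 2, x0:x + 2].sum() == 0:  # empty 8-neighborhood
                        low[y, x] = 255
                        break
        # rule (1) + upsampling: every dot becomes a P x P white square
        return np.kron(low, np.ones((P, P), dtype=np.uint8))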

Fig. 3. Binary speckle pattern.

3.3 Speckle image matching

3.3.1 Image rectification

In traditional stereo matching, a stereo pair with calibrated parameters is rectified to obtain row-aligned epipolar geometry, which speeds up the matching process [21,22]. In our phase unwrapping framework, we have to establish correspondences between the current image and the reference image. Similarly, we want to search for corresponding points along a horizontal line during speckle image matching.

The geometry of the structured light system is illustrated in Fig. 4. If the baseline is parallel to the X axis of the camera, the image points of 3D space points at different depths on the ray from the projector center o_p through a pixel p in the projector plane lie on the same horizontal line, as illustrated in Fig. 4. So in our framework, each camera image is rectified before any computation so that the new X axis of the camera is parallel to the baseline. Next, we discuss how to rectify the camera images.

Fig. 4. Geometry of the structured light system.

According to the previous discussion, we take:

The new X axis parallel to the baseline:

$$r_1 = (o_c - o_p)/\|o_c - o_p\| \tag{6}$$

The new Y axis orthogonal to the new X axis and to r̄3:

$$r_2 = \bar{r}_3 \times r_1 \tag{7}$$

where r̄3 is the old Z axis.

The new Z axis orthogonal to the new X and Y axes:

$$r_3 = r_1 \times r_2 \tag{8}$$

Then, the new rotation matrix of the camera is

$$R = \begin{pmatrix} r_1^{T} \\ r_2^{T} \\ r_3^{T} \end{pmatrix} \tag{9}$$
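
A compact NumPy sketch of Eqs. (6)–(9), where o_c and o_p are the camera and projector centers and R_old is the camera's original rotation (the extra normalization of r2 is our numerical safeguard, not part of the equations):

    import numpy as np

    def rectification_rotation(o_c, o_p, R_old):
        # Eq. (6): new X axis along the baseline
        r1 = (o_c - o_p) / np.linalg.norm(o_c - o_p)
        r3_old = R_old[2]                  # old Z axis (third row)
        # Eq. (7): new Y axis orthogonal to the old Z axis and r1
        r2 = np.cross(r3_old, r1)
        r2 /= np.linalg.norm(r2)           # numerical safeguard
        # Eq. (8): new Z axis completes the right-handed frame
        r3 = np.cross(r1, r2)
        # Eq. (9): stack the axes as rows of the new rotation matrix
        return np.vstack([r1, r2, r3])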

In the next subsection, we describe in detail our speckle image matching method, which operates on the rectified images.

3.3.2 Speckle image matching

The problem of speckle image matching in our framework is similar to traditional stereo matching: it aims to establish dense and reliable correspondences between two images in which corresponding points lie on the same row. There are plenty of algorithms for this task on natural images [23], including local and global methods. In this paper, we designed a real-time speckle image matching algorithm based on PatchMatch [24], combined with the relative phase information. The original PatchMatch method for stereo matching randomly optimizes 3D disparity planes [24], which makes it independent of the range of disparities. In recent years, this method has been widely used in real-time systems by optimizing one-dimensional disparities instead [25,26].

The selection of the cost function is important for stereo matching. The census transform [27] is a binary mapping that encodes local image structure with the relative orderings of pixel intensities rather than the intensity values themselves, and therefore tolerates outliers due to radiometric changes and image noise. It shows good performance in the evaluation by Hirschmüller and Scharstein [28]. We use a 7 × 9 window to encode each pixel's local structure in a 64-bit string Census. The dissimilarity measure of a pixel p and its candidate correspondence q is defined as the Hamming distance of their bit strings Census(p) and Census(q).

The original census transform for a pixel p is calculated as follows:

$$\mathrm{Census}(p,i) = \begin{cases} 1, & I(p,i) > I(p) \\ 0, & I(p,i) \le I(p) \end{cases} \tag{10}$$
where Census(p,i) is the i-th binary value of the bit string Census(p), I(p) represents the gray value of the pixel p, and I(p,i) is the gray value of the i-th pixel of the window centered on the pixel p.

We find that it is better to compare the gray values of the pixels in the window with the mean gray value of the window instead of the gray value of the center pixel. So we modify Eq. (10) to

$$\mathrm{Census}(p,i) = \begin{cases} 1, & I(p,i) > u(p) \\ 0, & I(p,i) \le u(p) \end{cases} \tag{11}$$
where u(p) is the mean gray value of the window centered on the pixel p.
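
A sketch of the mean-centered census transform of Eq. (11) and its Hamming-distance cost (our own illustration; the 7 × 9 window yields 63 comparisons stored in a 64-bit word):

    import numpy as np
    from scipy.ndimage import uniform_filter

    def mean_census(img, win=(7, 9)):
        h, w = img.shape
        ry, rx = win[0] // 2, win[1] // 2
        u = uniform_filter(img.astype(np.float32), size=win)  # window mean u(p)
        pad = np.pad(img.astype(np.float32), ((ry, ry), (rx, rx)), mode='edge')
        code = np.zeros((h, w), dtype=np.uint64)
        for dy in range(win[0]):
            for dx in range(win[1]):
                bit = pad[dy:dy + h, dx:dx + w] > u   # Eq. (11) comparison
                code = (code << np.uint64(1)) | bit.astype(np.uint64)
        return code

    def census_cost(c1, c2):
        # Hamming distance between two census codes
        return bin(int(c1) ^ int(c2)).count('1')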

As stated in subsection 3.1, corresponding points should have similar relative phases, so the relative phase information is also used for speckle image matching. If p_c and p_r are a pair of corresponding points, they should satisfy the following phase constraint:

$$|\phi'(p_c) - \phi'(p_r)| < T_\phi \tag{12}$$

Once every patch in the reference speckle image and the current speckle image is mapped to a binary representation using Eq. (11), we use a variant of the PatchMatch stereo framework to compute the dense correspondences. The main steps of this framework are initialization, propagation, and post-processing; a sketch of the propagation step is given after the list.

  • (1) Initialization

    In this step, five random disparities are sampled per pixel and their matching cost is evaluated in parallel. The candidate with the minimal Hamming distance is kept.

  • (2) Propagation

    In total we run propagations in four directions: left to right, top to bottom, right to left, and bottom to top. For each propagation direction, we denote p_pre as the pixel preceding the current pixel p. The disparity of the current pixel p is recalculated under the phase constraint of Eq. (12) within the range [d_pre − 1, d_pre + 1], where d_pre is the disparity of p_pre. If the minimal matching cost in this range is lower than the current cost of p, we update the disparity of p with the best disparity in [d_pre − 1, d_pre + 1].

  • (3) Post-processing

    In our post-processing step, we invalidate disparities associated with large Hamming distances (larger than a predefined threshold T_c), and we run connected-component analysis followed by a minimum region check to remove outliers. Then, we run the propagations again only for the pixels without valid disparities and stop the propagation if the cost of the best disparity is larger than T_c. Finally, a median filter is run on 3 × 3 patches to further reduce noise while preserving edges.
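
A single-threaded sketch of one left-to-right propagation pass (the released implementation runs the passes in parallel on the GPU; match_cost and phase_ok are stand-ins for the Hamming cost on Eq. (11) codes and the constraint of Eq. (12)):

    def propagate_left_to_right(disp, cost_of, match_cost, phase_ok):
        # disp: current disparity map; cost_of: its per-pixel matching cost
        h, w = disp.shape
        for y in range(h):
            for x in range(1, w):
                d_pre = disp[y, x - 1]
                # test [d_pre - 1, d_pre + 1], seeded by the previous pixel
                for d in (d_pre - 1, d_pre, d_pre + 1):
                    if not phase_ok(y, x, d):
                        continue
                    c = match_cost(y, x, d)
                    if c < cost_of[y, x]:
                        disp[y, x], cost_of[y, x] = d, c
        return disp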

4. Experiments

We developed a structured light system to verify the performance of the proposed framework. The system includes a CCD camera and a DLP projector with a resolution of 1024 × 768. The baseline length is 207.27 mm, and the distance from the camera to the reference plane is 127 cm. The camera is fitted with an 8 mm lens (Computar M0814-MP2) and has a resolution of 640 × 480 with a maximum frame rate of 160 frames per second (fps). The test program was implemented on a computer with an Intel Core i5-2500HQ CPU (3.3 GHz). To achieve real-time measurement, we implemented the variant of the PatchMatch stereo framework in parallel on an Nvidia GPU (GTX 1060). The initialization takes 0.94 ms, the propagation process requires 3.93 ms, and the post-processing takes 0.46 ms. The total time of the variant of the PatchMatch stereo framework is 5.33 ms (187 Hz).

Unlike in a dual-camera system, gamma correction of the projector is important for the single-camera system. In this paper, we used the look-up table method in [29] to correct the gamma nonlinearity of the projector. In all experiments, we set the size of the matching window W to 7 × 9, T_ϕ = 0.35, and T_c = 20.

We first measured a 30 cm tall David statue with a complex surface geometry. The object speckle image, the reference speckle image, and the corresponding reference absolute phase map are shown in Fig. 1. Figure 5 shows the main stages of the proposed framework. Figure 5(a) is the disparity map after random initialization, Fig. 5(b) shows the disparity map after propagation, and Fig. 5(c) is the disparity map after post-processing. The results show that a high quality disparity map can be obtained with the matching algorithm of subsection 3.3. Figures 5(d) and 5(e) show the relative phase map and the unwrapped absolute phase map of the target, respectively. Figure 5(f) shows the 3D reconstruction result of the statue.

Fig. 5. Results of David plaster statue. (a) Disparity map after random initialization. (b) Disparity map after propagation. (c) Disparity map after post-processing. (d) The relative phase map. (e) The absolute phase map. (f) 3D reconstruction result.

Second, to verify the performance of our framework on multiple isolated targets, we used the structured light system to measure two separate statues at different distances. One of them is 30 cm tall and 85 cm away from the camera; the other is 15 cm tall and 120 cm away from the camera, as shown in Fig. 6. Figure 6(a) shows the speckle image, (b) shows the obtained disparity map, (c) is the relative phase map, (d) is the unwrapped absolute phase map, and (e) shows the 3D reconstruction of the two statues.

Fig. 6. Results of two isolated objects. (a) Speckle image. (b) Disparity map. (c) The relative phase map. (d) The final absolute phase map. (e) 3D reconstruction result.

Furthermore, a colorful cloth was used to verify the effectiveness of the proposed framework on objects with complex texture, as shown in Fig. 7. Figure 7(a) is the gray image of the colorful cloth, (b) is the speckle image, (c) is the disparity map, (d) shows the relative phase map, (e) shows the unwrapped absolute phase map, and (f) shows the 3D reconstruction of the colorful cloth. The results show that the proposed framework is also effective for objects with complex texture.

Fig. 7. Results of colorful cloth. (a) Gray image. (b) Speckle image. (c) Dense disparity map. (d) The relative phase map. (e) The final absolute phase map. (f) 3D reconstruction result.

Finally, we analyze the influence of the matching error on the unwrapping error, assuming first that the relative phase is accurate. In Fig. 8, the red, green, and thick black lines denote the accurate absolute phase, the rough absolute phase, and the relative phase, respectively. The fringe order k can be calculated correctly as long as the error of the estimated rough absolute phase lies within (−π, π). In our experiment, the width of a projected fringe is 33 pixels; in other words, the allowable matching error range is (−16.5 pixels, 16.5 pixels) with respect to the projector pattern. Therefore, a small matching error (e.g., 1 or 2 pixels) will not affect the result of the phase unwrapping, so the sub-pixel refinement of traditional stereo matching [22] is not needed in our matching procedure. However, a large matching error will lead to a wrong absolute phase.
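
To make this error budget explicit: one fringe period of 2π spans 33 projector pixels, so a matching error of Δx pixels perturbs the rough absolute phase by

$$\Delta\phi_{\mathrm{rough}} = \frac{2\pi}{33}\,\Delta x, \qquad |\Delta\phi_{\mathrm{rough}}| < \pi \;\Longleftrightarrow\; |\Delta x| < 16.5\ \text{pixels},$$

and Eq. (5) still rounds to the correct fringe order; a 2-pixel matching error, for instance, shifts the rough phase by only about 0.38 rad.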

Fig. 8. Relationship among the rough absolute phase, accurate absolute phase and relative phase.

5. Conclusions

This paper presented a novel phase unwrapping framework in which only four patterns are required to measure the absolute 3D shape of targets. Considering that the phase unwrapping of a planar target is fairly simple, we use the structured light system to capture four images of a nearly planar target before starting the measurement; the speckle image is taken as the reference image, and the corresponding relative and absolute phase maps are computed. Each pixel in the reference image can thus be mapped to an absolute phase value. Once correspondences between the current and reference speckle images have been created using the speckle image matching algorithm, we can directly map absolute phase values to the pixels of the current image. Since the accuracy of the speckle image matching is relatively low, we do not use the mapped absolute phases directly to compute the shape of the object, but use them to determine the fringe order. The experimental results verified the effectiveness of the proposed framework for targets with complex geometry, multiple isolated targets at different distances from the camera, and colorful objects with complex texture. Moreover, we implemented the speckle image matching algorithm on a GPU for real-time measurement. For a calibrated structured light system, the reference image only needs to be obtained once.

The core of our unwrapping framework is speckle image matching, which is also at the core of widely used 3D sensors such as Kinect and RealSense, so we believe the proposed unwrapping framework is practical. In the future, we will study high-speed 3D shape measurement by combining the proposed phase unwrapping framework with dithering techniques [6].

Funding

National Natural Science Foundation of China (NSFC) (61402489); Opening Project of Jiangsu Key Laboratory of Advanced Numerical Control Technology (KXJ201608).

References

1. F. Blais, "Review of 20 years of range sensor development," J. Electron. Imaging 13(1), 231–243 (2004).

2. S. Zhang and P. S. Huang, "Novel method for structured light system calibration," Opt. Eng. 45(8), 083601 (2006).

3. J. Shotton, R. Girshick, A. Fitzgibbon, T. Sharp, M. Cook, M. Finocchio, R. Moore, P. Kohli, A. Criminisi, A. Kipman, and A. Blake, "Efficient human pose estimation from single depth images," IEEE Trans. Pattern Anal. Mach. Intell. 35(12), 2821–2840 (2013).

4. S. Zhang, "Recent progresses on real-time 3D shape measurement using digital fringe projection techniques," Opt. Lasers Eng. 48(2), 149–158 (2010).

5. W. Lohry, V. Chen, and S. Zhang, "Absolute three-dimensional shape measurement using coded fringe patterns without phase unwrapping or projector calibration," Opt. Express 22(2), 1287–1301 (2014).

6. W. Lohry and S. Zhang, "High-speed absolute three-dimensional shape measurement using three binary dithered patterns," Opt. Express 22(22), 26752–26762 (2014).

7. S. Zhang and S. T. Yau, "High-resolution, real-time 3-D absolute coordinate measurement based on a phase-shifting method," Opt. Express 14(7), 2644–2649 (2006).

8. H. Cui, W. Liao, N. Dai, and X. Cheng, "A flexible phase-shifting method with absolute phase marker retrieval," Measurement 45(1), 101–108 (2012).

9. G. Sansoni, M. Carocci, and R. Rodella, "Three-dimensional vision based on a combination of gray-code and phase-shift light projection: analysis and compensation of the systematic errors," Appl. Opt. 38(31), 6565–6573 (1999).

10. Q. Zhang, X. Su, L. Xiang, and X. Sun, "3-D shape measurement based on complementary Gray-code light," Opt. Lasers Eng. 50(4), 574–579 (2012).

11. D. Zheng and F. Da, "Phase coding method for absolute phase retrieval with a large number of codewords," Opt. Express 20(22), 24139–24150 (2012).

12. S. Zhang, "Composite phase-shifting algorithm for absolute phase measurement," Opt. Lasers Eng. 50(11), 1538–1541 (2012).

13. C. Zuo, Q. Chen, G. Gu, S. Feng, and F. Feng, "High-speed three-dimensional profilometry for multiple objects with complex shapes," Opt. Express 20(17), 19493–19510 (2012).

14. J. S. Hyun and S. Zhang, "Superfast 3D absolute shape measurement using five binary patterns," Opt. Lasers Eng. 90, 217–224 (2017).

15. B. Freedman, A. Shpunt, and M. Machline, "Depth mapping using projected patterns," U.S. Patent 8,150,142 (3 April 2012).

16. G. Wang, X. Yin, X. Pei, and C. Shi, "Depth estimation for speckle projection system using progressive reliable points growing matching," Appl. Opt. 52(3), 516–524 (2013).

17. X. Yin, G. Wang, C. Shi, and Q. Liao, "Efficient active depth sensing by laser speckle projection system," Opt. Eng. 53(1), 013105 (2014).

18. W. Bao, X. Xiao, Y. Xu, and X. Zhang, "Phase Unwrapping Source Code," GitHub (2018) [retrieved 4 Apr 2018], https://github.com/YuhuaXu/PhaseUnwrapping.

19. Y. An and S. Zhang, "Three-dimensional absolute shape measurement by combining binary statistical pattern matching with phase-shifting methods," Appl. Opt. 56(19), 5418–5426 (2017).

20. Y. Zhang, Z. Xiong, Z. Yang, and F. Wu, "Real-time scalable depth sensing with hybrid structured light illumination," IEEE Trans. Image Process. 23(1), 97–109 (2014).

21. A. Fusiello, E. Trucco, and A. Verri, "A compact algorithm for rectification of stereo pairs," Mach. Vis. Appl. 12(1), 16–22 (2000).

22. D. Scharstein and R. Szeliski, "A taxonomy and evaluation of dense two-frame stereo correspondence algorithms," Int. J. Comput. Vis. 47(1–3), 7–42 (2002).

23. D. Scharstein, R. Szeliski, and H. Hirschmüller, "Middlebury Stereo Vision," http://vision.middlebury.edu/stereo.

24. M. Bleyer, C. Rhemann, and C. Rother, "PatchMatch stereo - stereo matching with slanted support windows," in Proceedings of the British Machine Vision Conference (BMVC, 2011), pp. 1–11.

25. S. R. Fanello, J. Valentin, C. Rhemann, A. Kowdle, V. Tankovich, P. Davidson, and S. Izadi, "UltraStereo: efficient learning-based matching for active stereo systems," in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2017), pp. 6535–6544.

26. M. Zollhöfer, C. Theobalt, M. Stamminger, M. Nießner, S. Izadi, C. Rehmann, C. Zach, M. Fisher, C. Wu, A. Fitzgibbon, and C. Loop, "Real-time non-rigid reconstruction using an RGB-D camera," ACM Trans. Graph. 33(4), 156 (2014).

27. R. Zabih and J. Woodfill, "Non-parametric local transforms for computing visual correspondence," in Proceedings of European Conference on Computer Vision (Springer, 1994), pp. 151–158.

28. H. Hirschmüller and D. Scharstein, "Evaluation of cost functions for stereo matching," in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2007), pp. 1–8.

29. S. Zhang and S. T. Yau, "Generic nonsinusoidal phase error correction for three-dimensional shape measurement using a digital video projector," Appl. Opt. 46(1), 36–43 (2007).
