Optica Publishing Group

Hyper thin 3D edge measurement of honeycomb core structures based on the triangular camera-projector layout & phase-based stereo matching

Open Access

Abstract

We propose a novel hyper thin 3D edge measurement technique to measure the 3D outer envelope profile of honeycomb core structures. The width of the edges of the honeycomb core is less than 0.1 mm. We introduce a triangular layout design consisting of two cameras and one projector to measure hyper thin 3D edges and eliminate data interference from the walls. A phase-shifting algorithm and the multi-frequency heterodyne phase-unwrapping principle are applied for phase retrieval on the edges. A new stereo matching method based on phase mapping and the epipolar constraint is presented to solve correspondence searching on the edges and remove false matches that result in 3D outliers. Experimental results demonstrate the effectiveness of the proposed method for measuring the 3D profile of honeycomb core structures.

© 2016 Optical Society of America

1. Introduction

Three-dimensional (3D) outer envelope profile measurement of honeycomb core structures has received much attention because of their wide applications in various fields, such as the aerospace, automotive, marine, and construction industries, all of which require lightweight structures [1]. The honeycomb core structure is an array of hollow cells formed between thin vertical walls, as shown in Fig. 1. The outer envelope surface refers to the surface of the upper edges of the vertical walls. The width of the edges to be profiled is less than 0.1 mm; such edges are called hyper thin edges.

Fig. 1 Honeycomb core: (a) structure and (b) local details of edges and walls. Edges are shown in white, and walls are shown in gray.

Reconstructing the 3D data of hyper thin edges is a significant challenge in 3D measurement. Locating the position of hyper thin edges using a contact coordinate measuring machine (CMM) is difficult: the contact force of the probe deforms the edge surface of a paper-based honeycomb core. Thus, the 3D data of the hyper thin edges of a honeycomb core are difficult to acquire using a CMM. Numerous image-based optical non-contact 3D shape measurement techniques [2] have been developed in recent years. However, most of these methods rely on image feature extraction based on regional template calculation. These methods are inappropriate for hyper thin edges, because such edges occupy only one or two pixels in the image, which makes feature extraction difficult.

Even assuming that the 3D data of a honeycomb core can be obtained, extracting the 3D data of hyper thin edges from the 3D profile data is another challenging issue. Existing extraction methods are divided into two categories. The first category applies 3D masks to detect the edges directly in the 3D data, such as 3 × 3 × 3 Zucker–Hummel masks [3] or a 3 × 3 × 3 cellular logic array [4]. The second category detects the edges in 2D depth images transformed from the 3D data, using improved Canny edge detector algorithms [5,6]. All these methods are feasible only on the premise that at least three 3D data points across the width direction of the edge are obtained. However, the edges of the honeycomb core are so thin that only one 3D data point across the width direction can be acquired. This characteristic causes current 3D edge detection methods to fail.

We therefore designed a novel method that measures the 3D data of hyper thin edges directly to realize outer envelope profile measurement. Moreover, the proposed method deliberately does not acquire the 3D data of the vertical wall surfaces, which avoids the need for a separate edge extraction step. The phase-shifting method with multi-frequency phase unwrapping is feasible for measuring the profile of hyper thin edges, because phase retrieval can be realized point by point [7]. This technique has been very successful in measuring discontinuous surfaces and high-contrast 3D surfaces [8,9].

In this paper, we introduce a hyper thin 3D edge measurement technique based on the triangular camera-projector layout and phase-based stereo matching to obtain the 3D outer envelope profile of honeycomb core structures. Two cameras and one projector form a triangular layout, which effectively obtains the 3D data of the hyper thin edges without acquiring the 3D data of the vertical wall surfaces. A new stereo matching method based on phase mapping and the epipolar constraint is proposed to solve correspondence searching on the edges and remove false matches that result in 3D outliers.

The paper is organized as follows. Principles of the proposed method are explained in Section 2. Measurement system and experimental results are discussed in Section 3, and conclusions are presented in Section 4.

2. Principle

The 3D profile measurement system based on stereovision and phase-shifting method [10] is widely used in 3D shape measurements because of its fast measurement speed, less sensitivity to surface reflectivity variations, and high spatial resolution [11].

In the proposed method, two cameras and one projector are designed to form a triangular layout. Thus, 3D data of the hyper thin edges can be obtained, but 3D data of the surface of vertical walls are not measured. The hyper thin 3D edge measurement technique for honeycomb core structures comprises three components:

  • 1. Triangular design of the system layout to acquire the 3D data of hyper thin edges, but not the data of the vertical wall surfaces.
  • 2. Phase retrieval algorithm for hyper thin edge measurement that calculates the phases on the step edges.
  • 3. Phase-based stereo matching algorithm that finds precise correspondences between the two cameras and eliminates the 3D outliers on the edges.

2.1 Triangle design of system layout to measure hyper thin 3D edges

For an arbitrary point Q on a non-transparent surface [Fig. 2(a)] to be measured, the following three basic conditions must hold:

Fig. 2 Three basic conditions for measurement: (a) traditional system layout; (b) vectors on honeycomb core.

  • 1. The fringe from the projector must be projected on the point Q to be measured;
  • 2. The left camera must be able to capture the fringe on point Q; and
  • 3. The right camera must be able to capture the fringe on point Q.

Let NQ be the surface normal at point Q, and let OL, OR, and OP denote the optical centers of the left camera, right camera, and projector, respectively. QL, QR, and QP are the vectors from the test point Q toward the left camera, right camera, and projector, indicating the directions of the reflected or incident light [Fig. 2(a)]. When occlusions by other surfaces are ignored, the three conditions above can be expressed as:

\[
\begin{cases}
\vec{QL} \cdot \vec{N}_Q > 0 \\
\vec{QR} \cdot \vec{N}_Q > 0 \\
\vec{QP} \cdot \vec{N}_Q > 0
\end{cases} \tag{1}
\]

A positive dot product indicates that the angle between the surface normal NQ and the corresponding incident or reflected light direction (QL, QR, or QP) is less than 90°.
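The visibility conditions of Eq. (1) reduce to a sign test on dot products. The following is a minimal sketch of that test (the function name and the toy geometry are our own illustrations, not from the paper):

```python
import numpy as np

def is_measurable(normal, view_dirs):
    """Eq. (1): a point is measurable only if every vector toward a camera
    or the projector makes an angle of less than 90 deg with the surface
    normal, i.e. every dot product is strictly positive."""
    n = normal / np.linalg.norm(normal)
    return all(float(np.dot(d, n)) > 0.0 for d in view_dirs)

# Edge point: normal points up (+z), and all devices sit above -> measurable.
dirs = [np.array([-0.3, 0.0, 1.0]),  # toward left camera
        np.array([0.3, 0.0, 1.0]),   # toward right camera
        np.array([0.0, 0.4, 1.0])]   # toward projector
print(is_measurable(np.array([0.0, 0.0, 1.0]), dirs))   # True

# Wall point: horizontal normal -> at least one dot product is non-positive.
print(is_measurable(np.array([-1.0, 0.0, 0.0]), dirs))  # False
```

This is exactly the test used below to distinguish edge points (all conditions hold, Rule I) from wall points (at least one fails, Rule II).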

For points on the honeycomb core, E and W indicate points on the edges and walls, respectively, with normals NE and NW. Similarly, EL, ER, and EP are the vectors from point E to the left camera, right camera, and projector, whereas WL, WR, and WP are the vectors from point W to the left camera, right camera, and projector, as shown in Fig. 2(b).

The edges and the walls may be reconstructed at the same time when measuring the honeycomb core with the traditional 3D system shown in Fig. 3(a) , because the vectors on the edges and walls meet the conditions of Eq. (1).

Fig. 3 Proposed camera-projector system layout for the 3D measurement of honeycomb core: (a) traditional system layout; (b) new layout of triangle geometry; and (c) maximum working range. The scope of B × H is the possible maximum working range at the working distance L.

Therefore, to measure only the 3D profile of the edges of the honeycomb core while excluding that of the vertical walls, the key rules for the design of the measurement system are as follows.

Rule I: All three conditions must be satisfied on the edge surfaces of the honeycomb core:

\[
\begin{cases}
\vec{EL} \cdot \vec{N}_E > 0 \\
\vec{ER} \cdot \vec{N}_E > 0 \\
\vec{EP} \cdot \vec{N}_E > 0
\end{cases} \tag{2}
\]

Rule II: At least one of the three basic measurement conditions must be violated on the wall surfaces of the honeycomb core:

\[
\vec{WL} \cdot \vec{N}_W < 0 \quad \text{or} \quad \vec{WR} \cdot \vec{N}_W < 0 \quad \text{or} \quad \vec{WP} \cdot \vec{N}_W < 0 \tag{3}
\]

Rule I is easily satisfied as long as the honeycomb core is placed in front of the measurement system. Rule II is the key to avoiding the acquisition of the 3D data of the vertical walls. We can change the system layout and exploit the self-occlusion of the walls to realize Rule II.

A new system design is introduced to realize Rules I and II. The proposed layout of the cameras and projector forms a triangular geometry, as shown in Fig. 3(b), in contrast to the traditional system layout in Fig. 3(a), in which the two cameras and one projector are placed almost in a straight line.

In Fig. 3, B indicates the length of the baseline between the left and right cameras, and H is the distance from the optical center of the projector to the baseline. We simulated the valid measurement scope according to Rules I and II, within which the edges of the honeycomb core can be measured but the 3D data of the vertical walls cannot be obtained. The possible maximum working range of the new layout for measuring the honeycomb core is B × H at the working distance L [Fig. 3(c)], in which Rule II is realized because of the self-occlusion of the walls. In actual measurement, the measurement scope is always kept smaller than the possible maximum working range to ensure the effectiveness of the measurement.

2.2 Phase retrieval algorithm on hyper thin edges

The hyper thin edges are extremely abrupt step surfaces. Thus, the phase-shifting algorithm and the multi-frequency phase-unwrapping principle are applied to retrieve the full-field phase on the edges.

A four-step phase-shifting algorithm is adopted to calculate the fringe phase. The intensities of the four captured fringe images Ii (i = 0, 1, 2, 3) are given by:

\[
I_i(x,y) = I_B(x,y) + I_A(x,y)\cos\!\left[\Phi(x,y) + i\,\pi/2\right] \tag{4}
\]
where (x, y) is the pixel coordinate on the image plane of the camera, IB(x, y) is the average intensity, and IA(x, y) is the modulation intensity. The fringe phase Φ(x, y) can be determined by:

\[
\Phi(x,y) = \arctan\frac{I_3(x,y) - I_1(x,y)}{I_0(x,y) - I_2(x,y)} \tag{5}
\]

Equation (5) gives phase values only between 0 and 2π. Thus, the multi-frequency heterodyne phase-unwrapping principle [12] is applied to remove the 2π discontinuities and obtain a continuous phase map. This is realized by projecting fringe patterns with three different frequencies. This temporal unwrapping scheme retrieves the full-field phase pixel by pixel, which is extremely important for hyper thin edge 3D measurement.
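The per-pixel phase computation of Eq. (5) and the heterodyne (phase-difference) step can be sketched as follows. This is a minimal illustration under our own naming, not the authors' implementation; the quadrant-resolving arctan2 is a standard way to obtain the full 2π range that Eq. (5) refers to:

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Eq. (5): wrapped phase from four fringe images shifted by pi/2 each.
    arctan2 resolves the quadrant, giving values over the full [0, 2*pi) range."""
    return np.mod(np.arctan2(i3 - i1, i0 - i2), 2.0 * np.pi)

def heterodyne(phi_a, phi_b):
    """Heterodyne principle: the wrapped difference of two phase maps with
    nearby pitches p_a < p_b behaves like a phase of much longer equivalent
    pitch p_a * p_b / (p_b - p_a), e.g. 12 and 13 pixels give 156 pixels."""
    return np.mod(phi_a - phi_b, 2.0 * np.pi)

# Synthetic single-pixel check: recover a true phase of 1.0 rad from the
# four intensities of Eq. (4).
true_phi = 1.0
imgs = [0.5 + 0.4 * np.cos(true_phi + i * np.pi / 2.0) for i in range(4)]
print(round(float(wrapped_phase(*imgs)), 6))  # 1.0
```

Applying `heterodyne` across the three pitch pairs yields progressively longer equivalent pitches, which is how the 2π ambiguity is removed over the whole field.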

2.3 Phase-based stereo matching for hyper thin edges

Stereo matching explores the correspondences between the left and right cameras for the 3D points to be measured. The edges of the honeycomb core are so thin that they span no more than two pixels in the image, which makes traditional phase-based stereo matching methods [12,13] difficult to apply. Another key issue is the outliers in the reconstructed 3D data, which originate from false matches on the edges and often occur in optical 3D measurement; the removal of these outliers must also be addressed.

We proposed two phase-matching methods in our previous work. One is based on the epipolar constraint [14], which reduces the stereo matching time. The other projects a series of horizontal and vertical fringe patterns and estimates the corresponding coordinates based on phase mapping [15]. In this paper, the three-step framework for searching correspondence points on the edges of the honeycomb core combines phase mapping and the epipolar constraint.

Step 1: Fringe images are captured, and the phase of the image points is retrieved pixel by pixel

A series of sinusoidal horizontal and vertical fringe patterns is projected, and the two cameras capture the fringe images synchronously. The four-step phase-shifting technique and the three-frequency heterodyne phase-unwrapping principle are used to calculate the vertical and horizontal absolute phase maps. A point m(x, y)T in the image thus corresponds to two absolute unwrapped phase values, namely, φ calculated from the vertical fringe images and ϕ calculated from the horizontal fringe images.

Step 2: Correspondence searching is performed based on mapping the image pixels to their corresponding coordinates in the image coordinate system of the projector

First, all points in the images from the two cameras are mapped to the corresponding coordinates in the image coordinate system of the projector. Based on the absolute unwrapped phase values of image points, the corresponding coordinates in the image coordinate system of the projector are mapped by the following equations:

\[
\begin{cases}
x_P = \varphi \times \lambda / 2\pi \\
y_P = \phi \times \lambda / 2\pi
\end{cases} \tag{6}
\]
where λ is the pitch of the projected sinusoidal fringe patterns, and b(xP, yP)T are the corresponding coordinates of the image point in the image coordinate system of the projector.

Then, let the coordinate of a point in the image coordinate system of the left camera be mL(xL, yL)T, and let its corresponding coordinate in the image coordinate system of the projector be bL(xP, yP)T (Fig. 4). Let the coordinates of the points in the image coordinate system of the right camera be {mR,j(xR,j, yR,j)T, j = 1,2,…,M} and their corresponding coordinates in the image coordinate system of the projector be {bR,j(xP,j, yP,j)T, j = 1,2,…,M}, where M is the number of pixels with valid phases in the right camera. Then, for the point bL(xP, yP)T, the nearest point bR,C(xP,C, yP,C)T in {bR,j(xP,j, yP,j)T, j = 1,2,…,M} can be found as follows:

\[
C = \underset{j=1,\dots,M}{\arg\min}\left[\,\left\| b_L(x_P,y_P)^T - b_{R,j}(x_{P,j},y_{P,j})^T \right\|_2 < d_{Max}\right] \tag{7}
\]
where C is the index into {bR,j(xP,j, yP,j)T, j = 1,2,…,M}, and dMax is the maximum allowed distance between bL and bR,C in the image coordinate system of the projector, which ensures that the correspondence bR,C is close to the point bL. The coordinate mR,C(xR,C, yR,C)T of the right camera is then taken as the correspondence of the coordinate mL(xL, yL)T of the left camera; it is found through the mapping relation with the same index C as bR,C(xP,C, yP,C)T (Fig. 4).
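The bounded nearest-point search of Eq. (7) can be sketched as below. The brute-force distance matrix and the `-1` sentinel are our own illustrative choices; a k-d tree would scale better for full images:

```python
import numpy as np

def find_correspondences(b_left, b_right, d_max):
    """Eq. (7): for each left-camera point mapped into projector
    coordinates, take the nearest mapped right-camera point, and reject
    the match when it is farther than d_max."""
    # Pairwise Euclidean distances between all left and right mapped points.
    dists = np.linalg.norm(b_left[:, None, :] - b_right[None, :, :], axis=2)
    idx = dists.argmin(axis=1)
    idx[dists.min(axis=1) >= d_max] = -1  # -1 marks "no correspondence"
    return idx

b_right = np.array([[10.0, 10.0], [50.0, 50.0]])
b_left = np.array([[10.4, 9.8], [200.0, 200.0]])
print(find_correspondences(b_left, b_right, d_max=2.0))
# first point matches index 0; second point has no neighbour within d_max
```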

Fig. 4 Correspondence searching procedure: (a) a point mL (shown in blue) in the image coordinate system of the left camera; (b) mapped points in the image coordinate system of the projector, with bR,C (shown in red) as the nearest point to bL (in blue) mapped from mL; and (c) points in the image coordinate system of the right camera, where bR,C is mapped from mR,C.

In the correspondence exploration of edge points in the image coordinate system of the projector, we use the nearest-point principle rather than the high-accuracy sub-pixel interpolation principle [15], because the latter cannot work on hyper thin edges, where only one or two pixels have valid phases.

Step 3: False matches are detected based on the epipolar constraint, and outliers are removed from the 3D data

The intensities of the edge pixels in the fringe image are mixtures of the fringes on the edge and under the edge [Fig. 5(a)], because a pixel in the image corresponds to an area in 3D space rather than an ideal point. Consequently, the phases calculated by the phase-shifting algorithm from these mixed pixels are inaccurate and lead to false matches, resulting in outliers in the reconstructed 3D data.

Fig. 5 Pixel on edge and epipolar constraint: (a) mixed pixel of an edge on the image plane, in which the intensity is a mixture of the fringes on and under the edge; and (b) theoretical model of the epipolar constraint without camera distortion.

False matches are detected by employing the epipolar constraint, which enforces the geometric constraint on homologous pairs based on the projective geometry between the two views. A theoretical model without distortion is shown in Fig. 5(b). Il and Ir are the imaging planes of the left and right cameras, respectively. Cl and Cr are the corresponding optical centers of the two cameras, and their projections onto the imaging planes Il and Ir are the epipoles el and er, respectively. A point P on the edge projects onto the left and right image planes as points pl and pr, respectively. Then, P, pl, pr, Cl, Cr, el, and er are coplanar in the epipolar plane. Let mL(xL, yL)T be the image coordinate of pl, and let the corresponding epipolar line be lpr(a, b, c)T. Suppose the image coordinate of the corresponding point pc found by the correspondence searching algorithm in Step 2 is mR(xR, yR)T; then the distance d from the corresponding point pc to the epipolar line lpr must satisfy the following equation:

\[
d = \left|\frac{a x_R + b y_R + c}{\sqrt{a^2 + b^2}}\right| < d_T \tag{8}
\]
where dT is the threshold, in pixels, that ensures the matching point in the right camera lies almost on the epipolar line. The distance d should theoretically be zero, but it is not because of the influence of various error sources. Therefore, the correspondence between mL and mR is regarded as a false match and removed if it does not satisfy Eq. (8).

In Eq. (8), the epipolar line lpr(a,b,c)T can be calculated by:

\[
l_{pr} = F \begin{bmatrix} m_L \\ 1 \end{bmatrix} \tag{9}
\]
where F is the fundamental matrix, which can be calculated from the calibration results of the left and right cameras [16].

If the nonlinear distortion of the cameras is considered, a nonlinear correction algorithm should be applied before the detection of false matches [17].
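The epipolar check of Eqs. (8) and (9) can be sketched as below. The rectified-pair fundamental matrix used here is a standard textbook example chosen so the result is easy to verify by hand, not a matrix from the paper's calibration:

```python
import numpy as np

def epipolar_distance(F, m_left, m_right):
    """Eqs. (8)-(9): distance from the candidate right-image point m_right
    to the epipolar line l = F [m_left; 1] of the left-image point m_left."""
    a, b, c = F @ np.array([m_left[0], m_left[1], 1.0])
    return abs(a * m_right[0] + b * m_right[1] + c) / np.hypot(a, b)

# Fundamental matrix of an ideal rectified stereo pair: epipolar lines are
# horizontal rows, so d reduces to the row difference of the two points.
F = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
d = epipolar_distance(F, (100.0, 40.0), (250.0, 40.5))
print(d)  # 0.5 -> the pair is kept only if d is below the threshold dT
```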

3. Experiments

3.1 Experimental setup

The hyper thin 3D edge measurement system for honeycomb core structures consisted of two monochrome cameras (Basler piA2400) with a resolution of 2456 × 2058 pixels and one custom-made digital projector with a resolution of 800 × 600 pixels, as shown in Fig. 6(a). The projector used blue LED lighting with a 465 nm wavelength to achieve the best image resolution. The baseline length between the two cameras was approximately 180 mm, and the distance from the optical center of the projector to the baseline was about 150 mm. The intrinsic and extrinsic parameters of the two cameras were calibrated before the measurements [15].

Fig. 6 Hyper thin 3D edge measurement system for honeycomb core structures: (a) the developed system, including two cameras (circled in yellow) and one projector (circled in green) in the triangular layout; (b) a workpiece of honeycomb core structure; and (c) experimental setup.

A workpiece of paper-based honeycomb core structure, which could not be measured by a CMM, was chosen as the test sample [Fig. 6(b)]. The experimental setup is shown in Fig. 6(c). The distance between the cameras and the workpiece was 250 mm, and the valid field of view was around 140 mm (width) × 110 mm (height). The horizontal and vertical fringe pitches for the three different frequencies used in the heterodyne phase-unwrapping algorithm were 12, 13, and 14 pixels in the projector, as shown in Fig. 7. The maximum unwrapping range of the heterodyne multi-frequency method was 1092 pixels in the projector, which was larger than the projector resolution.
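The 1092-pixel figure can be checked directly: the unambiguous range of a multi-frequency heterodyne scheme equals the least common multiple of the fringe pitches.

```python
import math

# Unambiguous unwrapping range of the three pitches used in the experiment.
pitches = (12, 13, 14)
print(math.lcm(*pitches))  # 1092, exceeding the 800-pixel projector width
```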

Fig. 7 Horizontal and vertical fringe patterns of blue light used for fringe projection: (a), (b), and (c) are vertical fringe patterns with pitches of 12, 13, and 14 pixels, respectively; (d), (e), and (f) are horizontal fringe patterns with pitches of 12, 13, and 14 pixels, respectively.

3.2 Experimental results

First, a 3D measurement system with the traditional camera-projector layout [Fig. 3(a)] was used to measure the test sample, and the result is shown in Fig. 8. The 3D data of many edges could not be obtained effectively, as shown in Fig. 8(a). The acquired 3D data of the edges were mixed with the 3D data of the walls in Fig. 8(b), such that independent edge data were difficult to extract.

Fig. 8 3D measurement results of the traditional phase-shifting method: (a) front and (b) stereo views. The mixed 3D data of edges and walls appear clearly in the circled area, with details shown at the top right corner.

Then, the proposed hyper thin 3D edge measurement system was used to measure the test sample. The phase retrieval results shown in Fig. 9 were applied to phase-based stereo matching. The reconstructed 3D data are shown in Fig. 10. The 3D data of most edges were measured effectively, and the 3D data of the walls were not obtained. The results show that further 3D edge extraction was unnecessary, and the results could be used directly for subsequent machining error analysis.

Fig. 9 Phase retrieval results: (a) vertical and (b) horizontal absolute phase maps of the left camera; (c) vertical and (d) horizontal absolute phase maps of the right camera.

Fig. 10 3D measurement results using the hyper thin 3D edge measurement system with the false-match detection algorithm: (a) front view, with details shown at the top right corner; and (b) stereo view, with details shown at the top right corner. The 3D data of the edges appear clearly, and no data of the walls are obtained.

To demonstrate the effectiveness of the false-match detection algorithm, the epipolar constraint was then deliberately disabled during stereo matching. Consequently, numerous outliers resulting from false matches appeared in the reconstructed 3D data shown in Fig. 11. Compared with the 3D data shown in Fig. 10, this confirms that the false-match detection algorithm based on the epipolar constraint can effectively eliminate false matches and remove 3D outliers.

Fig. 11 3D measurement results using the hyper thin 3D edge measurement system without the false-match detection algorithm: (a) front view of the 3D data, with local details of outliers shown at the top right corner; and (b) stereo view, in which the outliers are circled in red.

To verify the accuracy of the measurement system, we measured the back edge of a stainless steel knife with an edge width of approximately 0.4 mm [Fig. 12(a)], which is comparable in thickness to the hyper thin edges of honeycomb core structures. The reconstructed 3D point cloud of the edge was fitted to a standard plane, and the distances between the 3D points and the fitted plane were calculated as plane measurement deviations to evaluate the measurement accuracy. The reconstructed 3D points shown in Fig. 12(b) were used for the plane fitting. The point-to-plane deviations are shown in Fig. 12(c), with a standard deviation of 0.108 mm and a maximum deviation of 0.236 mm. In this experiment, high dynamic range fringe projection profilometry [9] was applied when profiling the edge surface of the stainless steel knife to prevent the shiny steel surface from causing image saturation.
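The plane-fitting evaluation above can be sketched as follows. The SVD-based least-squares fit is a standard technique we assume here; the paper does not specify its fitting algorithm, and the synthetic data are our own:

```python
import numpy as np

def point_to_plane_deviations(points):
    """Fit a least-squares plane to a 3D point cloud via SVD and return the
    signed point-to-plane distances used as the accuracy metric."""
    centroid = points.mean(axis=0)
    # The plane normal is the right singular vector associated with the
    # smallest singular value of the centered point cloud.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return (points - centroid) @ vt[2]

# Noisy synthetic plane z = 0: the spread of the deviations should recover
# the injected 0.1 noise level (analogous to the 0.108 mm result above).
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0.0, 10.0, 500),
                       rng.uniform(0.0, 10.0, 500),
                       rng.normal(0.0, 0.1, 500)])
dev = point_to_plane_deviations(pts)
print(round(float(dev.std()), 2))  # close to the injected 0.1
```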

Fig. 12 3D measurement of the back edge of a stainless steel knife: (a) the edge surface with blue fringes projected, with the knife image shown at the top right corner; (b) reconstructed 3D point cloud; and (c) point-to-plane deviations (standard deviation 0.108 mm, maximum deviation 0.236 mm).

The experimental results indicated that the hyper thin 3D edge measurement technique can effectively measure the 3D profile of honeycomb core structures, unlike the traditional 3D measurement method. The false-match detection algorithm based on the epipolar constraint eliminates outliers from the 3D data, which benefits subsequent data analysis.

4. Conclusion

In this paper, a novel hyper thin 3D edge measurement technique is proposed to measure the 3D profile of honeycomb core structures. The triangular camera-projector layout effectively yields the 3D data of the edges of the honeycomb core with no data interference from the walls. Moreover, the phase-based stereo matching method, which combines the phase mapping algorithm and the epipolar constraint, realizes stereo matching at the edges and eliminates outliers from the 3D data. With the developed hyper thin 3D edge measurement system, the 3D outer envelope profile measurement of honeycomb core structures has been achieved. The accuracy of the system, with a standard deviation of 0.108 mm, was verified by measuring the back edge of a stainless steel knife with the proposed method.

Acknowledgments

This work is supported by the National Natural Science Foundation of China (Grant No. 61475013), the National High Technology Research and Development Program of China (863 Program) (Grant No. 2012AA041205), the "High-end CNC Machine Tools and Basic Manufacturing Equipment" Science and Technology Major Project (Grant No. 2013ZX04001-011), and the Program for Changjiang Scholars and Innovative Research Team in University (IRT1203).

References and links

1. Y. M. Jen, F. L. Teng, and T. C. Teng, “Two-stage cumulative bending fatigue behavior for the adhesively bonded aluminum honeycomb sandwich panels,” Mater. Des. 54, 805–813 (2014). [CrossRef]  

2. F. Chen, G. M. Brown, and M. M. Song, “Overview of three-dimensional shape measurement using optical methods,” Opt. Eng. 39(1), 10–22 (2000). [CrossRef]  

3. R. D. da Silva, R. Minghim, and H. Pedrini, “3D edge detection based on Boolean functions and local operators,” Int. J. Image Graph. 15(1), 1550003 (2015). [CrossRef]  

4. G. R. Chandra, T. Sultana, G. Sathya, and E. G. Rajan, “3D edge detection of medical images using mathematical morphological and cellular logic array processing techniques,” in Proceedings of IEEE Conference on Systems, Man, and Cybernetics (IEEE, 2012), pp. 2156–2160. [CrossRef]  

5. A. Lejeune, S. Pierard, M. Van Droogenbroeck, and J. Verly, “A new jump edge detection method for 3D cameras,” in Proceedings of IEEE Conference on 3D Imaging (IEEE, 2011), pp. 1–7. [CrossRef]  

6. H. Schafer, F. Lenzen, and C. S. Garbe, “Depth and intensity based edge detection in time-of-flight images,” in Proceedings of IEEE Conference on 3D Vision (IEEE, 2013), pp. 111–118. [CrossRef]  

7. S. Zhang and S. T. Yau, “High dynamic range scanning technique,” Opt. Eng. 48(3), 033604 (2009). [CrossRef]  

8. D. Li and J. Kofman, “Adaptive fringe-pattern projection for image saturation avoidance in 3D surface-shape measurement,” Opt. Express 22(8), 9887–9901 (2014). [CrossRef]   [PubMed]  

9. H. Z. Jiang, H. J. Zhao, and X. D. Li, “High dynamic range fringe acquisition: A novel 3-D scanning technique for high-reflective surfaces,” Opt. Lasers Eng. 50(10), 1484–1493 (2012). [CrossRef]  

10. X. Han and P. Huang, “Combined stereovision and phase shifting method: a new approach for 3D shape measurement,” Proc. SPIE 7389, 73893C (2009). [CrossRef]  

11. W. Lohry, V. Chen, and S. Zhang, “Absolute three-dimensional shape measurement using coded fringe patterns without phase unwrapping or projector calibration,” Opt. Express 22(2), 1287–1301 (2014). [CrossRef]   [PubMed]  

12. C. Reich, R. Ritter, and J. Thesing, “3-D shape measurement of complex objects by combining photogrammetry and fringe projection,” Opt. Eng. 39(1), 224–231 (2000). [CrossRef]  

13. W. Lohry and S. Zhang, “High-speed absolute three-dimensional shape measurement using three binary dithered patterns,” Opt. Express 22(22), 26752–26762 (2014). [CrossRef]   [PubMed]  

14. D. Li, H. J. Zhao, and H. Z. Jiang, “Fast phase-based stereo matching method for 3D shape measurement,” in Proceedings of IEEE Symposium on Optomechatronic Technologies (IEEE, 2010), pp. 1–5. [CrossRef]  

15. H. J. Zhao, Z. Wang, H. Z. Jiang, Y. Xu, and C. Dong, “Calibration for stereo vision system based on phase matching and bundle adjustment algorithm,” Opt. Lasers Eng. 68(5), 203–213 (2015). [CrossRef]  

16. R. Hartley and A. Zisserman, Multiple View Geometry (Cambridge University Press, 2000), Ch. 9.

17. S. Ma, R. Zhu, C. Quan, L. Chen, C. J. Tay, and B. Li, “Flexible structured-light-based three-dimensional profile reconstruction method considering lens projection-imaging distortion,” Appl. Opt. 51(13), 2419–2428 (2012). [CrossRef]   [PubMed]  
