Optica Publishing Group

Multi-frequency color-marked fringe projection profilometry for fast 3D shape measurement of complex objects

Open Access

Abstract

We propose a novel multi-frequency color-marked fringe projection profilometry approach to measure the 3D shape of objects with depth discontinuities. A digital micromirror device projector is used to project a color map consisting of a series of different-frequency color-marked fringe patterns onto the target object. We use a chromaticity curve to calculate the color change caused by the height of the object. The related algorithm to measure the height is also described in this paper. To improve the measurement accuracy, a chromaticity curve correction method is presented. This correction method greatly reduces the influence of color fluctuations and measurement error on the chromaticity curve and the calculation of the object height. The simulation and experimental results validate the utility of our method. Our method avoids the conventional phase shifting and unwrapping process, as well as the independent calculation of the object height required by existing techniques. Thus, it can be used to measure complex and dynamic objects with depth discontinuities. These advantages are particularly promising for industrial applications.

© 2015 Optical Society of America

1. Introduction

Considerable research has been devoted to 3D measurement for industrial precision inspection. Optical methods such as fringe projection, stereo vision, coded light, and phase shifting have gradually been adopted for object measurement and reconstruction to meet the demand for higher speed, higher precision, and non-contact operation [1]. Among the available methods, fringe projection-based approaches have become popular in recent years since they are easy to operate and sufficiently precise [2]. A typical fringe projection system consists of a digital micromirror device (DMD) projector and a charge-coupled device (CCD) camera. The projector generates the structured light, while the reflected and distorted fringes are recorded by the CCD camera. A number of methods such as phase measuring profilometry (PMP) [3,4] and Fourier transform profilometry (FTP) [5,6] can be used to calculate the phase map related to the target object. However, because of the periodicity of the projected sinusoidal fringes, existing phase unwrapping approaches are only effective when the relative phase maps are continuous. Hence, when the measured object has large depth discontinuities, conventional unwrapping algorithms lose their efficacy.

Color fringe projection profilometry has recently become an alternative method for object shape reconstruction, providing more information in a single image [7–11]. Da and Huang proposed a color fringe projection based Fourier transform 3D shape measurement method using the R, G, and B components of the color fringe, which eliminates the background, high-frequency noise, spectrum overlapping, and phase unwrapping problems of Fourier transforms [12]. Furthermore, Zhang et al. took advantage of the three primary channels (RGB) to project three fringe patterns of different frequencies and thereby calculate the absolute phase [13]. However, these approaches cannot overcome the innate drawback of their unwrapping processes: they cannot measure objects with large depth discontinuities. Still, 3D profile measurement based on color encoding is a potential solution. For example, Chen et al. described a surface height retrieval technique based on fringe shifting of a color-encoded structured light pattern, combining the temporal encoding of color stripes with local spatial shifting of multiple fringes [14]. In addition, Pan et al. described a color gray-code fringe projection technique to reduce decoding errors in the measurement [15]. Although these methods apply to objects with large depth discontinuities, they may not be suitable for dynamic measurement since many images are needed to achieve sufficient measurement volume and resolution.

Xu presented a multi-frequency fringe projection profilometry approach using a peak searching algorithm to independently calculate the height of each point on the object from the phase information of a series of fringe patterns with different frequencies [16,17]. However, since multiple images are required to obtain sufficient height information, this method takes more time and is incapable of measuring moving objects.

In this research, we propose a novel multi-frequency color-marked fringe projection profilometry method. In contrast with existing multi-frequency fringe projection profilometry methods, multiple fringe patterns with different frequencies are projected onto the object simultaneously instead of one by one. Moreover, each fringe pattern is marked by a different color, thereby establishing a one-to-one relationship between the object height and the color created in a color space. A chromaticity curve in the CIE1976UCS color space is employed to calculate the color change derived from the object height. A related correction method is also used to eliminate the influence of color fluctuations and measurement error on the chromaticity curve and object reconstruction. Because our method avoids the unwrapping process required by conventional fringe projection profilometry and requires only one color-marked fringe pattern, it is valuable for the measurement of complex, discontinuous, and dynamic objects.

2. Principle of the multi-frequency color-marked fringe projection profilometry

2.1 Generation of the color map

Our multi-frequency color-marked fringe projection profilometry system consists of a computer, a DMD projector, a reference plane, and a 3 CCD RGB color camera, as shown in Fig. 1. The color map, formed by superposing a series of fringe patterns with different frequencies, is projected onto the object, and the camera captures the light reflected from the object surface.

Fig. 1 System layout of our multi-frequency color-marked fringe projection profilometry.

The color map projected onto the object is obtained by directly superimposing a number of fringe pattern images with different frequencies and colors. A fringe pattern can be represented by the following formula

I(x,y) = a(x,y) + b(x,y)\cos[2\pi f_0 x + \varphi_I(x,y)],
where I(x,y) is the fringe pattern projected from the DMD projector; a(x,y) and b(x,y) represent the background intensity and modulation intensity of the fringe pattern, respectively; f_0 is the fundamental frequency of the sinusoidal fringe, and \varphi_I(x,y) is the initial phase.
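As a minimal numerical sketch (not part of the original system), the fringe pattern of Eq. (1) can be generated as follows; the image size, a = b = 0.5, f0 = 1/64 cycles/pixel, and the zero initial phase are illustrative assumptions:

```python
# Numerical sketch of Eq. (1). Image size, a = b = 0.5,
# f0 = 1/64 cycles/pixel, and phi_I = 0 are assumed values.
import numpy as np

rows, cols = 256, 256
a, b = 0.5, 0.5            # background and modulation intensity
f0 = 1.0 / 64.0            # fundamental frequency (cycles per pixel)
phi_I = 0.0                # initial phase

x = np.arange(cols)
profile = a + b * np.cos(2 * np.pi * f0 * x + phi_I)  # intensity along x
fringe = np.tile(profile, (rows, 1))                  # constant along y
```

Tiling the 1D profile along y reproduces straight fringes perpendicular to x.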

According to colorimetry, the appearance of any color can be simulated by starting with black and adding certain intensities of red (R), green (G), and blue (B) light. The color projected onto the object depends on the proportion of the intensity in its R, G, and B channels. Hence, a fringe pattern of a particular frequency with any color can be obtained by changing the background and modulation intensity of the fringe pattern in the R, G, and B channels, expressed as follows:

R_{f_0}(x,y) = r_{f_0} \cdot a(x,y) + r_{f_0} \cdot b(x,y)\cos[2\pi f_0 x + \varphi_I(x,y)]
G_{f_0}(x,y) = g_{f_0} \cdot a(x,y) + g_{f_0} \cdot b(x,y)\cos[2\pi f_0 x + \varphi_I(x,y)]
B_{f_0}(x,y) = b_{f_0} \cdot a(x,y) + b_{f_0} \cdot b(x,y)\cos[2\pi f_0 x + \varphi_I(x,y)],
where R_{f_0}, G_{f_0}, and B_{f_0} are the patterns formed in the R, G, and B channels, while r_{f_0}, g_{f_0}, and b_{f_0} represent the proportions of the color in the R, G, and B channels; x denotes the direction perpendicular to the fringes. In this research, the color map consists of seven fringe patterns with different frequencies, each marked by a single color. To facilitate the distinction between the fringe patterns, we use seven typical colors (the three primary colors, their complementary colors, and one additional color): green (0,1,0), red (1,0,0), blue (0,0,1), yellow (1,1,0), cyan (0,1,1), magenta (1,0,1), and the additional color (0.5,0.4,0.3). The color map consisting of seven color-marked fringe patterns can be written as:
R_I(x,y) = \sum_{i=1}^{7} R_{If_i}(x,y) = \sum_{i=1}^{7} \{ r_{f_i} \cdot a(x,y) + r_{f_i} \cdot b(x,y)\cos[2\pi f_i x + \varphi_{Ii}(x,y)] \}
G_I(x,y) = \sum_{i=1}^{7} G_{If_i}(x,y) = \sum_{i=1}^{7} \{ g_{f_i} \cdot a(x,y) + g_{f_i} \cdot b(x,y)\cos[2\pi f_i x + \varphi_{Ii}(x,y)] \}
B_I(x,y) = \sum_{i=1}^{7} B_{If_i}(x,y) = \sum_{i=1}^{7} \{ b_{f_i} \cdot a(x,y) + b_{f_i} \cdot b(x,y)\cos[2\pi f_i x + \varphi_{Ii}(x,y)] \}.
Here R_I, G_I, and B_I are the composite fringe patterns with seven frequencies in the R, G, and B channels of the final color map, respectively.
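The superposition of Eqs. (5)–(7) can be sketched as follows; the seven frequencies and the scaling a = b = 1/14 (which keeps every channel inside [0, 1]) are assumptions for illustration, while the seven colors are those listed above:

```python
# Sketch of Eqs. (5)-(7): seven color-marked fringe patterns, one per
# frequency, superposed into a single RGB color map. The frequencies and
# a = b = 1/14 are assumed; the colors come from the text.
import numpy as np

colors = [(0, 1, 0), (1, 0, 0), (0, 0, 1), (1, 1, 0),
          (0, 1, 1), (1, 0, 1), (0.5, 0.4, 0.3)]
freqs = [1/16, 1/24, 1/32, 1/40, 1/48, 1/56, 1/64]  # cycles/pixel (assumed)

rows, cols = 240, 320
a = b = 1.0 / 14.0
x = np.arange(cols)

color_map = np.zeros((rows, cols, 3))
for rgb, f in zip(colors, freqs):
    profile = a + b * np.cos(2 * np.pi * f * x)     # one fringe pattern
    for c, weight in enumerate(rgb):
        color_map[:, :, c] += weight * profile      # r_fi, g_fi, b_fi weights
```

The per-channel weights play the role of r_{f_i}, g_{f_i}, and b_{f_i}; summing the weighted profiles over the seven frequencies yields the composite map of Fig. 2(b).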

Figure 2(a) illustrates one of the seven fringe patterns projected onto the object (marked in blue). Figure 2(b) shows the color map consisting of all seven fringe patterns, each marked by a different color. The color map is generated from the seven color fringe patterns, and the color of each fringe pattern depends on its R, G, and B values.

Fig. 2 (a) The fringe pattern with a single frequency marked by blue. (b) The color map projected from the DMD projector.

2.2 Geometrical relationship between the object height and color difference

Figure 3(a) illustrates the optical geometry of single-frequency fringe projection profilometry without considering color. Suppose that axes Y and X are the horizontal and vertical directions of the reference plane, respectively (see Fig. 3(b)); h(x,y) is the height of a point on the surface of the object, and d represents the distance between the optic centers of the CCD camera and the DMD projector. There are some restrictions on the positions of the CCD camera, the projector, and the object in Fig. 3(a). One of the most important is that the line between the optic centers of the camera and the projector should be parallel to the reference plane. However, the influence of such alignment errors on the final reconstructed results can be reduced by a calibration process with a flat calibration plate.

Fig. 3 (a) The optical geometry of the fringe projection profilometry with a single frequency. (b) The explanation of the coordinates in the measurement system.

Once an object is placed on the reference plane, the fringe pattern reflected from the object surface and observed by the CCD camera can be written as

O(x,y) = a(x,y) + b(x,y)\cos[2\pi f_0 x + \varphi_I(x,y) + \Delta\varphi(x,y)].
Here O(x,y) represents a fringe pattern with a single frequency and Δφ(x,y) is the phase shift caused by the object height. From the geometrical relationship shown in Fig. 3(a), Δφ(x,y) can be expressed and simplified as [16]
\Delta\varphi(x,y) = 2\pi f_0 \, d \cdot h(x,y)/l_0.
Then, the observed fringe pattern O(x,y) reflected from the object can be calculated by
O(x,y) = a(x,y) + b(x,y)\cos[2\pi f_0 (x + d \cdot H(x,y)/l_0) + \varphi_I(x,y)].
Hence, the fringe patterns in the R, G, and B channels of the color map reflected from the object surface become
R_O(x,y) = \sum_{i=1}^{7} R_{Of_i}(x,y) = \sum_{i=1}^{7} \{ r_{f_i} \cdot a(x,y) + r_{f_i} \cdot b(x,y)\cos[2\pi f_i (x + d \cdot H(x,y)/l_0) + \varphi_{Ii}(x,y)] \}
G_O(x,y) = \sum_{i=1}^{7} G_{Of_i}(x,y) = \sum_{i=1}^{7} \{ g_{f_i} \cdot a(x,y) + g_{f_i} \cdot b(x,y)\cos[2\pi f_i (x + d \cdot H(x,y)/l_0) + \varphi_{Ii}(x,y)] \}
B_O(x,y) = \sum_{i=1}^{7} B_{Of_i}(x,y) = \sum_{i=1}^{7} \{ b_{f_i} \cdot a(x,y) + b_{f_i} \cdot b(x,y)\cos[2\pi f_i (x + d \cdot H(x,y)/l_0) + \varphi_{Ii}(x,y)] \}.
Here R_O(x,y), G_O(x,y), and B_O(x,y) are the composite fringe patterns with seven frequencies in the R, G, and B channels of the final color map reflected from the object, respectively. R_{Of_i}, G_{Of_i}, and B_{Of_i} are the patterns of a given frequency from the object surface in the R, G, and B channels.

From Eqs. (11)–(13), it can be noted that the height H(x,y) can be regarded as a change of x, meaning that the change of the RGB value caused by the object height conforms to the relationship between x and the RGB value. The height of a point on the object surface can be calculated as

H(x,y) = \frac{\Delta x \cdot l_0^2}{l_1 \cdot d}.
Here, Δx is the pixel change derived from the object height, and l_1 represents the distance from the optic center of the camera lens to the image (CCD) plane of the camera. We use the CIE1976UCS color space because it displays the color change conveniently. Since this color space is uniform, a one-to-one relationship is established between the color change and the height of the measured object. Based on the conversion formula between the RGB color space and the u'v' chromaticity coordinates, the chromatic values of the color map in the vertical direction (the x direction) can be obtained as
u'(x) = \frac{11.075568\,R(x) + 7.006992\,G(x) + 4.52064\,B(x)}{17.768892\,R(x) + 70.781772\,G(x) + 18.814536\,B(x)}
v'(x) = \frac{9\,R(x) + 41.3163\,G(x) + 0.5409\,B(x)}{17.768892\,R(x) + 70.781772\,G(x) + 18.814536\,B(x)},
where u'(x) and v'(x) are the chromaticity coordinates of CIE1976UCS, and R(x), G(x), and B(x) are the intensities in the R, G, and B channels of one pixel in the 3 CCD color camera.
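A direct transcription of Eqs. (15)–(16), using the coefficients given above:

```python
# RGB -> CIE1976UCS (u', v') with the coefficients of Eqs. (15)-(16).
def rgb_to_uv(R, G, B):
    denom = 17.768892 * R + 70.781772 * G + 18.814536 * B
    u = (11.075568 * R + 7.006992 * G + 4.52064 * B) / denom
    v = (9.0 * R + 41.3163 * G + 0.5409 * B) / denom
    return u, v

u_white, v_white = rgb_to_uv(1.0, 1.0, 1.0)  # equal-intensity input
```

Because numerator and denominator are both linear in (R, G, B), the result is invariant to a uniform intensity scale, which is what makes (u', v') a chromaticity rather than a brightness measure.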

If the chromaticity coordinates of a column of x in the color map are displayed in a chromaticity diagram, a chromaticity curve describing the color difference in the x direction is obtained, as shown in Fig. 4.

Fig. 4 The chromaticity curve of a column of x in the color map.

If the sampling of x is sufficiently dense, the chromatic value of any point, whether on the surface of the object or on the reference plane, lies on this chromaticity curve. This means that the chromaticity curve can be used as a surveyor's rod to measure the positions of the chromatic values from the reference plane and the object surface at the same CCD pixel. The difference between these two chromatic values is that the latter is shifted by a number of x samples (Δx) related to the height of the measured point of the object. Δx is easily obtained by counting the x samples between the chromaticity values of the same pixel in the two captured images (the images of the reference plane and of the object surface captured by the color camera). The calculated Δx can be made sufficiently accurate by reducing the interval between x values and interpolating appropriately.
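The surveyor's-rod idea can be sketched as a nearest-neighbor lookup along the sampled curve; the curve below is synthetic (in the real system it comes from the reference-plane image), and sample indices stand in for x:

```python
# Sketch of the "surveyor's rod" search: locate a measured (u', v') value
# on the reference chromaticity curve by nearest-neighbor search and take
# the index difference as Delta-x. The curve is a synthetic stand-in.
import numpy as np

def position_on_curve(uv, curve):
    """curve: (N, 2) samples of (u', v') along x; returns nearest index."""
    return int(np.argmin(np.sum((curve - uv) ** 2, axis=1)))

t = np.linspace(0.0, 1.0, 500)
curve = np.stack([0.20 + 0.10 * t, 0.45 + 0.05 * t ** 2], axis=1)  # uncrossed

x_ref = position_on_curve(curve[100], curve)  # pixel seen on reference plane
x_obj = position_on_curve(curve[137], curve)  # same pixel seen on the object
delta_x = x_obj - x_ref                       # Delta-x, related to the height
```

The resulting Δx would then be scaled by the system geometry of Eq. (14) to give the height; a finer x sampling or interpolation refines Δx.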

Figure 5 illustrates the principle described above. The red points in Fig. 5 represent the chromatic values of a series of x in Eqs. (11)–(13).

Fig. 5 The schematic diagram of the Δx calculation.

From this figure, it can be concluded that if the chromaticity curve crosses itself, the color at the intersection will correspond to multiple height values, making the measurement algorithm ambiguous. Hence, the chromaticity curve should be uncrossed. The number of x samples in the uncrossed area of the chromaticity curve determines the measurement range of the system, and the interval between x values determines the system resolution. From Eqs. (5)–(7), it can be noted that the R, G, and B values of the final composite color map change with the spatial frequencies f_i, the number of fringe patterns, and the proportions r_{f_i}, g_{f_i}, and b_{f_i} of the colors in their R, G, and B channels. Hence, the shape of the curve depends on the colors used, the spatial frequencies chosen, and the number of fringe patterns adopted.
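A rough numerical test of the uncrossed requirement (a heuristic of our own, not taken from the paper) is to demand that samples far apart in x never coincide in (u', v'):

```python
# Heuristic check that a sampled chromaticity curve is uncrossed:
# samples separated by at least min_sep in x must not coincide in (u', v').
import numpy as np

def is_uncrossed(curve, min_sep=10, tol=1e-3):
    for i in range(len(curve)):
        d = np.linalg.norm(curve[i + min_sep:] - curve[i], axis=1)
        if d.size and d.min() < tol:
            return False
    return True

t = np.linspace(0.0, 1.0, 300)
open_curve = np.stack([t, t ** 2], axis=1)               # monotone, uncrossed
closed_loop = np.stack([np.cos(2 * np.pi * t),
                        np.sin(2 * np.pi * t)], axis=1)  # start meets end
```

Such a check could be run for each candidate set of colors and frequencies when searching for the curve with the largest uncrossed area.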

2.3 Correction method of the chromaticity curve

During actual measurement, statistical noise such as CCD amplifier noise and shot noise, together with the nonlinear response of the CCD pixels, can cause the RGB values of two same-colored points on the object surface to differ and become inaccurate. This can push the chromatic value of a measured point on the object off the standard chromaticity curve, degrading the recovery results. Hence, the measured color should be corrected and calibrated.

Figure 6 compares the chromaticity curves of the measured pixel colors and the simulated standard pixel colors in the x direction. The chromaticity curve of the measured pixel colors fluctuates and is uneven. The chromatic differences between the measured and simulated colors are substantial and hinder the calculation of the color change between a point on the object surface and its corresponding point on the reference plane. This severely impacts the recovery of the measured object.

Fig. 6 Comparison between the chromaticity curves of the measured color and the simulated color in the x direction.

Assuming that the position of a pixel in the color camera contributes negligibly to the color error of the chromaticity curve, a correction method is devised, as shown in Fig. 7.

Fig. 7 (a) An illustration of the correction method. (b) The chromaticity curves from the pixels in column 300 of the reference plane and the object surface without correction. (c) The chromaticity curves from the pixels in column 300 of the reference plane and the object surface after correction.

Since the simulated standard chromatic values corresponding to the measured values of every CCD pixel on the reference plane are known, the measured colors from the reference plane can be corrected directly, as shown in Fig. 7(a). By recording the chromatic value change of each pixel, we can form a lookup table, so that the chromatic value of the object surface at each pixel can be corrected by finding the nearest chromaticity coordinates of the measured color from the reference plane and applying the same chromatic value change. Figure 7(b) illustrates the chromaticity curves of the reference plane and object surface without correction, and Fig. 7(c) illustrates the same chromaticity curves after correction. It can be noted that the correction method greatly improves the quality of the chromaticity curve.
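The lookup-table correction can be sketched as follows; all arrays are synthetic stand-ins for the measured and simulated reference-plane chromaticities:

```python
# Sketch of the correction of Sec. 2.3: record, for every reference-plane
# pixel, the offset from its measured chromaticity to the simulated
# standard value; correct an object chromaticity by applying the offset
# of its nearest measured reference value. Data here are synthetic.
import numpy as np

def correct(uv_obj, measured_ref, offsets):
    k = int(np.argmin(np.sum((measured_ref - uv_obj) ** 2, axis=1)))
    return uv_obj + offsets[k]          # apply the same chromatic change

rng = np.random.default_rng(0)
standard_ref = np.stack([np.linspace(0.20, 0.30, 200),
                         np.linspace(0.45, 0.50, 200)], axis=1)
measured_ref = standard_ref + 0.002 * rng.standard_normal((200, 2))
offsets = standard_ref - measured_ref   # per-pixel lookup table

corrected = correct(measured_ref[50], measured_ref, offsets)
```

By construction, a chromaticity that coincides with a measured reference value is mapped exactly onto the standard curve; nearby object chromaticities receive the same offset.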

3. Experimental results

3.1 Simulation Results for the multi-frequency color-marked fringe projection profilometry

We performed several simulations to validate the proposed multi-frequency color-marked fringe projection profilometry approach. Figure 8 shows a set of representative results. The scene consists of a hemisphere with a radius of 100 pixels (see Fig. 8(a)). The colors used to mark the fringe patterns of different frequencies are green (0,1,0), red (1,0,0), blue (0,0,1), yellow (1,1,0), cyan (0,1,1), magenta (1,0,1), and the additional color (0.5,0.4,0.3). Figure 8(b) illustrates the simulated color map reflected from the surface of the hemisphere. Figure 8(c) shows the 3D shape of the hemisphere reconstructed using the recovery method described above, and Fig. 8(d) illustrates the error between the simulated and reconstructed shapes. It can be noted from these figures that the hemisphere is well reconstructed with small error; the reconstructed hemisphere matches the simulated one, validating the multi-frequency color-marked fringe projection profilometry presented in this paper. One possible way to reduce the reconstruction error further is to decrease the interval between x values and interpolate appropriately.

Fig. 8 Simulation results for multi-frequency color-marked fringe projection profilometry. (a) The simulated hemisphere (with radius 100) to be measured. (b) The simulated color map reflected from the surface of the hemisphere. (c) The reconstructed 3D shape of the hemisphere. (d) The error between the simulated and reconstructed hemisphere.

3.2 Experimental Results for the multi-frequency color-marked fringe projection profilometry

Figure 9 shows the experimental results for a white block with a central stripe depression (a work piece) as the measured object (see Fig. 9(a)). The color map was projected onto the object by a DMD projector and the reflected pattern was acquired by a 3 CCD RGB color camera. Because the calculation of the chromaticity curve is complex, a series of frequencies and colors was tested, and the combination that produced the curve with the largest uncrossed area was adopted. The colors chosen to constitute the color map and their corresponding frequencies are listed in Table 1. Since only one image is required to record enough information from the object surface during a single measurement, the measuring time of our method is about 10 s.

Fig. 9 (a) The object used in the experiment. (b) The color map reflected from the object. (c) The chromaticity curves of column 200 in the color maps of the reference plane and the object surface. (d) The reconstructed 3D shape of the object. (e) The section drawing of the reconstructed 3D shape of the object.

Table 1. The colors and corresponding frequencies of the fringe patterns adopted in the experiment.

Figure 9(b) shows the color map reflected from the object surface and Fig. 9(c) depicts the chromaticity curves of a column in the color maps of the reference plane and the object surface. Figure 9(d) illustrates the 3D profile of the target object, and one of its section drawings is shown in Fig. 9(e). A thin layer of paper was attached to the object to reduce the effect of the object's color on the final results. After the paper was attached, the real height of the object was measured with vernier calipers. For the higher step of the object, the real height is 6.512 cm and the average reconstructed value is 6.501 cm. For the lower step, the real height is 5.510 cm and the average reconstructed value is 5.479 cm. The measurement accuracy and the evenness of the reconstructed surface are related to several factors, including the influence of shadows, the pixel size, the geometric distortion of the color CCD camera, and the inherent color of the target object. The accuracy may be improved by increasing the spatial resolution, reducing the distortion of the CCD camera, and using color compensation to relieve the effect of the object's inherent color.

To further validate the potential of our method for measuring objects of greater complexity, a white mask was used as the object, shown in Fig. 10(a). Figure 10(b) depicts the color map reflected from the mask and Fig. 10(c) presents the chromaticity curves of a column in the color maps of the reference plane and the object surface. The corresponding reconstructed result is illustrated in Fig. 10(d). Although there are some errors in the final result, derived from the influence of shadows on the object surface, the main shape features are well presented.

Fig. 10 (a) The white mask used in the experiment. (b) The color map reflected from the object. (c) The chromaticity curves of column 234 in the color maps of the reference plane and the object surface. (d) The color image of the reconstructed result.

4. Conclusion

We propose a color-marked fringe projection profilometry approach using multiple frequencies. We use a relative chromaticity curve and describe the algorithm to reconstruct the object height. Since the conventional phase shifting and unwrapping processes are avoided and, in theory, only one image is needed during a single measurement, our method can measure objects with depth discontinuities in a short time. Furthermore, the use of multi-frequency fringe patterns, the chromaticity curve, and its correction method helps reduce the reconstruction error. Although the quality of the final results is limited by shadows and the inherent color of the object surface, it could be improved by introducing color correction and compensation.

The simulation and experimental results demonstrate the value of our method. The accuracy of the results is further improved by the chromaticity curve correction method. In contrast to existing methods, ours does not require phase shifting, phase unwrapping, or multiple images with different frequencies. Consequently, our technique is theoretically suitable for efficiently and effectively measuring target objects with reasonable accuracy, and it has substantial potential for industrial applications.

Acknowledgments

This work is supported by the National Natural Science Foundation of China (Grant No. 51575437) and the National High Technology Research and Development Program of China (2015AA020303).

References and links

1. N. D’Apuzzo, “Overview of 3D surface digitization technologies in Europe,” Proc. SPIE 6056, 605605 (2006). [CrossRef]  

2. E. Li, X. Peng, J. Xi, J. Chicharo, J. Yao, and D. Zhang, “Multi-frequency and multiple phase-shift sinusoidal fringe projection for 3D profilometry,” Opt. Express 13(5), 1561–1569 (2005). [CrossRef]   [PubMed]  

3. V. Srinivasan, H. C. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-D diffuse objects,” Appl. Opt. 23(18), 3105–3108 (1984). [CrossRef]  

4. X.-Y. Su, W.-S. Zhou, G. von Bally, and D. Vukicevic, “Automated phase-measuring profilometry using defocused projection of a Ronchi grating,” Opt. Commun. 94(6), 561–573 (1992). [CrossRef]  

5. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. 22(24), 3977–3982 (1983). [CrossRef]   [PubMed]  

6. X. Y. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001). [CrossRef]  

7. P. S. Huang and Q. Y. Hu, “Color-encoded digital fringe projection technique for high-speed three-dimensional surface contouring,” Opt. Eng. 38(6), 1065–1071 (1999). [CrossRef]  

8. W. H. Su, “Color-encoded fringe projection for 3D shape measurements,” Opt. Express 15(20), 13167–13181 (2007). [CrossRef]   [PubMed]  

9. Y. H. Yeh, I. C. Chang, C. L. Huang, W.-J. Hsueh, H.-C. Lin, and C.-C. Chen, “A new fast and high-resolution 3D imaging system with color structured light,” Proc. SPIE 4925, 645–654 (2002). [CrossRef]

10. Z. H. Zhang, D. P. Towers, and C. E. Towers, “Snapshot color fringe projection for absolute three-dimensional metrology of video sequences,” Appl. Opt. 49(31), 5947–5953 (2010). [CrossRef]  

11. S. Zhang and S. T. Yau, “Simultaneous three-dimensional geometry and color texture acquisition using a single color camera,” Opt. Eng. 47(12), 123604 (2008). [CrossRef]  

12. F. Da and H. Huang, “A novel color fringe projection based Fourier transform 3D shape measurement method,” Optik (Stuttg.) 123(24), 2233–2237 (2012). [CrossRef]  

13. Z. Zhang, C. E. Towers, and D. P. Towers, “Time efficient color fringe projection system for 3D shape and color using optimum 3-frequency Selection,” Opt. Express 14(14), 6444–6455 (2006). [CrossRef]   [PubMed]  

14. H. J. Chen, J. Zhang, and J. Fang, “Surface height retrieval based on fringe shifting of color-encoded structured light pattern,” Opt. Lett. 33(16), 1801–1803 (2008). [CrossRef]   [PubMed]  

15. J. H. Pan, P. S. Huang, and F. P. Chiang, “Color-coded binary fringe projection technique for 3-D shape measurement,” Opt. Eng. 44(2), 023606 (2005). [CrossRef]  

16. Y. Xu, S. H. Jia, X. Luo, J. Yang, and Y. Zhang, “Multi-frequency projected fringe profilometry for measuring objects with large depth discontinuities,” Opt. Commun. 288, 27–30 (2013). [CrossRef]  

17. Y. Xu, S. Jia, Q. Bao, H. Chen, and J. Yang, “Recovery of absolute height from wrapped phase maps for fringe projection profilometry,” Opt. Express 22(14), 16819–16828 (2014). [CrossRef]   [PubMed]  
