
Time efficient color fringe projection system for 3D shape and color using optimum 3-frequency selection

Open Access

Abstract

We present a novel color fringe projection system to obtain absolute 3D shape and color of objects simultaneously. Optimum 3-frequency interferometry is used to produce time efficient analysis of the projected fringes by encoding three fringe sets of different pitch into the primary colors of a digital light projector and recording the information on a 3-chip color CCD camera. Phase shifting analysis is used to retrieve sub-wavelength phase information. Absolute phase across the field is calculated using the 3-frequency method independently at each pixel. Concurrent color data is also captured via the RGB channels of the CCD. Thus full-field absolute shape (XYZ) and color (RGB) can be obtained. In this paper we present the basis of the technique and preliminary results having addressed the issue of crosstalk between the color channels.

©2006 Optical Society of America

1. Introduction

In the past 20 years, optical metrology has found numerous applications in scientific [1] and commercial fields [2, 3] owing to its inherent non-intrusive nature. One of the most widely researched topics has been the measurement of 3D surface form, with fringe projection techniques becoming popular due to their flexibility in adapting to different length scales. Sub-fringe phase measurement modulo 2π is obtained routinely using phase stepping [4] and Fourier transform techniques [5, 6]. In recent years considerable research has been performed on absolute fringe order identification. Temporal unwrapping techniques have become dominant, where a sequence of fringe patterns is projected with varying fringe frequency [7–10]. The authors recently introduced a robust optimization process for frequency selection in multi-wavelength interferometry where a geometric series of synthetic wavelengths is defined to maximize the overall process reliability [11–13]. The limited lateral resolution of commercial digital light processing (DLP) video projectors and CCD cameras restricts the maximum number of projected fringes to ~250. We have demonstrated that three projected fringe frequencies are required to determine fringe order with 6σ reliability over this number of fringes using optimum frequency selection [14].

Color has also been used as a means of encoding depth information for 3D shape measurement. Hausler et al. presented a color-coded triangulation method, which is simple, fast, and without moving mechanical parts; whilst this approach uses a single frame, the dynamic range is limited to <200 [15]. Huang et al. proposed a color-encoded fringe projection technique using three patterns, one in each color channel, with a phase shift of 2π/3 between neighboring channels; the 3D surface contour information can be retrieved from a single image snapshot of the object surface [16]. Skydan et al. presented a method with up to three fringe patterns projected on the three primary color channels from three different video projectors at different viewpoints to overcome shadowing effects on the object [17]. However, the approaches described by Huang et al. and Skydan et al. only produce wrapped phase maps, so spatial phase unwrapping methods are required to generate the 3D shape of an object; hence these methods cannot be applied to objects with large slopes or discontinuities.

Kakunai et al. introduced a method to project two gratings simultaneously with different color and fringe pitch [18]. The color allowed the image of each grating to be captured by separate monochrome cameras via appropriate color filters. However, this approach required the images from the two cameras to be accurately registered, which becomes increasingly complex for larger numbers of projected fringes. Pfortner et al. used a three-chip color camera to simultaneously record the interference patterns at three wavelengths generated by three different lasers in a classical interferometer, although this approach is better suited to measurements of nanometer resolution over tens to hundreds of microns [19].

In this paper we present a novel approach to measure the color (herein this term is used to denote the true reflectance of an object) and shape of an object simultaneously. The synergy between requiring data at three projected fringe frequencies for absolute shape measurement and there being three primary color channels (RGB) is exploited such that parallel acquisition of the fringe data is obtained, as well as giving the potential to measure surface color. By recording the data on a 3-chip color CCD, each fringe set is automatically registered. Here the colors are used as an information carrier and not as an interferometric wavelength. The number of fringes used in each channel is determined by the optimum multi-frequency process, allowing >100 fringe orders to be measured absolutely with a phase resolution of order 1/100th of a fringe [14]. The modulation depth in each color channel provides information on the red, green and blue color content of the surface at each pixel. For test objects where there is sufficient reflected light imaged in each channel, all the data required for color and shape measurement can be obtained in four frames when the sub-fringe phase resolution is obtained by phase stepping. With Fourier transform fringe analysis the same measurement may be obtained from a single frame. The automatic control and analysis software operates such that, for objects with poor reflectivity in a particular color channel, a further sequence of four phase stepped images is acquired to obtain a wrapped phase map at the required fringe frequencies via the other color channels. This paper describes the setup and configuration of the optical system such that crosstalk between the color channels is mitigated. Experimental measurements are presented showing the absolute phase from a number of objects.

2. Principle

Fig. 1. Layout and principle of the measuring system. The system includes a DLP video projector, a 3-chip color CCD camera, and a personal computer. F: Firewire port.

Figure 1 shows the layout of the optical system with a DLP video projector [20], a 3-chip color CCD camera, and a personal computer (PC). A color image whose RGB components are three fringe patterns with different spatial frequency is generated in the PC and projected onto an object surface by the DLP projector. As the phase stepped fringe patterns are generated in software, accurate phase shifts are obtained without miscalibration. The 3-CCD camera captures an image of the fringe pattern from an angularly displaced viewpoint compared to the projector and the images are saved into the computer for post processing. In this paper, four phase stepped images are used with a phase step of π/2 and the wrapped phase map calculated for each channel. Using the wrapped phase maps from the three channels, the absolute phase distribution is calculated via the fringe order and the optimum 3-frequency selection method.

2.1 Optimum three-frequency selection

In full-field fringe projection, the number of fringes projected, Nf, is related to the effective wavelength produced by the projector, λ, by Nf = L/λ, where L is the desired unambiguous measurement range, in this case the field of view of the projector. The optimum frequency selection process defines the numbers of projected fringes to be [14]:

Nfi = Nf0 − (Nf0)^((i−1)/(n−1)),   for i = 1, …, n−1,        (1)

where Nf0 and Nfi are the maximum number of fringes and the number of fringes in the ith fringe set, respectively, and n is the number of fringe sets used. When the maximum number of fringes is 100 and n = 3, Nf1 = 99 and Nf2 = 90. This approach resolves fringe order ambiguity as the beat obtained between Nf0 and Nf1 is a single fringe over the full field of view.
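
As a concrete illustration, the selection rule of Eq. (1) is simple to compute; the short Python sketch below (an illustrative aid, not part of the original system software) reproduces the 100/99/90 example quoted above.

```python
def optimum_fringe_numbers(nf0, n=3):
    """Fringe numbers from Eq. (1): Nfi = Nf0 - Nf0**((i-1)/(n-1)), i = 1..n-1.
    Returns [Nf0, Nf1, ..., Nf(n-1)], rounded to whole fringes."""
    return [nf0] + [round(nf0 - nf0 ** ((i - 1) / (n - 1))) for i in range(1, n)]

print(optimum_fringe_numbers(100, 3))  # -> [100, 99, 90]
```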

2.2 Fringe projection

Since the fringe pattern is digitally created in the computer, the fringe parameters are under direct control such as pitch, phase, modulation and DC intensity for each color channel. The red, green, and blue sinusoidal fringe patterns are generated in the computer as:

Ic(x, y) = DCc + Mc cos(2πx/pc + φ),        (2)

where c = r, g, b corresponds to the red, green, and blue channels, respectively, Ic is the gray value, DCc is the average intensity, Mc is the fringe amplitude, pc is the fringe pitch in pixels, (x, y) are the horizontal and vertical pixel indices of the DLP, and φ is the phase shift. The three image components are combined into a single color image comprising the red, green, and blue channels, as shown in Fig. 2. The 3-CCD camera captures the composite fringe pattern into three channels as:

Ic(m, n) = DCc(m, n) + Mc(m, n) cos[ϕc(m, n) + φ],        (3)

where (m, n) are the pixel indices of the CCD, and Ic, DCc, Mc, and ϕc are the intensity, the average intensity, the modulation depth, and the phase of the captured fringe pattern, respectively.

The control software allows the average intensity and modulation depth to be defined in each color channel independently, with real-time display of the captured intensity profiles from the red, green and blue channels. In the work presented here a four-frame, 90° phase-step algorithm was implemented and the modulation depth Mc was calculated from each color channel [4].
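
To make the acquisition pipeline concrete, the following Python sketch (illustrative only; the function names, the use of NumPy, and the pitch and intensity values are our assumptions, not the authors' software) generates one composite RGB frame per Eq. (2) and recovers the wrapped phase and modulation from four frames stepped by π/2, as in the standard four-frame algorithm [4].

```python
import numpy as np

def composite_fringe_frame(width, height, pitches, phi,
                           dc=(128, 128, 128), amp=(90, 90, 90)):
    """One RGB frame per Eq. (2): I_c = DC_c + M_c*cos(2*pi*x/p_c + phi).
    pitches: fringe pitch in pixels for the (R, G, B) channels; phi: phase step."""
    x = np.tile(np.arange(width, dtype=float), (height, 1))
    frame = np.empty((height, width, 3), dtype=np.uint8)
    for c in range(3):
        intensity = dc[c] + amp[c] * np.cos(2 * np.pi * x / pitches[c] + phi)
        frame[..., c] = np.clip(intensity, 0, 255).astype(np.uint8)
    return frame

def four_frame_phase_and_modulation(i0, i1, i2, i3):
    """Standard four-frame, pi/2-step algorithm [4]:
    wrapped phase = atan2(I3 - I1, I0 - I2); modulation = 0.5 * sqrt of the same terms."""
    s = i3.astype(float) - i1
    c = i0.astype(float) - i2
    return np.arctan2(s, c), 0.5 * np.hypot(s, c)

# Example: generate four composite frames stepped by pi/2 and (after capture)
# compute the wrapped phase and modulation in each camera channel independently.
frames = [composite_fringe_frame(1024, 768, pitches=(1024 / 90, 1024 / 99, 1024 / 100),
                                 phi=k * np.pi / 2) for k in range(4)]
phase_r, mod_r = four_frame_phase_and_modulation(*(f[..., 0] for f in frames))
```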

2.3 Color separation

Since the proposed method utilizes the red, green, and blue channels to hold independent information, it is important that there is minimal crosstalk between them in order to optimize the phase resolution of the wrapped maps produced. For most color CCD cameras and DLPs, the spectra of the red, green, and blue channels are designed to overlap so that there are no color-blind areas. However, the transition regions between the color bands often occur at different wavelengths in CCD cameras than in DLPs. Hence, the information captured in each of the three color channels is not independent.

Fig. 2. The central part of the captured composite fringe pattern with fringe numbers of 90, 99 and 100 in the red, green and blue channels, respectively.

Huang et al. [16] proposed a method to compensate for the coupling effects between channels when projecting the same number of fringes with a phase shift between colors, and found that the compensation scheme reduced the crosstalk errors significantly. Here, the application of this technique has been explored when different numbers of fringes are projected on each channel. The use of new dielectric filters with sharp transitions between the color bands has also been investigated to minimize the crosstalk at source.

To quantify the coupling effects between color channels, a derivative of the calculation introduced by Huang has been implemented [16]. For example, for the red channel, four pure red fringe pattern sets with π/2 phase shift are generated and projected onto a flat white surface. Four full-color images are captured and separated into their RGB components, giving twelve grayscale images. The intensity modulations Mc(m, n), c = r, g, b, from the three channels are calculated, and the ratios of the spatially averaged values of Mc(m, n) between channels indicate the magnitude of the coupling effects, defined for the red channel as:

Rrc = ⟨Mc(m, n)⟩ / ⟨Mr(m, n)⟩,   c = g, b,        (4)

where ⟨ ⟩ denotes the spatial average.

A similar process is used to evaluate the coupling effects for the green and blue channels. Finally, a matrix whose elements are the values from Eq. (4) expressed as percentages is defined to represent the coupling effects between channels:

[ Crr  Crg  Crb ]
[ Cgr  Cgg  Cgb ]        (5)
[ Cbr  Cbg  Cbb ]

where Cij = 100 × Rij, i, j = r, g, b, and the first suffix denotes illumination from the projector and the second suffix the color plane in the detected image. The elements along the main diagonal in Eq. (5) are 100.0, that is, Crr = Cgg = Cbb = 100.0.
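
A small Python sketch of how such a matrix could be assembled from the measured modulation depths is given below; the data layout (a 3×3 array of spatially averaged modulations with rows indexed by the projected color) is our assumption for illustration. In practice the modulation values would come from the four-frame calculation of Section 2.2, averaged over a region such as the 200 × 200 pixel area used in Section 3.2.

```python
import numpy as np

def coupling_matrix(mean_modulation):
    """Percentage coupling matrix of Eq. (5).
    mean_modulation[i][j]: spatially averaged modulation depth measured in camera
    channel j while projecting pure fringes in color i (i, j = 0, 1, 2 for r, g, b)."""
    M = np.asarray(mean_modulation, dtype=float)
    # Scale each row by its own-channel modulation so that C_ii = 100.
    return 100.0 * M / M.diagonal()[:, np.newaxis]
```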

3. Experiments and results

3.1 Experimental System

The system comprises a DLP video projector, a 3-chip color CCD camera with a Firewire port, and a personal computer, as shown in Fig. 1. The projector is from BenQ (Model PB6200) with a one-chip digital micro-mirror device (DMD) and a lateral resolution of up to 1024 × 768 pixels (XGA). The red, green, and blue colors are produced by rapidly spinning a color filter wheel in the projector and synchronously modifying the state of the DMD. The 3-CCD color camera from Hitachi (Model HV-F22F) has a lateral resolution of 1360 × 1024 pixels and may be used at either 8 or 10 bit depth for each color channel. The camera has a standard zoom lens (Pentax) with a focal length from 8 to 48 mm and an adjustable aperture. A personal computer (PC) provides system control. The PC graphics card is set up to drive two monitors, one for the DLP and the other for the control software and viewing the captured data.

Fig. 3. The relationship between input gray and projected intensity for the red, green, and blue channels.

Due to the nonlinear intensity response of the projector, the sinusoidal fringe patterns captured by the camera are distorted, reducing the potential phase resolution. A calibrated precision light meter (cal-LIGHT 400L from the COOKE Corporation) was placed in front of the projector at a working distance of 80 cm, and the projected light intensity was measured for pure red, green, and blue input, as shown in Fig. 3. The projected intensity saturates when the input gray level exceeds 220 for the red and blue channels and 240 for the green channel. Therefore, the chosen ranges of the input gray levels for the three channels are 30–220, 30–240 and 30–220, respectively.

Fig. 4. The relationship between input and captured gray for the red, green, and blue channels. (a) Before compensation and (b) after compensation.

It has been found that the three color channels have different nonlinear responses, including different saturation levels, from the projector and the camera. The camera response is significantly reduced in the red channel compared to the blue and green. It was found that, to optimize the overall performance of the system, the full dynamic range of the red channel must be utilized, even though this meant that the usable input grayscale range for the blue and green had to be reduced to avoid saturation, see Fig. 4(a). To calibrate the nonlinear response, a sequence of constant intensity images was projected onto a white board for each color channel in turn. For each intensity, 10 images were captured to average shot noise effects. Figure 4(a) shows the nonlinear response for the three color channels using data from a group of pixels near the image center. The nonlinear response can be fitted by a fourth-order polynomial (avoiding the regions near saturation and gray levels <40). A look-up table (LUT) is created by treating the captured intensities as the input and the projected intensities as the output of the polynomial. The intensity range of the fringes is reduced to the region where there is reasonable variation in captured intensity, i.e. 40–230, 40–160, and 40–140 for red, green and blue, respectively. Using the calculated LUT, the captured gray levels in the three color channels have approximately linear responses, as shown in Fig. 4(b).
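
As a rough illustration of this linearization step, the sketch below fits the measured response and builds a look-up table; the NumPy calls and the idea of pre-distorting the generated fringe gray levels through the LUT before projection are our reading of the procedure, not code from the original system.

```python
import numpy as np

def build_linearising_lut(captured_levels, projected_levels, lut_size=256):
    """Fit a fourth-order polynomial with the captured intensities as input and
    the projected intensities as output (as described in the text), then tabulate
    it over the full 8-bit range. The calibration data should already exclude the
    saturated region and gray levels below ~40."""
    coeffs = np.polyfit(captured_levels, projected_levels, deg=4)
    table = np.polyval(coeffs, np.arange(lut_size))
    return np.clip(np.rint(table), 0, 255).astype(np.uint8)

# Usage sketch: pre-distort one channel of a generated fringe frame so that the
# *captured* intensity varies (approximately) linearly with the intended value.
# lut = build_linearising_lut(captured, projected)
# corrected_channel = lut[fringe_channel]   # fringe_channel: uint8 image
```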

3.2 Evaluation of Crosstalk

In order to explore the coupling effects between channels, separate red, green, and blue fringe patterns were generated in the PC and projected onto a white surface. Because the three channels have an intensity imbalance, the F-number of the lens, the gain of the CCD and the white balance were adjusted to avoid saturation in the images captured by the camera whilst maximizing the grayscale usage. For each color, four phase-shifted images (0, π/2, π, and 3π/2) were generated and projected onto the white surface. We evaluated the coupling effects for the red, green, and blue channels under three conditions: the standard DLP and camera, using additional filters, and using additional filters without the DLP built-in color filter wheel.

Using the standard DLP and camera, twelve images were captured; one of the four phase-shifted images for each projected color is displayed in Fig. 5. The first column is the captured color fringe pattern, and the second, third, and fourth columns correspond to the red, green, and blue channels, respectively. From the figure, it can be seen that for each color fringe pattern the other two channels contain a weak fringe image. As the spectral separation between the detected channel and the projected fringe color increases, the fringe pattern almost disappears, for example, in the blue channel from a red projected fringe pattern and in the red channel from a blue projected fringe pattern.

The coupling matrix in Eq. (5) gives a quantitative evaluation of the coupling effects. In order to improve the accuracy of calculation, data was averaged over a 200 x 200 pixel area in the middle of the captured image to calculate the coupling effects. Without extra filters, i.e. using the built-in filters in the projector and camera, the following coupling matrix was obtained:

[ 100.0   17.5    6.8 ]
[  29.1  100.0   15.7 ]        (6)
[   7.7   21.1  100.0 ]

The coupling effects are weaker for red and blue projected fringe patterns (corresponding to the top and bottom rows of the matrix), while they are stronger for a green projected fringe pattern.

Using three additional dielectric color filters with 50% transition wavelengths of 465 nm and 650 nm for the blue and red images respectively and 540 nm centre wavelength for the green image (Comar product numbers 465 IK 50, 540 IB 50 and 650 IY 50) in front of the camera, twelve images were captured and the crosstalk matrix is now:

[ 100.0    1.5    1.4 ]
[  18.2  100.0    3.5 ]        (7)
[   7.4    3.2  100.0 ]

These results show that the coupling effects decrease dramatically when combinations of color filters are placed in front of the camera. However, this has the effect of reducing the illumination which must be compensated by increasing the gain of the camera or the aperture of the lens.

Fig. 5. Coupling effects between color channels using the standard DLP and camera. The three rows correspond to the coupling effects of the red, green, and blue channels, respectively. For each row, the first is the captured color image and the other three are the red, green, and blue components of the first.

The operation of the system with only the new dielectric filters was evaluated as this represents what may be achieved with a single filter wheel and filters that are better matched to the spectral characteristics of the 3-chip CCD. For this experiment, the built-in color filter wheel in the projector was removed and each new filter inserted in turn in front of the camera. One color fringe pattern for each projected color and its three channels are displayed in Fig. 6. The coupling effects for this case are shown in the following matrix:

[ 100.0    5.7    6.1 ]
[  22.0  100.0    5.3 ]        (8)
[   7.0    3.0  100.0 ]

Again, the coupling effects for the green projected fringe pattern are larger than those for the red and blue fringe patterns. Compared to Eq. (6), the coupling effects are improved. Compared to the filter wheel in combination with the dielectric filters, Eq. (7), the results show increased crosstalk mainly in the red channel but nearly equivalent performance for the green and blue projected fringe patterns.

Fig. 6. Coupling effects between color channels with a filter in front of the camera, but without the built-in color filter wheel in the projector. The same illustrations are used as in Fig. 5.

3.3 Phase noise evaluation

The presence of crosstalk between the color channels increases the noise in the phase measurements obtained. The phase noise has been evaluated for the three color channels in four situations: (i) projecting the red, green, and blue fringe patterns separately with the standard DLP and camera; (ii) projecting the RGB fringe patterns simultaneously, as a composite fringe pattern, with the standard DLP and camera; (iii) projecting a composite fringe pattern with the standard DLP and camera and compensating for the coupling effects using Huang’s method [16]; and (iv) with the built-in color filter wheel removed but with the new dielectric color filters. Since the modulation depth affects the phase noise, the gain of the camera and the aperture of the lens were adjusted to make the captured fringe pattern have similar modulation in each color channel and for each setup. Table 1 shows the phase resolution (defined here as 2π/σϕ, where σϕ is the standard deviation of the phase noise) in these situations. The phase resolutions for the standard DLP and camera with separate color projection are 153, 156, and 120 for the R, G, and B channels, respectively. With composite color projection, crosstalk effects are present and the phase resolutions reduce to 44, 93, and 121. The phase resolution in the red channel is reduced considerably, as this channel contains the greatest amount of crosstalk (from the green channel; see the first column of the matrix in Eq. (6)). When Huang’s method was used to compensate for the coupling effects, the phase resolution for the red channel improved (75 versus 44) while it was almost the same for the green and blue channels. Therefore, the crosstalk in the red channel with composite fringe projection can be compensated using Huang’s method.
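
The compensation scheme of [16] is not reproduced here; the sketch below shows one plausible way such a correction could be applied, assuming a simple linear coupling model in which each detected channel is a weighted sum of the three projected signals with weights given by the measured matrix of Eq. (6). The function name and data layout are illustrative assumptions, not the exact method of [16].

```python
import numpy as np

def unmix_crosstalk(rgb_image, coupling_percent):
    """Pixelwise linear unmixing of channel crosstalk.
    rgb_image:        (H, W, 3) captured image, channels ordered r, g, b.
    coupling_percent: 3x3 matrix as in Eqs. (5)-(8); rows = projected color,
                      columns = detected channel, diagonal = 100.
    Assumed model: detected = true @ A, with A = coupling_percent / 100,
    so the estimate of the uncoupled signals is detected @ inv(A)."""
    A = np.asarray(coupling_percent, dtype=float) / 100.0
    flat = rgb_image.reshape(-1, 3).astype(float)
    return (flat @ np.linalg.inv(A)).reshape(rgb_image.shape)
```

In practice such a correction could be applied to the four phase-stepped frames before the wrapped phase is computed, or to the modulation images, depending on how the compensation of [16] is realised.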

When the built-in color filter wheel in the projector is removed, the sinusoidal fringe pattern generated is black and white. Placing the dielectric red, green and blue filters in front of the camera in turn, the phase resolutions obtained in the three color channels are 141, 180, and 132 (see Table 1). The increase in phase resolution for the green channel compared to the standard DLP and camera setup is due to a combination of slightly increased fringe modulation and reduced camera gain. To simulate composite projection with the new filters, the appropriate crosstalk is introduced by adding together the images in the same color channel and with the same phase step from red, green and blue fringe projection. This process increases the intensity noise by a factor of √3. A measure of the phase resolution that would be obtained in composite fringe projection has been obtained by simulating the phase noise variation with respect to intensity noise for the same fringe modulation and offset as determined from the experiments. With the new dielectric filters, the composite results are consistent with the separate fringe projection results, allowing for the crosstalk matrix measured, see Eq. (8). The new filters offer a significant benefit in phase noise for composite projection compared to the standard DLP and camera. Because the coupling effects between channels are small, the phase resolutions in the red and green are much better than with the standard DLP and camera: red 102 versus 44, green 177 versus 93; with performance in the blue being comparable.

Table 1. The phase resolution in the three color channels. For separate projection, the fringe number is 100 for all three channels. For the composite situation, the fringe numbers for the R, G and B channels are 90, 99 and 100, respectively.

3.4 Composite fringe pattern

For measuring static objects where time is not critical, the best results will be obtained using separate projection of the R, G and B fringe patterns with either the standard DLP and camera or the new filters. Given the smallest phase resolution of 120, in the blue channel, the scaling factor limit for optimum 3-frequency interferometry with 6σ reliability in the fringe order calculation is 14.1 [14]. The limit on the maximum number of projected fringes is then Nf0 = 200, giving a dynamic range of up to 24000.
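
For the reader's convenience, the arithmetic behind these figures, under our reading of the optimum selection scheme of [14] for n = 3, is that the maximum fringe count scales as the square of the scaling factor and the dynamic range is the fringe count multiplied by the per-fringe phase resolution:

```latex
% Worked numbers (our interpretation of how the quoted limits follow):
N_{f0} \le s^{\,n-1} = 14.1^{2} \approx 200,
\qquad
\text{dynamic range} \approx N_{f0} \times \frac{2\pi}{\sigma_{\phi}}
                     = 200 \times 120 = 24\,000 .
```

The composite-projection figures quoted below (8.8² ≈ 78 fringes and 78 × 75 = 5850) follow from the same relations.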

For composite fringe pattern projection, all the information required can be obtained in four RGB frames by using the phase stepping algorithm, and in one RGB frame by using FFT phase analysis. Huang’s approach is used to reduce the coupling effects and thereby improve the phase resolution in the red channel. Using the standard DLP and camera, the scaling factor limit is 8.8 and the maximum number of projected fringes is approximately Nf0 = 78, giving an overall dynamic range of up to 5850. Therefore, using optimum 3-frequency analysis, three fringe patterns were generated with 81, 80, and 72 fringes. With the optics used, it was found that the red channel contained a significant level of chromatic distortion compared to the blue and green channels. It was also found that the generation of a single beat fringe is critical for successful fringe order calculation, in this case by forming a beat between 81 and 80 projected fringes. Therefore, to reduce the overall effects of chromatic distortion on the fringe order calculation, the projected numbers of fringes were set to 81, 80 and 72 for the blue, green and red channels, respectively. The phase stepped composite RGB fringe patterns were generated and projected in sequence such that the phase and color information were captured in four frames.

A ceramic statue was profiled by projecting the composite fringe patterns onto its surface. A white plate was placed at the back of the measurement volume to test the performance of measuring objects with step height changes. Figure 7 shows the four captured color images and their component grayscale images corresponding to the red, green, and blue channels in the second, third, and fourth columns, respectively. Each phase stepped image is shown in a separate row of the figure. The corresponding wrapped phase maps for the RGB channels are shown in Fig. 8(a)–(c). Figure 8(d) is the absolute phase map obtained using the proposed method with optimum 3-frequency analysis, and it can be seen that the phase of both the statue and the back plate has been correctly retrieved. During the process of phase calculation, pixels with a modulation of less than 15 gray levels are marked as invalid and these pixels are shown in black. The noise on the right side of the unwrapped phase map is due to chromatic aberration and could be further mitigated by employing distortion correction separately on the R, G, B image channels. The modulations in the red, green and blue channels of the measured statue and the back plate were calculated, as shown in Fig. 9. Figure 10 displays a 3D representation of the shape and a pseudo color representation with appropriate lighting based on the surface gradient. The modulation images show good consistency between the color channels, as expected for this object. Shadowing due to the angular offset between illumination and viewing can be overcome by incorporating data from multiple views.
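
To indicate how the three wrapped phase maps are combined into the absolute phase of Fig. 8(d), the following Python sketch implements a basic hierarchical (beat-based) fringe-order calculation for fringe numbers such as 100/99/90 or 81/80/72; it illustrates the principle only and omits the reliability optimisation and calibration offsets of the full method in [14].

```python
import numpy as np

TWO_PI = 2.0 * np.pi

def wrap(phase):
    """Wrap a phase map into [0, 2*pi)."""
    return np.mod(phase, TWO_PI)

def unwrap_against(wrapped, reference, scale):
    """Unwrap 'wrapped' pixelwise using an already-absolute, coarser 'reference'
    phase that spans 'scale' times fewer fringes over the same field."""
    order = np.rint((reference * scale - wrapped) / TWO_PI)
    return wrapped + TWO_PI * order

def absolute_phase(phi0, phi1, phi2, n0=100, n1=99, n2=90, valid_mask=None):
    """Hierarchical fringe-order recovery from three wrapped phase maps with
    fringe numbers n0 > n1 > n2, e.g. (100, 99, 90) or (81, 80, 72)."""
    beat_01 = wrap(phi0 - phi1)                    # n0 - n1 = 1 fringe: unambiguous
    beat_02 = unwrap_against(wrap(phi0 - phi2),    # n0 - n2 fringes
                             beat_01, (n0 - n2) / (n0 - n1))
    result = unwrap_against(wrap(phi0), beat_02, n0 / (n0 - n2))
    if valid_mask is not None:                     # e.g. modulation >= 15 gray levels
        result = np.where(valid_mask, result, np.nan)
    return result
```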

Fig. 7. Four captured color images of a plastic statue and their corresponding red, green, and blue channels. The red, green, and blue channels correspond to the second, third, and fourth columns, respectively. For each channel, the four gray images have π/2 phase shift in turn.

Fig. 8. The wrapped phase and phase-unwrapped maps of the measured statue. (a) red channel, (b) green channel, (c) blue channel, and (d) the absolute phase map.

Fig. 9. The modulation maps in the three channels of the measured statue and the back plate. (a) red channel, (b) green channel, and (c) blue channel.

Fig. 10. 3D representation of the shape and color information captured. (a) Point cloud of statue and back plate, (b) gradient shaded view of statue shape from a single view.

Composite fringe projection offers the potential for high speed data acquisition of both shape and color. Using the phase stepping approach described, four image frames are needed. Other workers have reported high speed phase shifting techniques in fringe projection to give potential measurement rates of 100 Hz [21]. Alternatively, Fourier transform phase analysis has the potential to produce multi-frequency phase measurement from a single RGB frame.

4. Conclusions

In this paper we have shown that combined shape and color information can be obtained in a time efficient manner by using RGB fringe projection and a 3-chip CCD camera in combination with optimum 3-frequency interferometry analysis. The effects of crosstalk between color channels have been quantified, showing that significant levels, up to 30%, are present with the standard filters in the projector and camera. It has been shown that by measuring the crosstalk a simple compensation scheme may be implemented to increase the phase resolution obtained. Further reductions in crosstalk, and correspondingly higher phase resolution, can be achieved by using new dielectric filters in the projector that offer negligible chromatic overlap.

The presence of chromatic aberration in the projector and imaging optics introduces distortions between the images obtained in the three color channels. By suitable implementation of the optimum 3-frequency approach it has been shown that reliable fringe order calculation can be obtained over the majority of the field, with fringe order calculation errors confined to the extremes of the field of view. Further work will be done in this area to compensate for the chromatic distortion.

Acknowledgments

The authors would like to thank Scottish Enterprise for funding this work through the Proof of Concept scheme, grant reference 5EN-OPT002.

References and links

1. F. Chen, G. M. Brown, and M. Song, “Overview of three-dimensional shape measurement using optical methods,” Opt. Eng. 39, 10–22 (2000).

2. M. Petrov, A. Talapov, T. Robertson, A. Lebedev, A. Zhilyaev, and L. Polonskiy, “Optical 3D digitizers: bringing life to the virtual world,” IEEE Comput. Graph. Appl. 18, 28–37 (1998).

3. F. Blais, “Review of 20 years of range sensor development,” J. Electron. Imaging 13, 231–240 (2004).

4. K. Creath, “Phase measurement interferometry techniques,” in Progress in Optics XXVI, E. Wolf, ed. (North Holland, Amsterdam, 1988).

5. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3D object shapes,” Appl. Opt. 22, 3977–3982 (1983).

6. X. Y. Su and W. J. Chen, “Reliability-guided phase unwrapping algorithm: a review,” Opt. Lasers Eng. 42, 245–261 (2004).

7. J. M. Huntley and H. O. Saldner, “Temporal phase-unwrapping algorithm for automated interferogram analysis,” Appl. Opt. 32, 3047–3052 (1993).

8. H. O. Saldner and J. M. Huntley, “Temporal phase unwrapping: application to surface profiling of discontinuous objects,” Appl. Opt. 36, 2770–2775 (1997).

9. J. M. Huntley and H. O. Saldner, “Error-reduction methods for shape measurement by temporal phase unwrapping,” J. Opt. Soc. Am. A 14, 3188–3196 (1997).

10. H. O. Saldner and J. M. Huntley, “Shape measurement by temporal phase unwrapping: comparison of unwrapping algorithms,” Meas. Sci. Technol. 8, 986–992 (1997).

11. C. E. Towers, D. P. Towers, and J. D. C. Jones, “Optimum frequency selection in multifrequency interferometry,” Opt. Lett. 28, 887–889 (2003).

12. D. P. Towers, C. E. Towers, and J. D. C. Jones, “Phase Measuring Method and Apparatus for Multi-Frequency Interferometry,” International Patent Application Number PCT/GB2003/003744.

13. C. E. Towers, D. P. Towers, and J. D. C. Jones, “Generalized frequency selection in multifrequency interferometry,” Opt. Lett. 29, 1348–1350 (2004).

14. C. E. Towers, D. P. Towers, and J. D. C. Jones, “Absolute fringe order calculation using optimised multi-frequency selection in full-field profilometry,” Opt. Lasers Eng. 43, 788–800 (2005).

15. G. Hausler and D. Ritter, “Parallel three-dimensional sensing by color-coded triangulation,” Appl. Opt. 32, 7164–7169 (1993).

16. P. S. Huang, Q. Y. Hu, F. Jin, and F. P. Chiang, “Color-encoded digital fringe projection technique for high-speed three-dimensional surface contouring,” Opt. Eng. 38, 1065–1071 (1999).

17. O. A. Skydan, M. J. Lalor, and D. R. Burton, “Technique for phase measurement and surface reconstruction by use of colored structured light,” Appl. Opt. 41, 6104–6117 (2002).

18. S. Kakunai, T. Sakamoto, and K. Iwata, “Profile measurement taken with liquid-crystal gratings,” Appl. Opt. 38, 2824–2828 (1999).

19. A. Pfortner and J. Schwider, “Red-green-blue interferometer for the metrology of discontinuous structures,” Appl. Opt. 42, 667–673 (2003).

20. J. M. Younse, “Mirrors on a chip,” IEEE Spectrum 30, 27–31 (1993).

21. P. S. Huang, C. P. Zhang, and F. P. Chiang, “High-speed 3-D shape measurement based on digital fringe projection,” Opt. Eng. 42, 163–168 (2003).
