Three-dimensional reconstruction from a fringe projection system through a planar transparent medium

Open Access

Abstract

A vision measurement system is placed in a protective cover made of a transparent medium to avoid environmental influences. Due to the deflection of light rays at the front and rear surfaces of the transparent medium, the imaging position of an object on the camera target plane deviates, which causes traditional vision detection methods based on the triangulation principle to produce large measurement errors. This work introduces a three-dimensional (3D) reconstruction method for a fringe projection system operating through a planar transparent medium. We derive the coordinate transformation relationship between a real-object point and the pseudo-object point caused by light refraction, based on Snell's law of flat refraction. Based on this relationship, a modified fringe projection method is proposed for unbiased 3D reconstruction. Two experiments, 3D shape measurement of a white plate with ring markers and 3D shape measurement of a regular spherical object, are conducted. The results demonstrate the effectiveness of the proposed method in such a measurement environment.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Non-contact three-dimensional (3D) measurement based on vision systems has been widely used in the manufacturing industry [1], biomedical science [2] and cultural relics protection [3], owing to its advantages of low cost, nondestructive operation and high accuracy. Generally, a vision system and a measured object are placed in a uniform medium (such as air), that is, light propagates in a homogeneous medium. Vision measurement methods based on the triangulation principle can readily achieve the 3D reconstruction of an object under such measurement conditions.

In some special measurement scenarios, such as 3D shape measurement of objects through a transparent observation window in a wind tunnel or 3D reconstruction of cultural relics preserved in protective covers, it may be necessary to measure the 3D topography of objects located in different media. Because the refractive indices of different media differ, the light reflected by an object is deflected at the interface between the two media, resulting in a deviation of the imaging position of the object on the camera target plane [4,5]. Therefore, using the traditional vision measurement methods [6–8] will introduce large reconstruction errors. To obtain correct results, many correction methods have been developed to eliminate the influence of light deflection. These methods mainly follow two ideas: one is to fuse the refraction parameters of the media into the camera imaging model for calibration; the other is to calibrate the system parameters first and then eliminate the influence of light deflection. The latter is widely used in practice because of its high precision, ease of operation and simple mathematical model. It can be grouped roughly into two types.

The first type comprises image rectification methods [9,10], which first eliminate the image distortion caused by refraction and then use the undistorted image to carry out 3D reconstruction. Specifically, by matching the feature points of refracted and un-refracted images of the same scene, the transformation relationship between distorted points and undistorted points is established. The distorted image is then transformed into an undistorted image, from which the 3D data of a measured object can be accurately calculated. The advantage of image rectification methods is that they do not need to consider the refractive index of the medium or the position and orientation of the refraction interface; they only need to find corresponding points between refracted and un-refracted images of the same scene. However, the depth from an object to the camera directly affects the conversion relationship between un-refracted and refracted images, so the feature points must be distributed near the object.

As alternatives to the image rectification methods, geometrical correction methods have been proposed to eliminate the effect of light deflection. According to the ray reversal principle, such methods track the ray paths emitted from two vision devices and then reconstruct the 3D shape of an object. Gong et al. [11] used a ray tracing algorithm to calculate the equations of the rays emitted from corresponding points on two camera images after multiple deflections, and then solved for the intersection of the two rays to obtain the 3D coordinates of the object. Based on the idea of ray tracing, Su et al. [12] used a dual-camera stereo digital image correlation algorithm to search for the stereo and temporal correspondences between speckle images. The refracted light paths through air-glass-water were then constructed to calculate the 3D data of a measured object. By subtracting the 3D data before and after deformation, the 3D displacement distribution can be retrieved. Later, this method was used to measure propeller deformation in a cavitation tunnel, successfully reconstructing the full-field 3D displacement distribution of the propeller blades at different time points [13]. However, the method requires random speckle patterns to be made manually on the object surface in advance for matching the corresponding points on the two camera images. Sun et al. [14,15] discussed in depth the problem of matching corresponding points on underwater left and right camera images, and proposed automatic feature point matching algorithms. Although geometric correction methods work well, two deficiencies remain to be solved. One is the complexity of the calculation process, owing to the requirement to calculate the ray equation after each deflection. The other is the difficulty of matching corresponding points on the left and right camera images due to the refraction effect.

Recently, a simple 3D reconstruction method for underwater objects that uses the geometric relationship between an object point and an imaging point was proposed [16]. This method does not require the complicated processes of calculating the ray equations, but it requires the camera optical axis to be parallel to the normal vector of the planar medium. Inspired by the idea of using this geometric relationship, a new method for 3D reconstruction with a fringe projection system through a planar transparent medium is introduced. We derive the relationship among a pseudo-object point generated by light refraction, a real-object point and an imaging point. Based on this relationship, a modified fringe projection method, which allows a flexible camera position, is proposed for reconstructing unbiased 3D data. The proposed method not only avoids the complicated processes of calculating the ray equations, but also solves the problem of difficult image matching. In the following sections, the principle of the proposed refractive 3D reconstruction method is first described. Then, the effectiveness of this method is validated by 3D shape measurement of a white plate with ring markers at different positions and orientations, as well as 3D shape measurement of a spherical object with a known diameter. Finally, the contributions of our work are summarized.

2. Refractive fringe projection measurement

2.1 Geometry of refractive fringe projection system

The geometry of a refractive fringe projection system is established based on Snell's law of flat refraction. For simplicity, we use a glass with two parallel surfaces as the planar transparent medium. A fringe projection system, composed of a digital micro-mirror device (DMD) projector and a charge coupled device (CCD) camera, and an object point are placed on opposite sides of the glass. The refractive geometry of the fringe projection system through the glass is shown in Fig. 1. Here, Op is the optical center of the projector, Oc = [x0, y0, z0]T is the optical center of the camera, []T denotes transposition, h is the distance from the optical center of the camera to the glass surface, d is the thickness of the glass, na = 1 is the refractive index of air, and ng is the refractive index of the glass.


Fig. 1. Geometry of the refractive fringe projection system.


In Fig. 1, the point Pp on the DMD projects a light ray in air onto the glass surface; after two refractions the ray ends at the real-object point P in air. The light reflected by point P is refracted twice at the interfaces π1 and π2, intersecting them at points P1 and P2 respectively, and is then imaged at point Pc on the CCD. The corresponding angles of incidence and refraction are denoted by α, β and γ. Because the interfaces π1 and π2 are parallel, α is equal to γ. To reconstruct the coordinates (xw, yw, zw) of the real-object point P, the coordinate transformation relationship between the pseudo-object point P′, defined below, and point P needs to be established.

To derive this transformation relationship, we define the glass surface as the X-Y plane. The following equations are deduced from the geometrical relationship shown in Fig. 1,

$$\left\{ \begin{array}{l} x_w = x' \\ y_w = y' \\ z_w = z' + d_s \end{array} \right.$$
where ds is the vertical distance between point P′ and point P. The coordinates (x′, y′, z′) of the pseudo-object point P′ are the result reconstructed from Pp and Pc with a traditional fringe projection method that does not consider the influence of refraction.

Next, our goal is to calculate ds. According to the triangular geometric relationships shown in Fig. 1 (with s1, s2, s3 and dm the distances labeled there), we obtain the following equations

$$\tan\alpha = \frac{s_1}{d_s + d_m}$$
$$\tan\beta = \frac{s_2}{d}$$
$$\tan\gamma = \frac{s_1 + s_2}{d + d_m} = \frac{s_3}{h}$$
Based on the law of refraction, we get
$$\tan\beta = \frac{1}{\sqrt{\left( \dfrac{n_g}{\sin\gamma} \right)^2 - 1}}$$

By solving Eqs. (2)–(5), ds can be expressed as:

$$d_s = d\left( 1 - \frac{\cos\gamma}{\sqrt{n_g^2 - \sin^2\gamma}} \right)$$
where
$$\gamma = \arctan\frac{\sqrt{(x' - x_0)^2 + (y' - y_0)^2}}{z' - z_0}$$

Because (x′, y′, z′) is known, ds can be calculated. Note that [x0, y0, z0]T can be acquired in advance from the intrinsic and extrinsic parameters of the system.

Finally, substituting ds into Eq. (1), we obtain the expression for the real-object point P(xw, yw, zw):

$$\left\{ \begin{array}{l} x_w = x' \\ y_w = y' \\ z_w = z' + d\left( 1 - \dfrac{\cos\gamma}{\sqrt{n_g^2 - \sin^2\gamma}} \right) \end{array} \right.$$

If the glass medium is absent, that is, ng = 1 in Eq. (7), then $z_w = z'$. This means that the pseudo-object point P′ coincides with the real-object point P in the absence of light deflection. Conversely, if the glass medium is present, (xw, yw, zw) can be determined as long as the coordinates (x′, y′, z′) are known. Here, we use a modified fringe projection method to calculate the coordinates (x′, y′, z′) of the pseudo-object point.
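
As an illustration, once (x′, y′, z′), the camera center Oc, the glass thickness d and the refractive index ng are available, the correction of Eqs. (6)–(7) reduces to a few lines of code. The following Python sketch is a hypothetical helper written by us (not code from the paper), with the pseudo-object point expressed in the reference frame whose X-Y plane lies on the glass surface:

```python
import numpy as np

def correct_refraction(p_pseudo, cam_center, d, n_g):
    """Map a pseudo-object point P' to the real-object point P via Eqs. (6)-(7).

    p_pseudo   : (x', y', z') reconstructed while ignoring refraction
    cam_center : camera optical center Oc = (x0, y0, z0) in the same frame
    d          : glass thickness
    n_g        : refractive index of the glass (air is taken as 1)
    """
    x, y, z = p_pseudo
    x0, y0, z0 = cam_center
    # incidence angle gamma of the viewing ray (unnumbered expression after Eq. (6))
    gamma = np.arctan(np.hypot(x - x0, y - y0) / (z - z0))
    # apparent depth shift d_s caused by the flat refraction, Eq. (6)
    d_s = d * (1.0 - np.cos(gamma) / np.sqrt(n_g**2 - np.sin(gamma)**2))
    # Eq. (7): only the depth coordinate changes
    return np.array([x, y, z + d_s])
```

For ng = 1 the shift ds vanishes and the unrefracted case discussed above is reproduced.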

2.2 Refractive 3D reconstruction based on fringe projection

A fringe projection system, which consists of a camera and a projector, needs to be calibrated before performing 3D reconstruction. The calibration determines the intrinsic and extrinsic parameters of the system. The intrinsic parameters are related to the internal geometry and optical properties of the system and remain constant after system calibration. The extrinsic parameters represent the rotations and translations between the world coordinate system and the camera coordinate system, and between the world coordinate system and the projector coordinate system. In general, the defined world coordinate system does not satisfy the conditions of Eq. (7); that is, the X-Y plane of the world coordinate system does not coincide with the glass surface. Therefore, the position and orientation of the world coordinate system must be adjusted, and the corresponding extrinsic parameters must be re-calculated for refractive 3D reconstruction.

Figure 2 shows the coordinate systems of the refractive fringe projection system. The original world coordinate system (Oo; Xo, Yo, Zo) is established on the ring board for system calibration when the glass is absent. The reference coordinate system (Or; Xr, Yr, Zr) is defined on the glass surface, which contains N corner points of the checkerboard. The coordinates of these points in the reference and original world coordinate systems are $(x_r^i, y_r^i, z_r^i)$ and $(x_o^i, y_o^i, z_o^i)$, respectively, with i = 1, …, N. The relationship between them can be described as:

$$\begin{bmatrix} x_r^i \\ y_r^i \\ z_r^i \end{bmatrix} = \mathbf{R}_n \begin{bmatrix} x_o^i \\ y_o^i \\ z_o^i \end{bmatrix} + \mathbf{T}_n$$
where Rn and Tn respectively represent the 3 × 3 rotation matrix and the 3 × 1 translation vector from the original world coordinate system to the reference coordinate system. Let Rc and Tc denote the rotation matrix and translation vector from the original world coordinate system to the camera coordinate system (Oc; Xc, Yc, Zc), and let Rp and Tp denote the rotation matrix and translation vector from the original world coordinate system to the projector coordinate system (Op; Xp, Yp, Zp). Due to the change of reference position, the corresponding extrinsic parameters need to be updated. Accordingly, the rotation matrix and translation vector from the reference coordinate system to the camera coordinate system become RnRc and RnTc, respectively, and the rotation matrix and translation vector from the reference coordinate system to the projector coordinate system become RnRp and RnTp, respectively.


Fig. 2. Coordinate system of the refractive fringe projection system.


For simplicity, we move the origin Or of the reference coordinate system to the origin Oc of the camera coordinate system, as shown by the blue lines in Fig. 2, which forms the new reference coordinate system ($O_r^{\prime}$; $X_r^{\prime}$, $Y_r^{\prime}$, $Z_r^{\prime}$). Assume that $\mathrm{X}_r$, $\mathrm{X}_p$, and $\mathrm{X}_r^{\prime}$ are the 3D coordinates of an arbitrary point in the reference coordinate system, the projector coordinate system and the new reference coordinate system, respectively. The relationships among $\mathrm{X}_r$, $\mathrm{X}_p$, and $\mathrm{X}_r^{\prime}$ can be expressed as follows:

$$\mathrm{X}_p = \mathbf{R}_n \mathbf{R}_p \mathrm{X}_r + \mathbf{R}_n \mathbf{T}_p$$
$$\mathrm{X}_r^{\prime} = \mathbf{I}\,\mathrm{X}_r - O_c$$
where I denotes the 3 × 3 identity matrix. The relationship from the new reference coordinate system to the projector coordinate system can be deduced from Eqs. (9) and (10):
$$\mathrm{X}_p = \mathbf{R}_n \mathbf{R}_p \mathrm{X}_r^{\prime} + \mathbf{R}_n (\mathbf{T}_p + \mathbf{R}_p O_c)$$

The new extrinsic parameters of the projector are RnRp and Rn(Tp + RpOc). Obviously, the new extrinsic parameters of the camera are RnRc and 0, as shown in Fig. 2. It is noteworthy that the camera intrinsic parameter matrix Ac and the projector intrinsic parameter matrix Ap remain unaltered.
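
To make the bookkeeping explicit, the sketch below (illustrative only; the function and variable names are ours) assembles the modified projection matrices from the calibrated parameters, anticipating Hc = Ac[RnRc, 0] and Hp = Ap[RnRp, Rn(Tp + RpOc)] used in the next step:

```python
import numpy as np

def modified_projection_matrices(A_c, A_p, R_c, R_p, T_p, R_n, O_c):
    """Build the projection matrices Hc and Hp for refractive reconstruction.

    A_c, A_p : 3x3 intrinsic matrices of the camera and projector
    R_c, R_p : rotations from the original world frame to the camera / projector frames
    T_p      : translation from the original world frame to the projector frame
    R_n      : rotation from the original world frame to the reference frame (Eq. (8))
    O_c      : camera optical center expressed in the reference frame
    """
    O_c = O_c.reshape(3, 1)
    T_p = T_p.reshape(3, 1)
    # camera: after moving the reference origin to Oc its translation vanishes
    H_c = A_c @ np.hstack((R_n @ R_c, np.zeros((3, 1))))
    # projector: new extrinsics Rn*Rp and Rn*(Tp + Rp*Oc), cf. Eq. (11)
    H_p = A_p @ np.hstack((R_n @ R_p, R_n @ (T_p + R_p @ O_c)))
    return H_c, H_p
```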

Using the new extrinsic and intrinsic parameters of the system, we can calculate the 3D data of the pseudo-object point P′. Let Hc = Ac [RnRc, 0]; then the transformation from the 3D world coordinates (x′, y′, z′) of P′ to its camera pixel coordinates Pc(uc, vc) can be expressed by

$$s_c \begin{bmatrix} u_c \\ v_c \\ 1 \end{bmatrix} = \mathbf{H}_c \begin{bmatrix} x' \\ y' \\ z' \\ 1 \end{bmatrix} = \begin{bmatrix} h_c^{11} & h_c^{12} & h_c^{13} & 0 \\ h_c^{21} & h_c^{22} & h_c^{23} & 0 \\ h_c^{31} & h_c^{32} & h_c^{33} & 0 \end{bmatrix} \begin{bmatrix} x' \\ y' \\ z' \\ 1 \end{bmatrix}$$
where sc is a scale factor. Similarly to Eq. (12), let Hp = Ap [RnRp, Rn(Tp + RpOc)]; then the transformation from the 3D world coordinates (x′, y′, z′) of P′ to its projector pixel coordinates Pp(up, vp) can be expressed by
$$s_p \begin{bmatrix} u_p \\ v_p \\ 1 \end{bmatrix} = \mathbf{H}_p \begin{bmatrix} x' \\ y' \\ z' \\ 1 \end{bmatrix} = \begin{bmatrix} h_p^{11} & h_p^{12} & h_p^{13} & h_p^{14} \\ h_p^{21} & h_p^{22} & h_p^{23} & h_p^{24} \\ h_p^{31} & h_p^{32} & h_p^{33} & h_p^{34} \end{bmatrix} \begin{bmatrix} x' \\ y' \\ z' \\ 1 \end{bmatrix}$$
where sp is a scale factor. By eliminating the scale factors sc and sp from Eqs. (12) and (13), we get
$$\begin{bmatrix} u_c h_c^{31} - h_c^{11} & u_c h_c^{32} - h_c^{12} & u_c h_c^{33} - h_c^{13} \\ v_c h_c^{31} - h_c^{21} & v_c h_c^{32} - h_c^{22} & v_c h_c^{33} - h_c^{23} \\ u_p h_p^{31} - h_p^{11} & u_p h_p^{32} - h_p^{12} & u_p h_p^{33} - h_p^{13} \\ v_p h_p^{31} - h_p^{21} & v_p h_p^{32} - h_p^{22} & v_p h_p^{33} - h_p^{23} \end{bmatrix} \begin{bmatrix} x' \\ y' \\ z' \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ h_p^{14} - u_p h_p^{34} \\ h_p^{24} - v_p h_p^{34} \end{bmatrix}$$
The coordinates (x′, y′, z′) can be obtained once the pixels (uc, vc) and (up, vp) in Eq. (14) are matched. Substituting (x′, y′, z′) into Eq. (7), we can determine the coordinates (xw, yw, zw) of the real-object point P. Note that Eq. (14) contains four equations with three unknowns. In 3D reconstruction, only one of the last two equations is valid, depending on the direction of the projected fringe patterns: when projecting vertical fringe patterns, the third equation is used; when projecting horizontal fringe patterns, the fourth equation is used. Here, we use vertical fringe patterns to establish the correspondence from camera images to projector images.
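
For vertical fringes, Eq. (14) therefore reduces to three equations in three unknowns: the two camera rows plus the third projector row. A minimal sketch of this triangulation step, with hypothetical names and not the authors' own solver, is:

```python
import numpy as np

def triangulate_pseudo_point(H_c, H_p, u_c, v_c, u_p):
    """Solve the reduced form of Eq. (14) for the pseudo-object point (x', y', z')
    when vertical fringe patterns are projected (only u_p is available)."""
    A = np.array([
        [u_c*H_c[2, 0] - H_c[0, 0], u_c*H_c[2, 1] - H_c[0, 1], u_c*H_c[2, 2] - H_c[0, 2]],
        [v_c*H_c[2, 0] - H_c[1, 0], v_c*H_c[2, 1] - H_c[1, 1], v_c*H_c[2, 2] - H_c[1, 2]],
        [u_p*H_p[2, 0] - H_p[0, 0], u_p*H_p[2, 1] - H_p[0, 1], u_p*H_p[2, 2] - H_p[0, 2]],
    ])
    b = np.array([
        0.0,                        # camera rows: the fourth column of Hc is zero (Eq. (12))
        0.0,
        H_p[0, 3] - u_p*H_p[2, 3],  # projector row, cf. the third line of Eq. (14)
    ])
    return np.linalg.solve(A, b)    # (x', y', z')
```

The resulting pseudo-object point is then passed through Eq. (7) (e.g. the correct_refraction sketch above) to obtain (xw, yw, zw).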

A series of vertical sinusoidal fringe patterns are projected onto the surface of an object, and a CCD camera captures the reflected fringe images. Using a standard four-step phase-shifting algorithm and the optimum three-fringe selection method [17], we obtain an absolute phase map. Let φ denote the absolute phase of pixel Pc(uc, vc) on the CCD image shown in Fig. 1. The position up of the corresponding line on the DMD image with the same absolute phase can be calculated as:

$$u_p = \frac{W\varphi}{2\pi T} + \frac{W}{2}$$
where W is the width of the projected fringe patterns and T is the maximum projected fringe number.
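
As a sketch of this phase processing (assuming a phase shift of π/2 between consecutive patterns; the absolute-phase unwrapping with the optimum three-fringe method of Ref. [17] is omitted here, and the function names are ours):

```python
import numpy as np

def wrapped_phase_four_step(I1, I2, I3, I4):
    """Wrapped phase from four fringe images shifted by pi/2 (standard four-step formula)."""
    return np.arctan2(I4 - I2, I1 - I3)

def projector_column(phi_abs, W, T):
    """Eq. (15): map the absolute phase of a camera pixel to the corresponding
    column u_p on the DMD image; W is the pattern width in pixels and T the
    maximum projected fringe number."""
    return W * phi_abs / (2.0 * np.pi * T) + W / 2.0
```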

3. Experiments and results

We established a fringe projection system operating through a planar transparent glass to test the proposed refractive 3D reconstruction method. Figure 3 is a photograph of the established system, which is composed of a projector (PRO4500, Texas Instruments) with a resolution of 912 × 1140 pixels, a camera (ECO655CVGE, SVS-VISTEK, Germany) with a resolution of 2448 × 2050 pixels, and a planar glass with a thickness of 20 mm and a refractive index of 1.5. Note that the relative position of the camera and the glass is arbitrary, as long as the camera can clearly observe a measured object through the glass. Additionally, a calibration plate with 12 × 9 discrete ring markers on its surface was used for system calibration and accuracy evaluation, as shown in Fig. 3. The distance between adjacent markers is 7.5 mm in both the horizontal and vertical directions. For all experiments, a four-step phase-shifting algorithm was used to calculate the wrapped phase and the optimum fringe selection method was used to calculate the absolute phase at each pixel.


Fig. 3. Photograph of experimental system and calibration plate.


Before the test experiments, the system parameters need to be calibrated. With the glass medium absent, we placed the calibration plate at eleven poses in the measuring volume. At each pose, 12 horizontal and 12 vertical sinusoidal fringe patterns were projected by the projector onto the plate surface. From another view, the deformed fringe images and a texture image were captured by the camera. All ring centers in the texture image were extracted in camera pixel coordinates and mapped to projector pixel coordinates using the horizontal and vertical phase maps computed from the captured fringe images. Then, a world coordinate system for the camera and projector was established. By matching the world coordinates one-to-one with their projections onto the 2D pixel coordinates, we determined the intrinsic and extrinsic parameters of the system, Ap, Rp, Ac and Rc, using the methods in Refs. [18,19]. With the glass medium present, we attached a paper printed with a checkerboard pattern to the glass surface. A checkerboard image was captured by the camera and used to extract the corner points. The coordinates of these corner points in the original world coordinate system were calculated with the obtained system parameters and substituted into Eq. (8) to determine the parameters Rn and Tn [20]. Eventually, the new system parameters, Hc and Hp, were obtained for the following 3D reconstruction tests.
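
The rigid transformation (Rn, Tn) of Eq. (8) can, for instance, be estimated from the matched corner points by a standard SVD-based fit. The sketch below is a generic implementation of this idea and not necessarily the exact procedure of Ref. [20]:

```python
import numpy as np

def estimate_rn_tn(P_o, P_r):
    """Estimate Rn and Tn in Eq. (8) from N matched corner points.

    P_o : N x 3 corner coordinates in the original world coordinate system
    P_r : N x 3 coordinates of the same corners in the reference (glass) system
    """
    c_o, c_r = P_o.mean(axis=0), P_r.mean(axis=0)
    H = (P_o - c_o).T @ (P_r - c_r)                              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R_n = Vt.T @ D @ U.T
    T_n = c_r - R_n @ c_o
    return R_n, T_n
```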

After calibrating the established system, we measured the same plate to verify the effectiveness of the proposed refractive 3D reconstruction method. The plate was placed at five poses in the measuring volume, which differ from the calibration poses above because the calibration process minimizes the root-square error over all calibration poses. At each pose, 12 vertical sinusoidal fringe patterns with optimum fringe numbers of 100, 99 and 90 were projected onto the plate surface through the planar glass, and the reflected fringe images were captured synchronously to calculate the absolute phase. The obtained phase was used to identify, for each pixel on the CCD image, the corresponding line in the DMD image. Using the known system parameters and the established correspondences, the biased 3D shape of the plate was reconstructed and ds was calculated at each pixel. Finally, we reconstructed the unbiased 3D shape of the plate at every pose by invoking Eq. (7). For comparison, we used the traditional fringe projection method [19] to measure the plate at the same poses.

We calculated the spatial lengths of line segments $\overline{AB}$ and $\overline{CD}$ on the plate. Line segment $\overline{AB}$ is formed by connecting the ring centers A and B, and its theoretical length is 84.853 mm. Similarly, line segment $\overline{CD}$ is formed by connecting the ring centers C and D, and its theoretical length is also 84.853 mm. Table 1 shows the lengths of the two line segments measured with the proposed method and with the traditional fringe projection method. For line segment $\overline{AB}$, the mean error (ME) and standard deviation (SD) are 0.0321 mm and 0.0101 mm with the proposed method, and 0.4679 mm and 0.0787 mm with the traditional fringe projection method; the corrected results achieve a 15-fold reduction in the ME and an 8-fold reduction in the SD. For line segment $\overline{CD}$, the ME and SD are 0.0105 mm and 0.0074 mm with the proposed method, and 0.3645 mm and 0.0564 mm with the traditional fringe projection method; the corrected results achieve a 35-fold reduction in the ME and an 8-fold reduction in the SD. These results demonstrate the effectiveness of the proposed refractive 3D reconstruction method.


Table 1. Measurement results of two lines on the calibration plate

To verify the accuracy of the proposed method, we measured a sphere with a diameter of 38.1 mm. The sphere was placed at an appropriate position in the measuring volume so that its surface was imaged clearly on the camera imaging plane. Twelve vertical sinusoidal fringe patterns with optimum fringe numbers of 100, 99 and 90 were projected onto the sphere surface through the planar glass. The fringe images modulated by the sphere surface were refracted by the glass surfaces and then imaged on the camera imaging plane synchronously. The phase map computed from the captured fringe images was used to establish the point-to-line relationship between camera pixels and projector pixels. Using the known system parameters and the established correspondences, we reconstructed a biased 3D shape of the sphere surface and calculated ds at each pixel. Finally, the biased 3D shape of the sphere surface was converted into the unbiased 3D shape using the obtained ds. For accuracy comparison, we used the traditional fringe projection method [19] to measure the 3D shape of the sphere.

We used the reconstructed 3D data to fit an ideal sphere. The difference between the reconstructed result and the fitted sphere was calculated for accuracy evaluation. Figures 4(a) and 5(a) show the ideal sphere overlaid with the 3D geometries reconstructed with the traditional fringe projection method and the proposed method, respectively. For better visualization, the corresponding top views are shown in Figs. 4(b) and 5(b), and the corresponding error distribution maps are shown in Figs. 4(c) and 5(c). The ME and SD are 0.005 mm and 0.009 mm with the proposed method, and 0.01 mm and 0.017 mm with the traditional fringe projection method; the corrected results achieve a 2-fold reduction in both the ME and the SD. To show the details of the error distribution, the cross sections in the horizontal direction are depicted in Figs. 4(d) and 5(d). It is observed that the errors caused by light deflection mainly occur at the left edge, because the light rays from pixel points at the left edge of the image are greatly deflected. The proposed method decreases the effect of light refraction and increases the measurement accuracy. These results demonstrate that the proposed refractive 3D reconstruction method based on fringe projection is accurate.
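
For reference, the sphere fit used in this evaluation can be performed with a simple linear least-squares formulation; the snippet below is our own sketch, since the paper does not specify its fitting routine:

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit.

    Solves x^2 + y^2 + z^2 = 2a*x + 2b*y + 2c*z + k for the center (a, b, c)
    and k = r^2 - a^2 - b^2 - c^2, given an N x 3 array of reconstructed points.
    """
    A = np.hstack((2.0 * points, np.ones((points.shape[0], 1))))
    f = np.sum(points**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, f, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```

The point-to-sphere distances of the reconstructed data with respect to the fitted sphere then give error values of the kind shown in Figs. 4(c) and 5(c).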


Fig. 4. Measurement results of the spherical object with the traditional fringe projection method. (a) overlay of reconstructed 3D geometry with the ideal sphere, (b) top view of (a), (c) error distribution map in (a), (d) cross section in horizontal direction of (c).



Fig. 5. Measurement results of the spherical object with the proposed refractive 3D reconstruction method. (a) overlay of reconstructed 3D geometry with the ideal sphere, (b) top view of (a), (c) error distribution map in (a), (d) cross section in horizontal direction of (c).


It is worth noting that the ME and SD change less before and after correction for the sphere than for the plate. This can be explained as follows. The ray that is parallel to the normal of the glass and passes through the optical center of the camera is not refracted, and the image point of this ray is an undistorted point. When an object is imaged near this point, the image distortion caused by light refraction is small. Conversely, when an object is imaged far from this point, the image distortion is large. In our experiments, the sphere was imaged near the undistorted point, so its ME and SD change little before and after correction. In contrast, the feature points on the plate were far from the undistorted point, so its ME and SD change greatly before and after correction, which indirectly confirms the performance of our method.

4. Conclusions

We propose a 3D reconstruction method for a fringe projection system operating through a planar transparent medium. The method uses phase information for point-to-line matching of corresponding points between projector images and camera images, avoiding the use of epipolar geometry constraints to find corresponding points between left and right camera images and thus improving the matching efficiency. The proposed method also establishes the transformation relationship between a pseudo-object point and a real-object point from a geometric perspective. Based on this relationship, a modified fringe projection method is proposed to reconstruct unbiased 3D data without the complicated processes of calculating the ray equations. Two 3D reconstruction tests before and after refraction correction are presented to show the good performance of the proposed method.

Accurate medium parameters are a prerequisite of foremost importance for accurate refractive 3D reconstruction. However, two key problems remain to be addressed: i) how to optimize the geometry of the refractive fringe projection system if the thickness of the medium is not known precisely, and ii) how to increase the measurement accuracy when the medium surface is not strictly flat. These issues will be addressed in future work, with the aim of extending high-accuracy 3D reconstruction to other applications such as 3D measurement of organs immersed in formalin liquid [21], 3D measurement of underwater artifacts [22], and indoor robots [23].

Funding

National Natural Science Foundation of China (11772225, 52075147); China Postdoctoral Science Foundation (2019M660013, 2021T140043).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. Z. Wu, W. Guo, Y. Li, Y. Liu, and Q. Zhang, “High-speed and high-efficiency three-dimensional shape measurement based on Gray-coded light,” Photonics Res. 8(6), 819–829 (2020). [CrossRef]  

2. F. Chadebecq, F. Vasconcelos, R. Lacher, E. Maneas, A. Desjardins, S. Ourselin, T. Vercauteren, and D. Stoyanov, “Refractive Two-View Reconstruction for Underwater 3D Vision,” Int. J. Comput. Vis. 128(5), 1101–1117 (2020). [CrossRef]  

3. F. Menna, P. Agrafiotis, and A. Georgopoulos, “State of the art and applications in archaeological underwater 3D recording and mapping,” J. Cult. Herit. 33, 231–248 (2018). [CrossRef]  

4. Q. Zhang, Q. Wang, Z. Hou, Y. Liu, and X. Su, “Three-dimensional shape measurement for an underwater object based on two-dimensional grating pattern projection,” Opt. Laser Technol. 43(4), 801–805 (2011). [CrossRef]  

5. T. Treibitz, Y. Schechner, C. Kunz, and H. Singh, “Flat refractive geometry,” IEEE Trans. Pattern Anal. Mach. Intell. 34(1), 51–65 (2012). [CrossRef]  

6. C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018). [CrossRef]  

7. J. Xu and S. Zhang, “Status, challenges, and future perspectives of fringe projection profilometry,” Opt. Lasers Eng. 135, 106193 (2020). [CrossRef]  

8. Z. Wu, W. Guo, and Q. Zhang, “Two-frequency phase-shifting method vs. Gray-coded-based method in dynamic fringe projection profilometry: A comparative review,” Opt. Lasers Eng. 153, 106995 (2022). [CrossRef]  

9. M. Haile and P. Ifju, “Application of elastic image registration and refraction correction for non-contact underwater strain measurement,” Strain 48(2), 136–142 (2012). [CrossRef]  

10. D. Samper, J. Santolaria, A. Majarena, and J. Aguilar, “Correction of the refraction phenomenon in photogrammetric measurement systems,” Metrol. Meas. Syst. 20(4), 601–612 (2013). [CrossRef]  

11. Z. Gong, Z. Liu, and G. Zhang, “Flexible method of refraction correction in vision measurement systems with multiple glass ports,” Opt. Express 25(2), 831–847 (2017). [CrossRef]  

12. Z. Su, J. Pan, L. Lu, M. Dai, X. He, and D. Zhang, “Refractive three-dimensional reconstruction for underwater stereo digital image correlation,” Opt. Express 29(8), 12131–12144 (2021). [CrossRef]  

13. Z. Su, J. Pan, S. Zhang, S. Wu, Q. Yu, and D. Zhang, “Characterizing dynamic deformation of marine propeller blades with stroboscopic stereo digital image correlation,” Mech. Syst. Signal Process. 162(10), 108072 (2022). [CrossRef]  

14. H. Sun, H. Du, M. Li, H. Sun, and X. Zhang, “Underwater image matching with efficient refractive-geometry estimation for measurement in glass-flume experiments,” Measurement 152, 107391 (2020). [CrossRef]  

15. H. Sun, H. Du, M. Li, and X. He, “Study on ray-tracing-based 3D reconstruction method for underwater measurement in glass-flume experiments,” Measurement 174, 108971 (2021). [CrossRef]  

16. H. Lin, H. Zhang, Y. Li, H. Wang, J. Li, and S. Wang, “3D point cloud capture method for underwater structures in turbid environment,” Meas. Sci. Technol. 32(2), 025106 (2021). [CrossRef]  

17. Z. Zhang, C. Towers, and D. Towers, “Time efficient color fringe projection system for 3D shape and color using optimum 3-frequency interferometry,” Opt. Express 14(14), 6444–6455 (2006). [CrossRef]  

18. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000). [CrossRef]  

19. S. Zhang and P. Huang, “Novel method for structured light system calibration,” Opt. Eng. 45(8), 083601 (2006). [CrossRef]  

20. C. Chen, N. Gao, and Z. Zhang, “Simple calibration method for dual-camera structured light system,” J. Eur. Opt. Soc-Rapid. 14(1), 23 (2018). [CrossRef]  

21. M. Cassidy, J. Mélou, Y. Quéau, F. Lauze, and J. Durou, “Refractive multi-view stereo,” in Proceedings of the 2020 International Conference on 3D Vision (3DV) (2020).

22. S. Zhuang, X. Zhang, D. Tu, C. Zhang, and L. Xie, “A standard expression of underwater binocular vision for stereo matching,” Meas. Sci. Technol. 31(11), 115012 (2020). [CrossRef]  

23. S. Yoon, T. Choi, and S. Sull, “Depth estimation from stereo cameras through a curved transparent medium,” Pattern Recognit. Lett. 129, 101–107 (2020). [CrossRef]  




