
Three-dimensional reconstruction of polarized ambient light separation in complex illumination

Open Access

Abstract

Three-dimensional reconstruction under complex illumination remains an active research topic. This paper combines a polarization camera with a coding technique to propose a new 3D reconstruction method based on polarized ambient light separation. A dedicated separation model built around the polarization camera analyzes the relationship between the polarization characteristics of polarized light and natural light. Specular reflections are filtered out first, and the remaining light is analyzed using the Stokes vector and the Mueller matrix. A specific calculation procedure evaluates the different polarization azimuths according to their polarization characteristics, and the polarized light and ambient light are then separated. The experimental results show that the polarization-camera approach eliminates the need to rotate a polarizer multiple times, which reduces shooting time and improves efficiency. Moreover, after the ambient light is separated, polarization imaging suppresses ambient light interference, which yields a cleaner and more complete point cloud in the 3D reconstruction. In indoor and outdoor experiments, the standard deviation of the 3D reconstruction achieved with this method was 0.1675 mm.

© 2024 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Among fringe projection active vision 3D reconstruction methods, surface structured light 3D measurement technology can quickly acquire full-field 3D data [1–6]. It offers full-field measurement, high speed, high precision, and dense data, and is widely used to measure high-precision and complex parts. When the camera images an object, highly reflective surfaces interfere with imaging through the reflection and refraction of diffuse light, and the reflected light forms a virtual image on the object surface that carries information such as the object's color and details [7–12]. When the incident light source is intense, a highlight region forms on the object surface and degrades image quality. In high dynamic range (HDR) 3D measurement, ambient light interference has long been an open problem: it reduces the accuracy of existing fringe projection active vision 3D reconstruction techniques and can even cause measurement failure. There is therefore an urgent need for new theories and methods for paint-free 3D imaging of complex, highly glossy profiles, which has become a fundamental measurement problem in the intelligent manufacturing of high-end equipment.

Regarding this problem, Zhao et al.'s direct illumination matching method based on the epipolar constraint relies on the assumption that light propagating in the epipolar plane contributes only to direct illumination and not to complex illumination [13]; it therefore cannot handle a large amount of complex light within the epipolar plane. Xu et al.'s spatial frequency response method overcomes the influence of subsurface scattered light by projecting low-frequency fringes [14], but it suffers from reduced accuracy and cannot be applied to high-precision measurement. This paper therefore proposes a polarized ambient light separation method.

Existing reflection separation algorithms fall into two categories. The first is separation based on image features: in a mixed image, the reflective component usually exhibits characteristics that distinguish it from the non-reflective component, and the separated reflective component is re-projected into the original image space to obtain an image of the reflective component. This approach depends mainly on image feature extraction, representation, and separation, and feature-based separation alone cannot achieve the desired effect. The second category is separation based on physical properties, such as the polarization generated when light waves reflect. Light reflected from an object surface shows a significant polarization effect [15–18], so polarization characteristics can be used to separate the reflected light. Wolff pioneered reflection separation based on polarization properties: Wolff et al. [19] used the Fresnel reflectivity of reflected light to separate the specular and diffuse reflection components. Their experiments showed that, compared with the traditional reflection separation method based on color information, polarization-based separation can handle reflected light not only from dielectric surfaces but also from metal surfaces. Nayar [20] combined color information and polarization to separate the reflection components, but overlap occurs in some highlight regions. Salahieh [21] proposed a multi-polarization fringe projection system that eliminates saturated image points and enhances fringe contrast by selecting the channel formed at an appropriate polarization azimuth; however, not all polarized light reflected from the surface should be filtered out, so the polarizer angle must be rotated, which is more time-consuming than adjusting the exposure time. Feng [22] placed one polarizer in front of the projector and a second polarizer, oriented perpendicular to it, in front of the camera, and selected multiple exposure times automatically from the grey value range to avoid repeatedly adjusting the polarizer; however, this method still cannot eliminate the limitation of multiple exposures.

Building on this prior work [23–25], this paper uses the polarization characteristics of polarized and ambient light to establish a polarization separation imaging model under complex lighting conditions and proposes a polarized ambient light separation method. According to the distribution of the reflected ambient light and polarized light in the vertical and parallel directions, an orthogonal separation is carried out, and the four polarization azimuth angles of 0°, 45°, 90°, and 135° are evaluated separately. A dedicated algorithm separates the usable polarized light intensity from the ambient light received at each angle, so that the captured image suppresses the influence of ambient light. Using a polarization camera reduces the number of polarizer rotations and improves efficiency and robustness, so the complete point cloud of the 3D reconstruction stands out more clearly. The rest of the paper is organized as follows: Section 2 presents the principles; experiments and discussion are given in Section 3; conclusions are drawn in Section 4.

2. Principles

2.1 Composition of diffusely reflected light from object surfaces

The diffusely reflected light received by the camera from the surface of the object mainly consists of polarized light ${I_\textrm{O}}$ and natural light ${I_\textrm{N}}$, as shown in Fig. 1. That is, the total received light intensity ${I_\textrm{D}}$ is:

$${{I_\textrm{D}} = {I_\textrm{O}} + {I_\textrm{N}}}$$

However, light reflected from the surface of an object is partially polarized and can be decomposed into the sum of the intensity component perpendicular to the plane of incidence and the component parallel to it:

$${{I_\textrm{D}} = I_\textrm{D}^ \bot + I_\textrm{D}^\parallel } $$
where $I_\textrm{D}^ \bot $ and $I_\textrm{D}^\parallel$ represent the intensity components perpendicular and parallel to the plane of incidence, respectively.

Fig. 1. Reflected light received by the camera.

The polarized light ${I_\textrm{O}}$ and natural light ${I_\textrm{N}}$ can likewise be orthogonally decomposed as:

$${\left\{ {\begin{array}{{l}} {{I_\textrm{O}} = I_\textrm{O}^ \bot + I_\textrm{O}^\parallel }\\ {{I_\textrm{N}} = I_\textrm{N}^ \bot + I_\textrm{N}^\parallel } \end{array}} \right.} $$

Thus the total diffuse reflectance component ${I_\textrm{D}}$ of the object surface can be expressed as follows:

$${\left\{ {\begin{array}{{l}} {I_\textrm{D}^ \bot = I_\textrm{O}^ \bot + I_\textrm{N}^ \bot }\\ {I_\textrm{D}^\parallel{=} I_\textrm{O}^\parallel{+} I_\textrm{N}^\parallel } \end{array}} \right.} $$

Figure 2 below shows the light intensity distribution of diffusely reflected light on the surface of the object. As can be seen from Eq. (4) and the simulation result in Fig. 2, the total diffusely reflected light and the diffusely polarized light have the same polarization direction.

Fig. 2. Distribution of light intensity received by the camera.

2.2 Stokes description of polarized light

In the Stokes formalism, the superposition of incoherent light is written as ${[{{S_0},{S_1},{S_2},{S_3}} ]^T}$, where the Stokes parameters relate to the perpendicular s-component amplitude ${E_s}$, the parallel p-component amplitude ${E_p}$, and the phase difference δ of the light vector as follows:

$$\left\{ \begin{array}{l} S_0 = E_p^2 + E_s^2 \\ S_1 = E_p^2 - E_s^2 \\ S_2 = 2E_pE_s\cos\delta \\ S_3 = 2E_pE_s\sin\delta \end{array} \right.$$
where ${S_0}$ represents the total irradiance (light intensity) of the beam; ${S_1}$ represents the intensity of the linearly polarized component of the beam in the 0-degree direction; ${S_2}$ represents the intensity of the linearly polarized component in the 45-degree direction; and ${S_3}$ represents the intensity of the circularly polarized component of the beam.

According to Eq. (5), the total degree of polarization ${P_w}$ can be expressed as:

$${P_w} = \frac{{\sqrt {S_1^2 + S_2^2 + S_3^2} }}{{{S_0}}}$$
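For reference, the following NumPy sketch shows how Eqs. (5) and (6) are evaluated in practice from the four analyzer channels of a division-of-focal-plane polarization camera; the array names i0, i45, i90, and i135 are assumed placeholders for the four angle images, and the circular component $S_3$, which purely linear analyzers cannot measure, is set to zero.

```python
import numpy as np

def stokes_from_four_angles(i0, i45, i90, i135):
    """Linear Stokes parameters from four analyzer orientations.

    With a linear analyzer at angle theta, I(theta) = (S0 + S1*cos(2*theta)
    + S2*sin(2*theta)) / 2, so the 0/45/90/135 deg channels give S0, S1, S2
    directly; S3 (circular) would require a retarder and is set to zero.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # 0 deg vs 90 deg linear component
    s2 = i45 - i135                      # +45 deg vs -45 deg linear component
    s3 = np.zeros_like(s0)
    return s0, s1, s2, s3

def total_degree_of_polarization(s0, s1, s2, s3, eps=1e-9):
    """Eq. (6): P_w = sqrt(S1^2 + S2^2 + S3^2) / S0 (the DOLP when S3 = 0)."""
    return np.sqrt(s1**2 + s2**2 + s3**2) / (s0 + eps)
```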

According to Fresnel reflection theory, the Mueller matrix ${M_0}$ of a dielectric surface can be expressed as:

$$M_0 = \frac{1}{2}\left( \frac{\tan\theta_-}{\tan\theta_+} \right)^2 \left[ \begin{array}{cccc} \cos^2\theta_- + \cos^2\theta_+ & \cos^2\theta_- - \cos^2\theta_+ & 0 & 0 \\ \cos^2\theta_- - \cos^2\theta_+ & \cos^2\theta_- + \cos^2\theta_+ & 0 & 0 \\ 0 & 0 & -2\cos\theta_-\cos\theta_+ & 0 \\ 0 & 0 & 0 & -2\cos\theta_-\cos\theta_+ \end{array} \right]$$

In Eq. (7), ${\theta _ \pm } = {\theta _{incident}} \pm {\theta _{refractive}}$, where ${\theta _{incident}}$ is the angle of incidence of the light ray and ${\theta _{refractive}}$ is its angle of refraction.

So when the incident light $\vec{A}$ is horizontally or vertically linearly polarized:

$$\begin{aligned} \vec{B} &= M_0\vec{A} \\ &= \frac{1}{2}\left( \frac{\tan\theta_-}{\tan\theta_+} \right)^2 \left[ \begin{array}{cccc} \cos^2\theta_- + \cos^2\theta_+ & \cos^2\theta_- - \cos^2\theta_+ & 0 & 0 \\ \cos^2\theta_- - \cos^2\theta_+ & \cos^2\theta_- + \cos^2\theta_+ & 0 & 0 \\ 0 & 0 & -2\cos\theta_+\cos\theta_- & 0 \\ 0 & 0 & 0 & -2\cos\theta_+\cos\theta_- \end{array} \right] \left[ \begin{array}{c} 1 \\ \pm 1 \\ 0 \\ 0 \end{array} \right] \\ &= \frac{1}{2}\left( \frac{\tan\theta_-}{\tan\theta_+} \right)^2 \left[ \begin{array}{c} 2\cos^2\theta_- \\ 2\cos^2\theta_- \\ 0 \\ 0 \end{array} \right] = \left( \frac{\tan\theta_-}{\tan\theta_+} \right)^2 \cos^2\theta_- \left[ \begin{array}{c} 1 \\ \pm 1 \\ 0 \\ 0 \end{array} \right] \end{aligned}$$
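A short numerical sketch of Eqs. (7) and (8), under the stated definitions of ${\theta_\pm}$; the 50°/30° incidence and refraction angles below are illustrative values only. Applying $M_0$ to a horizontally polarized Stokes vector reproduces the scaling in Eq. (8).

```python
import numpy as np

def mueller_dielectric(theta_i, theta_r):
    """Mueller matrix M0 of a dielectric surface, Eq. (7).

    theta_i, theta_r: incidence and refraction angles in radians, with
    theta_minus = theta_i - theta_r and theta_plus = theta_i + theta_r.
    """
    tm, tp = theta_i - theta_r, theta_i + theta_r
    a = np.cos(tm) ** 2 + np.cos(tp) ** 2
    b = np.cos(tm) ** 2 - np.cos(tp) ** 2
    c = -2.0 * np.cos(tm) * np.cos(tp)
    scale = 0.5 * (np.tan(tm) / np.tan(tp)) ** 2
    return scale * np.array([[a, b, 0, 0],
                             [b, a, 0, 0],
                             [0, 0, c, 0],
                             [0, 0, 0, c]])

# Horizontally polarized incident light A = [1, 1, 0, 0]^T: the reflected
# vector B stays fully linearly polarized, scaled as in Eq. (8).
A = np.array([1.0, 1.0, 0.0, 0.0])
B = mueller_dielectric(np.deg2rad(50.0), np.deg2rad(30.0)) @ A
```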

2.3 Establishment of polarization separation imaging model under complex illumination

The polarization imaging model under complex illumination conditions is:

$${\left\{ {\begin{array}{{c}} {{I_\textrm{O}} = \frac{1}{{{P_O} - {P_N}}}[{{I_{\textrm{min}}}({1 + {P_O}} )- {I_{\textrm{max}}}({1 - {P_N}} )} ]}\\ {{I_\textrm{N}} = \frac{1}{{{P_O} - {P_N}}}[{{I_{\textrm{max}}}({1 - {P_O}} )- {I_{\textrm{min}}}({1 + {P_N}} )} ]} \end{array}} \right.} $$
where ${I_{\textrm{max}}}$ is the maximum light intensity map obtained during polarization detection; ${I_{\textrm{min}}}$ is the minimum light intensity map obtained during polarization detection; and ${P_N}$ and ${P_O}$ are the degrees of polarization of the ambient natural light and the target information light, respectively. Eq. (9) shows that the key to suppressing ambient light in complex lighting environments is to obtain the degrees of polarization ${P_N}$ and ${P_O}$.

A polarizer is placed in front of the projector, and the polarization camera captures images in the direction perpendicular to the transmission axis of that polarizer. The light projected onto the LRR object's surface is thus polarized, so most of the light reflected from mirror-like regions of the surface is filtered out and the camera captures only the diffuse component. The degree of polarization of the background ambient light in the active polarization imaging model can therefore be computed from a non-target area using Eq. (10):

$${{P_N} = \frac{{I_\textrm{N}^\parallel{-} I_\textrm{N}^ \bot }}{{I_\textrm{N}^\parallel{+} I_\textrm{N}^ \bot }}} $$
where $I_\textrm{N}^\parallel $ and $I_\textrm{N}^ \bot $ are the intensities of the horizontal and vertical components of the natural light in the acquired polarization image.

Since natural light has the same value of light intensity in all directions:

$${I_\textrm{N}^\parallel{=} I_\textrm{N}^ \bot = \frac{1}{2}{I_\textrm{N}}} $$

Combining Eq. (10) and Eq. (11) gives:

$${{P_N} = 0} $$

The degree of linear polarization (DOLP) is defined as:

$${DOLP = \frac{{{{I^{\prime}}_{max}} - {{I^{\prime}}_{min}}}}{{{{I^{\prime}}_{max}} + {{I^{\prime}}_{min}}}}} $$

Here ${I^{\prime}_{max}}$ and ${I^{\prime}_{min}}$ are the maximum and minimum light intensities passing through the polarizer at orthogonal azimuth angles, respectively. DOLP takes values in the range [0, 1].

Furthermore, for an ideal camera imaging system, ignoring imaging noise, the grey value of a pixel can be expressed as:

$${{I_g}({x,y;t} )= \alpha \rho t \cdot {L_p}} $$
where $\alpha$ is the camera sensitivity, $\rho$ is the surface reflectance, t is the exposure time, ${L_p}$ is the projected light intensity, and ${I_g}(x,y;t)$ is the grey value of the corresponding camera pixel in the image coordinate system. Note that the projected light ${L_p}$ is polarized. Combining Eq. (13) and Eq. (14), we obtain:
$$\begin{aligned} {P_O}({x,y} )&= \frac{{{{I^{\prime}}_{max}}({x,y} )- {{I^{\prime}}_{min}}({x,y} )}}{{{{I^{\prime}}_{max}}({x,y} )+ {{I^{\prime}}_{min}}({x,y} )}}\\ &= \frac{{\alpha \rho t \cdot ({{L_{max}}({x,y} )- {L_{min}}({x,y} )} )}}{{\alpha \rho t \cdot ({{L_{max}}({x,y} )+ {L_{min}}({x,y} )} )}} = \frac{{{L_{max}}({x,y} )- {L_{min}}({x,y} )}}{{{L_{max}}({x,y} )+ {L_{min}}({x,y} )}} \end{aligned}$$

${L_{max}}$ denotes the maximum intensity of the light projected at the horizontal azimuth angle, and ${L_{min}}$ the minimum. Equation (15) indicates that the DOLP image is independent of the camera sensitivity, the surface reflectivity, and the exposure time.

According to Eq. (6), Eq. (8), and Eq. (13) above, the total degree of polarization ${P_w}$ can be obtained directly from the polarization camera, so:

$$\left\{ \begin{array}{l} I_{max} + I_{min} = \cos^2\theta \\ I_{max} - I_{min} = \cos^2\theta \, P_w \end{array} \right.$$
$$\left\{ \begin{array}{l} I_{max} = \frac{1}{2}\cos^2\theta \,(1 + P_w) \\ I_{min} = \frac{1}{2}\cos^2\theta \,(1 - P_w) \end{array} \right.$$

By substituting Eqs. (12), (15), and (17) into Eq. (9), the polarization imaging model in a complex lighting environment can be established; it effectively separates the target information light from the background ambient light, suppresses ambient light interference, and makes the obtained pixel information more complete.
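Put together, the separation reduces to a per-pixel inversion of Eq. (9), with ${P_N} = 0$ from Eq. (12) and ${P_O}$ from Eq. (15). The sketch below transcribes Eq. (9) directly; the map names are assumed placeholders for the maximum/minimum intensity images and the DOLP map obtained above.

```python
import numpy as np

def separate_polarized_ambient(i_max, i_min, p_o, p_n=0.0, eps=1e-9):
    """Pixel-wise inversion of the imaging model, Eq. (9).

    i_max, i_min : max/min intensity maps over the analyzer angles
    p_o          : DOLP of the projected (target) light, Eq. (15)
    p_n          : DOLP of the ambient light, 0 by Eq. (12)
    """
    d = p_o - p_n + eps
    i_o = (i_min * (1 + p_o) - i_max * (1 - p_n)) / d  # target (polarized) light
    i_n = (i_max * (1 - p_o) - i_min * (1 + p_n)) / d  # ambient light
    return i_o, i_n
```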

3. Experiments

3.1 Experimental equipment

Experiments demonstrate the effectiveness of the above method. As shown in Fig. 3, the experimental system mainly comprises a liquid crystal projector (PT-X303-C-XGA), a polarizer, and a polarization camera (BFS-U3-51S5P-C). The measured object is a dielectric metal fragment with a relatively smooth surface, which ensures strong stability of the polarized light. The distance from the projector-camera pair to the object is kept below 2000 mm to ensure accurate results, and the optical center distance between the projector and the camera is kept relatively small. A polarizer is added in front of the projector to eliminate specular reflection. Since this manuscript investigates 3D reconstruction based on the separation of polarized and ambient light, the experiment was performed in the presence of ambient light. The focal length of the camera lens was 16 mm and the aperture was f/8. A polarization camera exploits the polarization of light waves to capture additional information about objects in the scene and can detect and distinguish light polarized along different directions, here at four polarization angles (0°, 45°, 90°, and 135°). The camera thus captures images in four directions simultaneously, reducing the shooting time.
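As a hedged illustration of how the four simultaneous channels are extracted, the sketch below splits the raw frame of a division-of-focal-plane camera into its four angle images. The 2 × 2 micro-polarizer layout assumed here (90°/45° over 135°/0°) is an assumption and must be checked against the camera's documentation.

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a division-of-focal-plane raw frame into four angle channels.

    Assumes a 2x2 micro-polarizer super-pixel with layout
        [[90, 45],
         [135, 0]]  (degrees),
    an assumption to verify against the sensor datasheet. Each returned
    channel has a quarter of the raw resolution.
    """
    raw = raw.astype(np.float64)
    i90, i45 = raw[0::2, 0::2], raw[0::2, 1::2]
    i135, i0 = raw[1::2, 0::2], raw[1::2, 1::2]
    return i0, i45, i90, i135
```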

Fig. 3. Experimental equipment.

The measurement process is as follows:

  • (1) Install the equipment as shown in Fig. 3, with the polarization camera, linear polarizer, and projector placed close to horizontal, and calibrate the camera and projector with the monocular inverse camera calibration method to obtain good calibration parameters.
  • (2) Project a pair of polarized fringe patterns onto the object with the LCD projector, adjust the transmission axis of the linear polarizer to coincide with the horizontal direction, and ensure that the camera captures as much of the object under test as possible.
  • (3) Photograph the polarized fringe patterns modulated by the object with the camera. Perform segmentation and polarized/ambient light separation on the captured fringe patterns of the four polarization states.
  • (4) From the images obtained after the separation in step (3), decode the fringe patterns and perform 3D reconstruction using complementary Gray codes combined with a four-step phase-shift method (a sketch of the phase-shift step is given below).
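The complementary Gray-code decoding in step (4) is beyond a short sketch, but the four-step phase-shift part reduces to one arctangent per pixel. A minimal sketch, assuming four fringe images i1 through i4 captured with phase shifts of 0, π/2, π, and 3π/2:

```python
import numpy as np

def wrapped_phase_four_step(i1, i2, i3, i4):
    """Four-step phase shift: wrapped phase in (-pi, pi] per pixel.

    For I_k = A + B*cos(phi + (k-1)*pi/2): I4 - I2 = 2B*sin(phi) and
    I1 - I3 = 2B*cos(phi). The complementary Gray-code patterns then
    supply the fringe order used to unwrap this phase to absolute phase.
    """
    return np.arctan2(i4 - i2, i1 - i3)
```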

3.2 Analysis of experimental results

The parameters of the camera and projector were first calibrated synchronously using the monocular inverse camera calibration method. The calibration target is an 11 × 9 array of white circles on a black background. Five large circles serve as marker circles to determine orientation, and the remaining small circles are diagnostic circles. The calibration plate is placed in front of the measurement system so that it covers more than half of the camera's field of view, and the optical center distance between the projector and the camera is kept relatively small. The reprojection error is the main indicator of calibration quality. The calibration results are shown in Fig. 4: the average reprojection error is 0.03 pixels for the camera and 0.08 pixels for the projector, so the calibration is accurate and acceptable. The internal and external parameters of the camera are listed in Table 1 below.

Fig. 4. Calibration results. (a) Reprojection error for camera calibration; (b) calibration plate pose for camera viewpoint; (c) reprojection error for the projector; (d) calibration plate position for projector viewing angle.

Table 1. Calibration result of the camera and the projector

Interference from ambient light is a severe problem in 3D measurement. In most measurement environments, the interference source is sunlight or artificial illumination, which is unpolarized. The light field formed by mixing this unpolarized interference with the projected coded structured light produces serious noise. The separation method proposed in this paper exploits the polarization properties of light to distinguish the reflected structured light from the ambient light, changing the received light intensity and yielding more accurate and robust measurement results.

As an example, structured light was projected onto the surface of a metal object placed approximately 1500 mm from the projector-camera pair. The LCD projector and camera were parallel to the metal object. The coded fringe pattern was projected by the projector and photographed with the polarization camera.

The obtained images are shown in Fig. 5. After the separation operation on the original image, the polarized image clearly suppresses the oversaturated highlight regions better than the original, while the separated unpolarized (ambient) light image shows some blurring. Subsequent 3D reconstruction using the separated polarized images is therefore more accurate and yields more point cloud data.

Fig. 5. Images taken: (a) original image; (b) polarized image after separation; (c) ambient light image.

The standard deviation of an image indicates the degree of variation in its pixel intensity: a high standard deviation means the intensity distribution is strongly spread out, making the image appear sharper and more layered. To further analyze the advantage of polarized ambient light separation, we therefore analyzed the image contrast, expressed as:

$${C = \mathop \sum \limits_\delta \delta {{({i,j} )}^2}{P_\delta }({i,j} )} $$
where $\delta ({i,j} )= |{i - j} |$ denotes the absolute difference between the grey values of adjacent pixels, and ${P_\delta }({i,j} )$ is the probability that a pair of adjacent pixels has grey-level difference δ. ${P_\delta }({i,j} )$ is computed as follows: first, the total number N of adjacent pixel pairs in the image is counted, which is determined by the image resolution; the sum of the squared grey-level differences $\delta {({i,j} )^2}$ over all adjacent pixel pairs is then divided by N. Boundary pixels are handled by pixel replication: the duplicated grey value equals that of the adjacent interior pixel, ensuring that edge pixels are also considered in the probability calculation without artificially increasing the total count. This guarantees that the grey differences of adjacent pixels over the whole image are counted effectively. As can be seen from Table 2, the image contrast after polarized ambient light separation is twice that before separation, so the fringe contrast is significantly enhanced and more fringe information is obtained.
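A sketch of the contrast computation of Eq. (18) as described above: since ${P_\delta}$ is the empirical probability of each grey-level difference, the weighted sum collapses to the mean of δ² over all adjacent pixel pairs, with border pixels handled by replication.

```python
import numpy as np

def image_contrast(img):
    """Contrast C of Eq. (18): mean squared grey difference of neighbors."""
    img = img.astype(np.float64)
    padded = np.pad(img, 1, mode="edge")            # replicate border pixels
    dh = padded[1:-1, 1:-1] - padded[1:-1, 2:]      # right-neighbor difference
    dv = padded[1:-1, 1:-1] - padded[2:, 1:-1]      # lower-neighbor difference
    diffs = np.concatenate([dh.ravel(), dv.ravel()])
    # sum over delta of delta^2 * P_delta == mean of delta^2 over all pairs
    return np.mean(diffs ** 2)
```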

Table 2. Calculated contrast between polarized ambient light separation and conventional polarized structured light

We experimentally validate this method by taking the 3D reconstruction obtained under dark conditions as the ground truth, performing 3D reconstruction of the labeled volume blocks, and then verifying the result by error analysis.

As can be seen from Fig. 6, the verification experiment shows that the error under the planar point cloud metric is within 0.02 mm. The 3D reconstruction results in light and in darkness are thus similar, indicating that the principle of this method is feasible and offers high accuracy and robustness. The experimental results provide a practical reference for follow-up research and demonstrate the method's application prospects in both theory and practice.

Fig. 6. Test plots in light and dark conditions. (a) Images taken in the light condition; (b) images taken in the dark condition; (c) planar 3D map in a light environment; (d) planar 3D map in a dark environment; (e) errors in both maps.

To further verify the generality of the method, many additional experiments were carried out. As shown in Fig. 7 and Fig. 8, we scanned a planar metal object at an exposure time of t = 20000 µs and a metal cylinder at t = 35000 µs.

Fig. 7. 3D reconstruction of planar metal objects before and after separation.

Fig. 8. 3D reconstruction of the metal cylinder before and after separation.

Based on the reconstruction results in Fig. 7 and Fig. 8, we conclude that the method is suitable not only for regular, simple objects but also performs well in the 3D measurement of irregular, complex, high dynamic range metal objects. Notably, the method offers robustness and completeness when measuring highly reflective surfaces, which is especially valuable for industrial high-speed measurement. The algorithmic model can be further optimized in future work to achieve more accurate and reliable measurement results.

In addition, the model was also tested on polarization images taken under more complex lighting conditions (such as cloudy days).

The two experiments in Fig. 9 and Fig. 10 depict the application of the proposed 3D reconstruction model in an open outdoor environment. The model showed considerable stability despite difficult conditions such as changing sunlight, different angles of incident light, and complex backgrounds. However, owing to the complexity of the outdoor environment, especially surfaces with strong specular reflections, the model still struggles to capture all the detailed texture information. This is especially true for strongly reflective parts of the object's surface, such as smooth metal, highlight areas caused by direct sunlight, or highlights reflected from wet surfaces; these situations can lead to missing texture details in the 3D reconstruction results. Despite these challenges, the experimental results also show the potential of this 3D reconstruction method for filtering out disturbances caused by ambient light.

Fig. 9. 3D reconstruction of planar metal objects in indoor and outdoor environments.

Fig. 10. 3D reconstruction of metal cylinders in indoor and outdoor environments.

In 3D reconstruction, accuracy is one of the essential indicators of reconstruction quality. As shown in Fig. 11, plane fitting error analysis yields a quantitative index: the standard deviation of the 3D reconstruction with this method is 0.1675 mm. This value provides an intuitive measure of the reconstruction accuracy and reflects the consistency between the reconstruction results and the actual physical objects. It shows that the method resists ambient light interference to a certain extent and provides high reconstruction accuracy. The effective suppression of ambient light interference means the method is robust to changing lighting conditions, which is very important in practical applications, especially in scenes where external lighting is not controlled.
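For context, a standard-deviation figure of this kind typically comes from a least-squares plane fit to the reconstructed point cloud of a nominally flat region; the following is a minimal sketch of such an analysis, not necessarily the exact procedure used here.

```python
import numpy as np

def plane_fit_std(points):
    """Std of point-to-plane residuals after a least-squares plane fit.

    points: (N, 3) array of reconstructed x, y, z coordinates of a
    nominally flat region of the measured object.
    """
    centered = points - points.mean(axis=0)
    # The plane normal is the right singular vector associated with the
    # smallest singular value of the centered point matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    residuals = centered @ normal        # signed point-to-plane distances
    return residuals.std()
```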

Fig. 11. Plot of plane fitting error analysis.

Multiple sets of experiments, built on the successful establishment of the polarized ambient light separation model, validate that the technique achieves accurate 3D reconstruction under challenging lighting conditions. In industrial testing and inspection, it is crucial to acquire the 3D surface information of the object under test accurately and quickly, especially in locations strongly affected by ambient light. Traditional 3D reconstruction methods are often disturbed by ambient light, leading to inaccurate data acquisition, or they must operate under specific light control conditions, which increases the complexity and cost of inspection. Applying this polarization model reduces ambient light interference by effectively separating polarized and unpolarized light, making the entire inspection process more reliable and robust. High-accuracy measurement can be maintained even under natural light or in other complex lighting environments, which improves industrial automation and boosts productivity. The approach can also be applied in areas where accuracy is critical, such as precision manufacturing, artifact restoration, and medical imaging, where detection equipment must provide accurate and precise images or data under uncontrolled light conditions for subsequent analysis or processing.

4. Summary

Achieving 3D reconstruction in complex environments remains of great interest in current research. This study proposes a new separation model based on the polarization properties of polarized and natural light. The model first excludes the influence of specular reflection, then performs a detailed analysis using the Stokes vector and the Mueller matrix, accurately calculates the different polarization azimuths from the unique properties of polarization, and finally separates the polarized light from the ambient light. The experimental results show that the method shortens image acquisition time and improves work efficiency, and that 3D reconstruction based on the polarized image after ambient light separation is more accurate: the verification experiment gives an error within 0.02 mm under the planar point cloud metric, and the standard deviation of the 3D reconstruction is 0.1675 mm, with more point cloud data produced. In conclusion, the polarization separation model, validated by multiple sets of experiments, brings new possibilities to the field of 3D reconstruction, making its application in various complex lighting environments a reality.

Funding

Jiangxi Province 03 Special Project (20212ABC03A20, 20232ABC03A03, 20232ABC03A08); Key Research and Development Program of Jiangxi Province (20232BBE50012); National Natural Science Foundation of China (52065024).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. J. Salvi, S. Fernandez, T. Pribanic, et al., “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010). [CrossRef]

2. J. Tan, Z. He, W. Su, et al., “Robust fringe projection measurement based on reference phase reconstruction,” Opt. Lasers Eng. 147, 106746 (2021). [CrossRef]

3. X. Tian, R. Liu, Z. Wang, et al., “High quality 3D reconstruction based on fusion of polarization imaging and binocular stereo vision,” Inf. Fusion 77, 19–28 (2022). [CrossRef]

4. J. Zhang, Y. Zhang, B. Chen, et al., “Full-field phase error analysis and compensation for nonsinusoidal waveforms in phase shifting profilometry with projector defocusing,” Opt. Commun. 430, 467–478 (2019). [CrossRef]

5. W. He, K. Zhong, Z. Li, et al., “Accurate calibration method for blade 3D shape metrology system integrated by fringe projection profilometry and conoscopic holography,” Opt. Lasers Eng. 110, 253–261 (2018). [CrossRef]

6. H. Jiang, Y. Li, H. Zhao, et al., “Parallel single-pixel imaging: a general method for direct–global separation and 3D shape reconstruction under strong global illumination,” Int. J. Comput. Vis. 129(4), 1060–1086 (2021). [CrossRef]

7. S. Feng, C. Zuo, T. Tao, et al., “Robust dynamic 3-D measurements with motion-compensated phase-shifting profilometry,” Opt. Lasers Eng. 103, 127–138 (2018). [CrossRef]

8. S. Zhang, “High-speed 3D shape measurement with structured light methods: a review,” Opt. Lasers Eng. 106, 119–131 (2018). [CrossRef]

9. S. Zhang, “Recent progresses on real-time 3D shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48(2), 149–158 (2010). [CrossRef]

10. S. Van der Jeught and J. J. J. Dirckx, “Real-time structured light profilometry: a review,” Opt. Lasers Eng. 87, 18–31 (2016). [CrossRef]

11. K. Wang, Y. Li, Q. Kemao, et al., “One-step robust deep learning phase unwrapping,” Opt. Express 27(10), 15100–15115 (2019). [CrossRef]

12. G. E. Spoorthi, S. Gorthi, and R. K. S. S. Gorthi, “PhaseNet: a deep convolutional neural network for two-dimensional phase unwrapping,” IEEE Signal Process. Lett. 26(1), 54–58 (2019). [CrossRef]

13. H. Zhao, Y. Xu, H. Jiang, et al., “3D shape measurement in the presence of strong interreflections by epipolar imaging and regional fringe projection,” Opt. Express 26(6), 7117–7131 (2018). [CrossRef]

14. Y. Xu, H. Zhao, H. Jiang, et al., “High-accuracy 3D shape measurement of translucent objects by fringe projection profilometry,” Opt. Express 27(13), 18421–18434 (2019). [CrossRef]

15. Z. Zhu, Y. Xie, and Y. Cen, “Polarized-state-based coding strategy and phase image estimation method for robust 3D measurement,” Opt. Express 28(3), 4307–4319 (2020). [CrossRef]

16. Z. Zhu, D. You, F. Zhou, et al., “Rapid 3D reconstruction method based on the polarization-enhanced fringe pattern of an HDR object,” Opt. Express 29(2), 2162–2171 (2021). [CrossRef]

17. Z. Zhu, Y. Dong, D. You, et al., “Accurate three-dimensional measurement based on polarization-defocused encoded structured light,” Measurement 205, 112128 (2022). [CrossRef]

18. S. Wang, “Suppression method for strong interference from stray light in 3D imaging system of structured light,” Opt. Eng. 59(10), 105106 (2020). [CrossRef]

19. L. B. Wolff, “Using polarization to separate reflection components,” in Proceedings of the 1989 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE, 1989). [CrossRef]

20. S. K. Nayar, X. Fang, and T. Boult, “Separation of reflection components using color and polarization,” Int. J. Comput. Vis. 21(3), 163–186 (1997). [CrossRef]

21. B. Salahieh, Z. Chen, J. J. Rodriguez, et al., “Multi-polarization fringe projection imaging for high dynamic range objects,” Opt. Express 22(8), 10064–10071 (2014). [CrossRef]

22. S. Feng, L. Zhang, C. Zuo, et al., “High dynamic range 3D measurements with fringe projection profilometry: a review,” Meas. Sci. Technol. 29(12), 122001 (2018). [CrossRef]

23. L. Rao and F. Da, “Local blur analysis and phase error correction method for fringe projection profilometry systems,” Appl. Opt. 57(15), 4267–4276 (2018). [CrossRef]

24. H. Jiang, H. Zhai, Y. Xu, et al., “3D shape measurement of translucent objects based on Fourier single-pixel imaging in projector-camera system,” Opt. Express 27(23), 33564–33574 (2019). [CrossRef]

25. Z. Wu, W. Guo, Y. Li, et al., “High-speed and high-efficiency three-dimensional shape measurement based on Gray-coded light,” Photonics Res. 8(6), 819–829 (2020). [CrossRef]
