
Multi-dimensional information sensing of complex surfaces based on fringe projection profilometry


Abstract

Multi-dimensional and high-resolution information sensing of complex surface profiles is critical for investigating various structures and analyzing their mechanical properties. This information is currently accessed separately through different technologies and devices. Fringe projection profilometry (FPP) has been widely applied to shape measurement of complex surfaces. However, since the structured-light information is projected onto, rather than attached to, the surface, FPP cannot accurately track corresponding points and therefore cannot further analyze deformation and strain. To address this issue, we propose a multi-dimensional information sensing method based on digital image correlation (DIC)-assisted FPP. Firstly, colorful fluorescent markers are introduced to produce modulated information with both high reflectivity and color difference. Then, a general information separation method is presented to simultaneously acquire speckle-free texture, fringe patterns, and high-contrast speckle patterns for multi-dimensional information sensing. To the best of our knowledge, the proposed method, for the first time, simultaneously realizes accurate and high-resolution sensing of 2D texture (T), 4D shape (x, y, z, t), and analytical-dimension mechanical parameters (deformation (d), strain (s)) based on an FPP system. Experimental results demonstrate that the proposed method can measure and analyze the 3D geometry and mechanical state of complex surfaces, expanding the measuring dimensions of the off-the-shelf FPP system without any extra hardware cost.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Multi-dimensional information sensing plays an important role in current technological developments [1]. With increasing multi-disciplinary demands for time-varying two-dimensional (2D) textures, three-dimensional (3D) shapes, and corresponding mechanical properties, sensing technologies need to possess multi-dimensional measuring capacity for a wide range of applications.

Among the many optical sensing and measuring methods, fringe projection profilometry (FPP) has been widely used in mechanical engineering, augmented reality, and virtual reality [2–4]. Accurate 2D texture and 3D shape are acquired by projecting a series of structured-light patterns, which are recorded by a camera from another angle. In traditional FPP, a series of patterns is needed to reconstruct each 3D frame, which limits its application in dynamic measurements. In addition, the fringes are projected onto, rather than attached to, the tested surface, so FPP cannot track the deformation of corresponding points or perform accurate deformation and strain analysis. However, there are increasing demands to further measure and analyze transient shape and mechanical properties in fields such as motion reconstruction [5,6] and mechanical analysis [7,8], which require higher measurement dimensions than the traditional FPP system provides.

On the side of dynamic property analysis, accurately characterizing the position, velocity, acceleration, and other relevant information of an object or scene in motion is of paramount importance for understanding its underlying physical properties and driving forward its applications [9–12]. To achieve this, researchers have conducted a series of studies focusing on improving the projection speed of hardware [13–15] and the coding efficiency of algorithms [16,17]. Zhang et al. combined the binary defocusing technique and two-frequency binary phase shifting to realize 3D reconstruction of a rabbit heartbeat at 667 Hz [18]. Heist et al. developed a high-speed shape measurement method based on GOBO projection, achieving fast reconstruction of an airbag popping process at a point-cloud reconstruction rate of 1333 frames per second [19]. Zuo et al. proposed the µFTP technique to complete absolute 3D measurement based on Fourier transform profilometry (FTP) reconstruction of a transient scene at a 3D reconstruction speed of 10,000 fps [20]. Wu et al. proposed a robust and efficient measuring method based on Gray-code light projection, which decreased the number of projection patterns in each sequence to 3 frames and achieved dynamic measurement of noisy transient scenes at a maximum reconstruction rate of 3174 fps [21,22]. These methods have significantly improved the measuring efficiency of FPP systems and extended their measuring dimensions from 3D (x, y, z) to 4D (x, y, z, t).

On the side of mechanical analysis, further information on deformation and strain is needed to analyze the mechanical parameters of structures and materials. Digital image correlation (DIC) technology is commonly employed to measure the deformation and strain of objects [23–26]. Therefore, scholars have integrated DIC techniques into FPP systems to expand their analytical dimension. Initially, researchers directly combined FTP and DIC [27,28], using DIC to measure in-plane displacement and FTP to measure 3D shape and out-of-plane deformation. However, due to the filtering operation inherent in FTP, the reconstructed surface details cannot be accurately preserved. To address this limitation, researchers employed phase measuring profilometry (PMP) to precisely measure the 3D shape of objects and extracted texture maps from the phase-shifting fringe patterns to analyze in-plane displacement, which improved the accuracy of the measurement results [29–31].

However, these two techniques have contradictory requirements on the surface texture. DIC expects an obvious reflectivity difference, fabricating speckle patterns on the tested surface to ensure accurate image matching and deformation calculation, whereas FPP requires the reflectivity of the measured surface to be uniform for high-accuracy shape measurement. To tackle this contradiction, Siegmann et al. employed red speckles along with projected blue and white fringes and demodulated the fringes and speckles in different color channels [32,33]. However, color crosstalk caused incomplete information separation. Wu et al. proposed a method for filtering speckles to minimize their effect on shape reconstruction [34], and then applied intensity-chromatic analysis to further mitigate this issue [35]. Although they successfully measured complex structures, the selection of speckle colors was constrained, and fixed linear conversion coefficients could not entirely separate speckle-free fringe patterns and texture maps. It is worth noting that a clean, speckle-free texture map greatly improves the visualization of rendering results. Likewise, this is a common problem in DIC, where speckles with strong grayscale contrast destroy the natural texture of tested surfaces and thus prevent multi-dimensional information rendering with the original texture.

The above-mentioned approaches can extend the FPP system to the analytical dimension and realize deformation analysis, but they fail to resolve the contradiction between the 2D texture image and deformation analysis. The challenging task is therefore to simultaneously obtain speckle-free texture and fringe patterns for accurate texture and shape reconstruction (2D and 3D visualization) and high-contrast speckle patterns for high-accuracy deformation and strain measurement (mechanical analysis).

Therefore, in this work, a multi-dimensional information sensing method based on DIC-assisted FPP is proposed to simultaneously realize sensing of 2D texture (T), 4D shape (x, y, z, t), and analytical-dimension mechanical parameters (deformation (d), strain (s)). Fluorescent pigments are used to fabricate speckle patterns on the tested surfaces, guaranteeing both high surface reflectivity and color difference of the markers. Then, a general information separation method is presented to thoroughly separate and acquire high-quality, non-interfering texture, fringe, and speckle patterns. Based on the separated raw information, multi-dimensional information sensing of complex surfaces can be realized using the developed DIC-assisted FPP method. The proposed method greatly upgrades the information sensing ability of the traditional FPP system without any extra hardware cost. Compared to the stereo-DIC method, which can simultaneously achieve 3D shape, deformation, and strain measurement, the proposed method has advantages in obtaining speckle-free texture, reconstructing the shapes of fine structures, computational efficiency, and hardware cost.

2. Methods

2.1 Principle of DIC-assisted-FPP system

The DIC-assisted FPP system is depicted in Fig. 1. Initially, the speckle pattern is fabricated on the tested surface, and then phase-shifting sinusoidal fringes and coded patterns are projected onto the surface and modulated by the object’s height distribution. The camera captures the deformed patterns from a different angle. Subsequently, the wrapped phase is extracted using a phase-shifting algorithm, and the absolute unwrapped phase is demodulated with the coding levels. The 3D shape is then reconstructed after system calibration. Meanwhile, the speckle patterns extracted from the phase-shifting patterns are used for DIC analysis to establish the correspondence of 3D point clouds at different sampling moments. Finally, deformation and strain are obtained by performing difference and differential operations between the corresponding point clouds.


Fig. 1. Principle of DIC-assisted-FPP system.


2.2 Contradictory requirements on surface reflectivity between FPP and DIC

The quality of the fringe and speckle information is critical to the accuracy of the 3D shape, deformation, and strain measurements. In actual measurements, image quality is affected by modulation, noise, and reflectivity. Considering these factors, and taking the three-step phase-shifting algorithm commonly used in dynamic measurements as an example, the fringe intensity captured by the camera can be described as

$${I_i}\left( {x,y} \right) = R\left( {x,y} \right)A\left( {x,y} \right) + R\left( {x,y} \right)B\left( {x,y} \right)\cos [\varphi \left( {x,y} \right) - {\delta _i}] + {\Delta }{n_i},i = 1,2,3,$$
in which A(x, y) is the background light intensity, B(x, y) is the fringe modulation, φ(x, y) represents the phase carrying the surface shape information of the object, R(x, y) is the surface reflectivity, δi = 2πi/3 is the phase shift of the i-th pattern, and Δni is noise obeying a Gaussian distribution. The wrapped phase can be expressed as:
$$\phi ({x,y}) = \arctan \left[\frac{\sum\nolimits_{i = 1}^N {I_i}({x,y})\sin {\delta _i}}{\sum\nolimits_{i = 1}^N {I_i}({x,y})\cos {\delta _i}}\right].$$
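
As a concrete illustration (our sketch, not the authors' code), the wrapped phase of Eq. (2) can be computed from an N-step phase-shifting sequence with NumPy, assuming phase shifts δi = 2πi/N:

```python
# Minimal sketch of Eq. (2): wrapped phase from an N-step phase-shifting
# sequence. `frames` is an (N, H, W) array of captured fringe images; the
# phase shifts are assumed to be delta_i = 2*pi*i/N, as in the text.
import numpy as np

def wrapped_phase(frames: np.ndarray) -> np.ndarray:
    n = frames.shape[0]
    deltas = 2 * np.pi * np.arange(1, n + 1) / n
    num = np.tensordot(np.sin(deltas), frames, axes=1)  # sum_i I_i sin(delta_i)
    den = np.tensordot(np.cos(deltas), frames, axes=1)  # sum_i I_i cos(delta_i)
    return np.arctan2(num, den)  # wrapped phase in (-pi, pi]
```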

According to Li’s noise model [36], the variance of the phase error Δφ caused by noise can be described as (x and y are omitted for simplicity):

$${\sigma ^2}_{\Delta\varphi } = \frac{2}{N}\frac{{{\sigma _{noise}}^2}}{{{R^2}{B^2}}}.$$

When the noise σnoise, phase-shifting step number N, and fringe modulation B are fixed, Eq. (3) shows that the standard deviation of the phase error σΔφ is inversely proportional to the reflectivity R. As shown for the color card in Fig. 2(a), the reconstruction results exhibit large errors in the black region where the reflectivity is low. The surface reflectivity of the object is therefore required to be high enough to ensure the accuracy of the phase measurement, which directly determines the shape measurement accuracy.


Fig. 2. Accuracy requirements for 3D shape measurement systems. (a) Reconstruction results of color cards and throw lines. (b) Optical path diagram for discontinuous reflectivity conditions. (c) Table tennis ball with different textures and the corresponding reconstruction results.


In addition, there is no precise pixel-to-pixel correspondence between the projector plane and the camera imaging plane in actual measurements, as shown in Fig. 2(b). Multiple projector pixels (u0, v0), (u1, v1), and (u2, v2) are received by one camera pixel (u1*, v1*), and when the reflectivity is uneven over a local region, this causes a phase error ΔϕR(x, y), which can be expressed as [37]:

$$\Delta {\phi _R}(x,y) = {\tan ^{ - 1}}\left[ {\frac{{R({{u_2},{v_2}} )- R({{u_1},{v_1}} )}}{{R({{u_2},{v_2}} )+ R({{u_1},{v_1}} )}}erf(\sqrt 2 \pi f\sigma )} \right],$$
in which R(u, v) denotes the surface reflectivity at the corresponding projector pixel. Equation (4) shows that the surface reflectivity R must be sufficiently uniform for higher phase accuracy (and hence shape accuracy); otherwise, it results in the shape reconstruction error of the table tennis ball shown in Fig. 2(c). This theoretical analysis demonstrates that the reflectivity of the object surface must be both high and uniform for high shape measurement accuracy in FPP.

On the other hand, according to Pan’s theoretical model of displacement measurement accuracy [38], the displacement error caused by noise in digital image correlation can be described as:

$${\sigma ^2}_{\Delta{dis}} = \frac{{{\sigma _{noise}}^2}}{{{B^2}\sum {\sum {{{\left( {{R_{xy}}} \right)}^2}} } }}.$$

From Eq. (5), when the noise σnoise and modulation B are fixed, the reflectivity gradient Rxy of the surface should be as large as possible to obtain higher displacement accuracy in actual measurements.

The above accuracy evaluations show that FPP and DIC impose contradictory requirements on surface reflectivity: FPP requires high and uniform reflectivity for high-quality fringe information, whereas DIC requires a high reflectivity gradient for high-contrast speckle information. Existing methods can only partially solve this problem. In addition, a speckle-free texture map is also critical for rendering the multi-dimensional imaging result, which is likewise unavailable in existing methods.

2.3 Multi-dimensional information sensing based on DIC-assisted FPP

This contradiction is difficult to resolve in intensity space, so we introduce color space and propose a multi-dimensional information sensing method. Firstly, colored fluorescent marker points are fabricated on the surface of the tested object. Then, a general information separation method is proposed that exploits the color difference between the background and the speckles in the RGB color space, so that the fringe and speckle information can be completely separated by solving for the optimal conversion coefficients. Finally, using the separated speckle-free fringe and texture patterns and the high-contrast speckle pattern, multi-dimensional information sensing covering 2D texture, 4D shape, and analytical-dimension mechanical parameters can be achieved with the off-the-shelf FPP system, as shown in Fig. 3(b). Compared with a monochrome camera, in which the different kinds of information interfere, the color camera and the demodulation algorithm also introduce some accuracy loss; however, after the fringes and speckles are separated by the general information separation method, this loss remains within an acceptable range.


Fig. 3. Comparison of information accessibility between (a) traditional FPP measurement system and (b) FPP-based multi-dimensional information sensing system.


2.3.1 General information separation method based on RGB coefficients conversion

To resolve these contradictory requirements on surface reflectivity, a general information separation method is proposed:

$$\begin{pmatrix} I_f^b & I_e^b\\ I_f^s & I_e^s \end{pmatrix} = \begin{pmatrix} I_r^b & I_g^b & I_b^b\\ I_r^s & I_g^s & I_b^s \end{pmatrix} \begin{pmatrix} F_1 & E_1\\ F_2 & E_2\\ F_3 & E_3 \end{pmatrix}.$$

In this equation, $I_r^b$, $I_g^b$, and $I_b^b$ represent the three-channel (RGB) intensities of the object surface; $I_r^s$, $I_g^s$, and $I_b^s$ represent those of the marker information. $I_f^s$ and $I_e^s$ represent the intensity values of the marker information transformed by the speckle-free coefficients F and the speckle-extraction coefficients E; $I_f^b$ and $I_e^b$ represent the intensity values of the background information transformed by the coefficients F and E, respectively. We then solve for the unknown coefficients F that enhance and equalize the magnitudes of $I_f^b$ and $I_f^s$ (fringe intensity), and for another set E that maximizes the difference ΔI = $I_e^b$ − $I_e^s$ based on the intensity information. The ideal result after conversion is shown in Fig. 4, which enables us to demodulate speckle information with a significant reflectivity gradient and fringe information with high, uniform reflectivity. The key steps are selecting the speckle color and determining the values of the unknown coefficients.
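
A minimal sketch of the per-pixel transformation in Eq. (6), applied to a captured RGB image (our illustration; the coefficient values shown are placeholders, not values from the paper):

```python
# Sketch of Eq. (6): project an (H, W, 3) RGB capture onto a speckle-free
# channel I_f and a speckle-extraction channel I_e with coefficient
# triplets F and E (placeholder values for illustration only).
import numpy as np

def separate(rgb: np.ndarray, F, E):
    coeffs = np.stack([np.asarray(F, float), np.asarray(E, float)], axis=1)  # (3, 2)
    out = rgb.astype(float) @ coeffs  # (H, W, 2), matmul over the RGB axis
    return out[..., 0], out[..., 1]   # I_f, I_e

# Example with hypothetical coefficients:
# I_f, I_e = separate(img, F=[0.4, 0.3, 0.3], E=[0.2, 0.5, 0.3])
```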


Fig. 4. Schematic diagram of the general reflectivity gradient modulation method.


First and foremost, the selection of the speckle color plays a crucial role in determining the conversion coefficients and ensuring the accuracy of shape and deformation analysis. When dealing with a test component whose background color is already determined, the high reflectivity and vividness of fluorescent speckles suit our model [39,40], as they ensure high modulation MI and color difference DC. This makes it possible to distinguish background and speckle areas by computing the histogram distribution in the color channel with the largest color difference. Therefore, to find the most suitable speckle color, we quantify the speckle intensity with the standard luminance equation and the color difference with the Euclidean distance in the RGB color space:

$$\left\{ \begin{array}{l} {M_I} = 0.299I_{_r}^s + 0.587I_{_g}^s + 0.114I_{_b}^s\\ {D_C} = \sqrt {{{(I_{_r}^b - I_{_r}^s)}^2} + {{(I_{_g}^b - I_{_g}^s)}^2} + {{(I_{_b}^b - I_{_b}^s)}^2}} \\ I_{_r}^s,I_{_g}^s,I_{_b}^s,I_{_r}^b,I_{_g}^b,I_{_b}^b \in [0,255] \end{array} \right..$$

Using the maximum of MI + DC as the objective function, the optimization goal is to maximize the speckle reflectivity and color difference simultaneously. The optimal speckle color can be obtained by a boundary-constrained multi-parameter optimization method [41]:

$$\left\{ \begin{array}{l} \textrm{Objective function}:\ \max ({D_C}(I_r^s,I_g^s,I_b^s) + {M_I}(I_r^s,I_g^s,I_b^s))\\ \textrm{Bound constraints}:\ I_r^s,I_g^s,I_b^s \in [0, 255] \end{array} \right..$$
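
A sketch of this bounded search using scipy (our illustration; the paper uses the boundary-constrained method of Ref. [41], so the optimizer here is a stand-in):

```python
# Sketch of Eq. (8): search for the speckle RGB that maximizes M_I + D_C
# within [0, 255], given a known background color. L-BFGS-B is a stand-in
# for the optimizer of Ref. [41]; a multi-start or global optimizer may be
# preferred, since the objective is not guaranteed to be unimodal.
import numpy as np
from scipy.optimize import minimize

def best_speckle_color(bg_rgb):
    bg = np.asarray(bg_rgb, dtype=float)
    luma = np.array([0.299, 0.587, 0.114])     # M_I weights from Eq. (7)

    def neg_objective(s):                      # minimize -(M_I + D_C)
        return -(luma @ s + np.linalg.norm(bg - s))

    res = minimize(neg_objective, x0=np.full(3, 128.0),
                   method="L-BFGS-B", bounds=[(0.0, 255.0)] * 3)
    return res.x

# e.g. for a dark-red background: best_speckle_color([120, 30, 30])
```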

After acquiring the three-channel intensities of the background and speckles of the object, the indeterminate equations for F and E are obtained:

$$\left\{ \begin{array}{l} I_f^b = I_f^s = {F_1}I_r^b + {F_2}I_g^b + {F_3}I_b^b = {F_1}I_r^s + {F_2}I_g^s + {F_3}I_b^s\\ \Delta I = {E_1}\Delta {I_r} + {E_2}\Delta {I_g} + {E_3}\Delta {I_b} = {E_1}(I_r^b - I_r^s) + {E_2}(I_g^b - I_g^s) + {E_3}(I_b^b - I_b^s) \end{array} \right.,$$
in which maximizing the fringe intensity $I_f$ with the coefficients F and maximizing the speckle contrast ΔI with the coefficients E are the optimization objectives.

Equation (9) alone does not have a unique solution for the optimal speckle-free coefficients F and speckle-extraction coefficients E. Therefore, maximizing the fringe’s signal-to-noise ratio is introduced as an additional constraint to find the optimal solution. Since most color cameras use a Bayer array as the color filter array [42], the distribution characteristics of the Bayer array make the noise in the green channel about half of that in the red and blue channels; thus the intensity and noise models are established:

$$\left\{ \begin{array}{l} {\sigma _F} = |{F_1}{\sigma _r}| + |{F_2}{\sigma _g}| + |{F_3}{\sigma _b}| = (2{F_1} + {F_2} + 2{F_3}){\sigma _g}\\ f({F_1},{F_2},{F_3}) = \left( \dfrac{I_f^b}{{\sigma _F}} \right)_{\max }\\ {\sigma _E} = |{E_1}{\sigma _r}| + |{E_2}{\sigma _g}| + |{E_3}{\sigma _b}| = (2{E_1} + {E_2} + 2{E_3}){\sigma _g}\\ f({E_1},{E_2},{E_3}) = \left( \dfrac{\Delta I}{{\sigma _E}} \right)_{\max }\\ {F_1},{F_2},{F_3},{E_1},{E_2},{E_3} \in (0,1) \end{array} \right.,$$
in which σr, σg, and σb are the noise in the R, G, and B channels, and σF and σE represent the noise of the speckle-free and speckle-extraction patterns, respectively.

As shown in Fig. 5, taking the speckle-free coefficients F as an example, we first set an initial value F1 that grows steadily from zero. For each F1, we calculate F2, F3, and the signal-to-noise ratio according to Eqs. (9) and (10). The signal-to-noise ratio first increases and then decreases, and we obtain the best coefficients F at its peak (the largest signal-to-noise ratio).
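
The following sketch illustrates this sweep under one explicit assumption of ours: Eq. (9) pins down only the direction of F, so we close the system with the normalization F1 + F2 + F3 = 1 (not stated in the paper) to solve F2 and F3 for each trial F1 before evaluating the SNR of Eq. (10):

```python
# Sketch of the Fig. 5 search for the speckle-free coefficients F.
# Assumption (ours): Eq. (9) gives F . (I_b - I_s) = 0, and we add
# F1 + F2 + F3 = 1; the SNR of Eq. (10) is scale-invariant in F, so
# this normalization does not change the location of the optimum.
import numpy as np

def sweep_F(I_b, I_s, sigma_g=1.0):
    I_b, I_s = np.asarray(I_b, float), np.asarray(I_s, float)
    d = I_b - I_s
    best_F, best_snr = None, -np.inf
    for F1 in np.linspace(0.01, 0.99, 99):
        # Solve F2, F3 from: F . d = 0 and F1 + F2 + F3 = 1
        A = np.array([[d[1], d[2]], [1.0, 1.0]])
        b = np.array([-F1 * d[0], 1.0 - F1])
        try:
            F2, F3 = np.linalg.solve(A, b)
        except np.linalg.LinAlgError:
            continue
        F = np.array([F1, F2, F3])
        if np.any(F <= 0) or np.any(F >= 1):   # keep F in (0, 1), per Eq. (10)
            continue
        snr = (F @ I_b) / ((2 * F[0] + F[1] + 2 * F[2]) * sigma_g)
        if snr > best_snr:
            best_F, best_snr = F, snr
    return best_F
```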


Fig. 5. Optimization of speckle-free coefficients F and extraction coefficients E.


In the actual experiments, for objects with a uniformly colored texture, after determining the best speckle-free coefficients F and speckle-extraction coefficients E, every pixel in all color patterns is multiplied by the same conversion coefficients. For objects with a complex color texture, the texture areas are divided according to the different texture colors, and speckles and textures are distinguished within each area according to the color differences between them. Speckle-free textures and speckle-extraction patterns are then obtained by calculating the corresponding coefficients for each area and multiplying each color texture area in all color patterns by its corresponding coefficients. This process is a linear transformation, which ensures that the sinusoidal nature of the modulated phase-shifting fringes remains unchanged, and hence the accuracy of the wrapped phase calculated with the phase-shifting algorithm. The generality of this model lies in allowing us to determine the most suitable speckle color for any background color. Even when the speckle color is not optimal, the fringe information can still be recovered with high, uniform reflectivity and the marker information can simultaneously be retrieved with a large reflectivity gradient; however, the signal-to-noise ratio will be lower than with the best speckle color.

2.3.2 Multi-dimensional information sensing

After obtaining the grayscale images converted by the two sets of coefficients, we further calculate the speckle-free texture, high-contrast speckle, shape, displacement, and strain to achieve multi-dimensional information sensing through the steps shown in Fig. 6.


Fig. 6. Calculation process for multi-dimensional information.


We first calculate the 3D shape from the speckle-free fringe patterns using the FPP method. At the same time, to mitigate the effects of ambient light changes and residual fringes, the modulation component is extracted from the phase-shifting patterns to serve as the speckle-free texture and high-quality speckle patterns.

$$M(x,y) = \frac{2}{3}\sqrt {{{\left[ {\sum\limits_{i = 1}^3 {{I_i}({x,y} )} \sin ({2\pi i/3} )} \right]}^2} + {{\left[ {\sum\limits_{i = 1}^3 {{I_i}({x,y} )} \cos ({2\pi i/3} )} \right]}^2}} .$$
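
A minimal sketch of Eq. (11) for the three-step case (our illustration):

```python
# Sketch of Eq. (11): modulation map M(x, y) from three phase-shifting
# fringe images, used here as the speckle-free texture / speckle map.
import numpy as np

def modulation(frames: np.ndarray) -> np.ndarray:
    """frames: (3, H, W) stack of three-step phase-shifting images."""
    i = np.arange(1, 4)
    s = np.tensordot(np.sin(2 * np.pi * i / 3), frames, axes=1)
    c = np.tensordot(np.cos(2 * np.pi * i / 3), frames, axes=1)
    return (2.0 / 3.0) * np.sqrt(s**2 + c**2)
```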

Texture maps are extracted for better visualization and analysis, while the speckle maps are employed to track corresponding points of the reconstructed 3D shape before and after deformation. To optimize the positions of the corresponding points in the deformed image, we select the zero-mean normalized sum of squared differences (ZNSSD) as the correlation criterion [43].

$${C_{ZNSSD}}(P) = \sum {{{\left[ {\frac{{F({x,y} )- \bar{F}}}{{\sqrt {\sum {_\Omega } {{({F({x,y} )- \bar{F}} )}^2}} }} - \frac{{G({{x^ \ast },{y^ \ast }} )- \overline G }}{{\sqrt {\sum {_\Omega } {{({G({{x^ \ast },{y^ \ast }} )- \overline G } )}^2}} }}} \right]}^2}} ,$$
in which Ω represents the selected subregion in the reference image, F(x, y) and G(x*, y*) represent the intensities in the reference and deformed images, F̄ and Ḡ are their means over Ω, and p = (u, v, ux, uy, vx, vy) is the deformation parameter vector that determines the coordinate changes of the corresponding points between the two subregion images.
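
For two already-extracted subsets, the ZNSSD of Eq. (12) reduces to the following sketch (ours):

```python
# Sketch of Eq. (12): ZNSSD between a reference subset F_sub and a deformed
# subset G_sub (both (K, K) arrays sampled at the warped coordinates).
import numpy as np

def znssd(F_sub: np.ndarray, G_sub: np.ndarray) -> float:
    f = F_sub - F_sub.mean()
    g = G_sub - G_sub.mean()
    f = f / np.sqrt(np.sum(f**2))     # zero-mean, unit-norm reference subset
    g = g / np.sqrt(np.sum(g**2))     # zero-mean, unit-norm deformed subset
    return float(np.sum((f - g)**2))  # 0 for a perfect match, at most 4
```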

In the fringe projection measurement system, all deformation patterns are obtained from the same perspective at different times, and the IC-GN algorithm with first-order deformation parameters can meet the temporal matching requirements of most deformation measurements in engineering applications. Therefore, this paper adopts the first-order IC-GN algorithm to track changes along the time axis [44]:

$$\left\{ \begin{array}{l} {x^ \ast } = x + u + {u_x}({x - {x_0}} )+ {u_y}({y - {y_0}} )\\ {y^ \ast } = y + v + {v_x}({x - {x_0}} )+ {v_y}({y - {y_0}} )\end{array} \right..$$
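
In code, the first-order shape function of Eq. (13) is simply (our sketch):

```python
# Sketch of Eq. (13): first-order shape function mapping a reference subset
# point (x, y) around the subset center (x0, y0) to its deformed location,
# given the parameter vector p = (u, v, ux, uy, vx, vy).
def warp(x, y, x0, y0, p):
    u, v, ux, uy, vx, vy = p
    x_star = x + u + ux * (x - x0) + uy * (y - y0)
    y_star = y + v + vx * (x - x0) + vy * (y - y0)
    return x_star, y_star
```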

After establishing the positional relationship of the corresponding points at different moments, accurate 3D displacement or deformation in the time domain is obtained by directly subtracting the 3D coordinates of the corresponding points, combined with the reconstructed 3D shape information [31]. Because the high-speed camera collects discrete digital images, the coordinates of the matched points after deformation may fall at sub-pixel positions during dynamic measurement, so the 3D coordinates at sub-pixel points are calculated by the cubic spline interpolation method [45].

$$\left\{ \begin{array}{l} U(x,y) = X({{x^ \ast },{y^ \ast },{t_2}} )- X({x,y,{t_1}} )\\ V(x,y) = Y({{x^ \ast },{y^ \ast },{t_2}} )- Y({x,y,{t_1}} )\\ W(x,y) = Z({{x^ \ast },{y^ \ast },{t_2}} )- Z({x,y,{t_1}} )\end{array} \right..$$
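
A sketch of Eq. (14) combined with the cubic spline interpolation (ours, using scipy; `X1` and `X2` are hypothetical names for the X-coordinate maps at t1 and t2):

```python
# Sketch of Eq. (14): sample the t2 coordinate map at the sub-pixel matches
# (xs, ys) found by DIC via a bicubic spline, then subtract the t1 values at
# the integer reference pixels (x, y). Apply likewise to Y and Z for V, W.
import numpy as np
from scipy.interpolate import RectBivariateSpline

def displacement_U(X1, X2, xs, ys, x, y):
    h, w = X2.shape
    spline = RectBivariateSpline(np.arange(h), np.arange(w), X2)  # cubic by default
    return spline.ev(ys, xs) - X1[y, x]   # U = X(x*, y*, t2) - X(x, y, t1)
```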

For the acquisition of strain information, the displacement field is further differentiated [46]:

$${\varepsilon _{xx}} = \frac{{\partial \Delta x}}{{\partial x}},{\varepsilon _{yy}} = \frac{{\partial \Delta y}}{{\partial y}},{\varepsilon _{xy}} = {\varepsilon _{yx}} = \frac{1}{2}(\frac{{\partial \Delta x}}{{\partial y}} + \frac{{\partial \Delta y}}{{\partial x}}),$$
in which εxx represents the transverse strain, εyy the longitudinal strain, and εxy the shear strain.
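
A sketch of Eq. (15) using simple central differences (ours; Ref. [46] instead uses a Savitzky-Golay digital differentiator, which is more robust to noise):

```python
# Sketch of Eq. (15): strains from the in-plane displacement maps U, V by
# central differences; `step` is the sample pitch in the same length units.
import numpy as np

def strains(U, V, step=1.0):
    dUdy, dUdx = np.gradient(U, step)   # axis 0 -> y, axis 1 -> x
    dVdy, dVdx = np.gradient(V, step)
    e_xx, e_yy = dUdx, dVdy             # transverse and longitudinal strain
    e_xy = 0.5 * (dUdy + dVdx)          # shear strain
    return e_xx, e_yy, e_xy
```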

3. Experiments and results

An FPP-based system consisting of a high-speed camera and a projector was developed. We then conducted experiments to evaluate the measuring accuracy of shape and deformation and to achieve multi-dimensional information sensing of complex surfaces.

3.1 Accuracy evaluation experiment

3.1.1 Shape accuracy evaluation

To verify the reconstruction accuracy of the proposed speckle-free method in shape reconstruction, we conducted a comparative experiment on table tennis balls before and after spraying speckles and evaluated the diameter error. Firstly, the diameter of the table tennis ball without sprayed speckles was measured to be 39.3994 mm using a 12-step phase-shifting measurement, which serves as the ground truth for the evaluation. Figure 7(a) shows the three-step fringe patterns captured in the experiment, where the table tennis ball was sprayed with green fluorescent speckles. Figure 7(b) presents a reflectivity comparison between the traditional intensity extraction method (eliminating hue and saturation while retaining a luminance matched to the human eye’s sensitivity curve) and the proposed method. Figures 7(d)-7(e) display the 3D reconstruction and sphere-fitting results of the table tennis ball obtained by the proposed method. The difference (ΔD) in diameter between the fitted ball and the ground truth is 0.0344 mm, with a standard deviation (SD) of 0.0942 mm. As a comparison, Fig. 7(f) shows the result reconstructed directly with the intensity extraction coefficients; the reconstructed surface has many irregularities, with a standard deviation of 0.1279 mm. The results indicate that the proposed method obtains high-quality fringe information and performs better in shape reconstruction.


Fig. 7. Accuracy evaluation of 3D shape on a table tennis ball. (a) Captured patterns. (b) Intensity extraction method and proposed method. (c) Profile line of the fringes. (d)-(e) 3D reconstruction and fitting results using speckle-free coefficients. (f) 3D reconstruction and fitting results using intensity extraction coefficients.

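The diameter evaluation relies on fitting a sphere to the reconstructed point cloud; a linear least-squares sketch (ours; the paper does not give its fitting procedure) is:

```python
# Sketch of a least-squares sphere fit for the diameter evaluation:
# |p - c|^2 = r^2 is rewritten as |p|^2 = 2 c . p + k with k = r^2 - |c|^2,
# which is linear in (c, k). pts: (N, 3) reconstructed points.
import numpy as np

def fit_sphere(pts: np.ndarray):
    A = np.hstack([2 * pts, np.ones((len(pts), 1))])
    b = np.sum(pts**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, k = sol[:3], sol[3]
    radius = np.sqrt(k + center @ center)
    return center, 2 * radius   # center and fitted diameter
```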

3.1.2 Deformation accuracy evaluation

To quantitatively evaluate the accuracy of the speckle-extraction method in the established deformation measurement system, we set up the following experiment for displacement accuracy evaluation.

Firstly, a standard plate with contrasting speckle patterns was designed, and precise positioning was achieved using a translation stage (Zolix PA400), which was moved independently three times in each of the X, Y, and Z directions at 10 mm intervals. The positions of the test plate and the speckle-extraction results are shown in Fig. 8(a). The displacement calculation results after moving the translation stage are shown in Fig. 8(b), and the cross profiles of the middle row are shown in Fig. 8(c). Figure 8(d) shows the error distribution between the measured displacement of the standard plane and its fitted planes at nine positions, demonstrating the flatness of the measured displacement plane. Finally, the translation stage readings (10 mm, 20 mm, and 30 mm) are used as ground truth to compute the displacement difference (ΔD) and root mean square error (RMSE) in X, Y, and Z, as listed in Table 1. The maximum displacement difference between the measured data and the ground truth is 0.1311 mm, and the RMSE is within 0.1085 mm.


Fig. 8. Evaluation of the deformation accuracy of this measuring system under the extraction coefficients. (a) Measuring positions of the standard flat and a reflectivity gradient modulation result of speckles. (b) X, Y, and Z displacement measurements when the movement in three directions is 10 mm. (c) Displacement curve of the middle row. (d) Errors between the displacements and their fitted planes at nine positions.



Table 1. Deformation errors (ΔD) and RMSE of the nine positions

The measured results indicate that the accuracy of the X and Y displacements is far higher than that of the Z displacement, because the accuracy of the standard plate in the Z direction is limited by the depth of field of the projector and the camera, which degrades the sinusoidal quality of the fringes. These results verify that our measuring system can simultaneously obtain accurate in-plane and out-of-plane deformation.

3.2 Multi-dimensional information sensing for complex surface

3.2.1 Measurement on the composite braided structure

To demonstrate the superiority of the proposed method in dynamic measurement of complex objects, we built the measurement system shown in Fig. 9 to perform dynamic multi-dimensional analysis of fine structures and damping structures. It includes a projector with a resolution of 912 × 1140 pixels (DLP LightCrafter 4500), a CMOS color camera with a resolution of 1096 × 1600 pixels (HB-1800-SC), and the tested object. A shifting Gray-code light projection with a time-overlapping coding strategy was used, in which the period of the phase-shifting fringes was 32; two Gray-code patterns were embedded after each set of phase-shifting fringes, and jump errors were avoided using the tripartite phase-unwrapping method [47]. The projector and camera are synchronized at 588 fps, giving a 3D reconstruction rate of 588/5 = 117.6 fps.


Fig. 9. Main view and vertical view of the experimental device.


Braided composite structures have excellent mechanical properties [48] and have been widely used in the automotive, aerospace, and other fields. As shown in the vertical view of Fig. 9, the experiment simulates the impact process of a woven structure squeezed by a conical object at constant velocity, which is helpful for quantitatively studying the relationship between the woven structure's distribution and its mechanical properties.

Figure 10 shows the complete measurement process of the proposed DIC-assisted FPP system, which realizes multi-dimensional visualization and analysis of fine structures. It is mainly divided into the following steps:


Fig. 10. Measured results of the braided composite structure. (a) Captured temporally overlapping encoded (3 + 2) patterns. (b) Information separation result. (c) Multi-dimensional information. (d)-(f) Strain maps (εxx, εxy, and εyy) at five moments. (g)-(h) Detail amplification of εxx and εyy at 85.30 ms. (Visualization 1)


Step 1: As shown in Fig. 10(a), after fabricating highly reflective, bright fluorescent speckles on the surface of the object and projecting the grayscale fringes, the time-overlapping coding strategy [21] is used to project the binary and Gray-code patterns and capture them simultaneously.

Step 2: As shown in Fig. 10(b), the speckle-free coefficients F and speckle-extraction coefficients E of the captured patterns are calculated by the general information separation method; these two sets of coefficients are used to convert the captured patterns into two sets of fringe patterns, and the modulation component is used to extract the fringe-free texture and high-contrast speckle information.

Step 3: As shown in Fig. 10(c), the speckle-free fringe patterns are used to reconstruct 3D shapes, and new 3D data are updated with every 5 patterns obtained. At the same time, the texture information is combined with the 3D results to obtain a 3D texture rendering for auxiliary strain analysis. The speckle patterns are used for 2D DIC matching and tracking; the 3D shape and 2D matching information together achieve 3D coordinate matching, displacement information is obtained by subtracting the three-dimensional coordinates, and partial derivatives are further calculated to obtain the transverse, longitudinal, and shear strains.

Step 4: Figures 10(d)-(f) show the multi-dimensional information sensing combining time-varying texture, shape, deformation, and strain analyses, which provides a more intuitive rendering of the state of a complex surface under stress in a real scene than traditional analytical methods.

Figures 10(g)-(h) analyze the texture and strain of the composite braided structure under impact; dynamic results are shown in Visualization 1. The intersection of the transverse and vertical axes closest to the stress point of the woven object has the largest strain, and the force at the structure's interlacing points is greater than at the surrounding points. Moreover, the tiny protruding texture on the surface also guides the stress release, which verifies the non-uniform stress of the braided structure and thus reveals the structure's dislocations and the material's inherent defects. The experiments show that the method can help researchers analyze the defects and mechanical properties of samples with fine structures.

3.2.2 Measurement on the damping structure

Then, the method was used to measure a scene in which a foam insole was put under pressure during walking. The separation of the fringe and speckle patterns is shown in Fig. 11(a). The dynamically reconstructed shape at three representative moments is shown in Fig. 11(b); the complex granular shape of the surface is accurately reconstructed. The calculated strain at these three moments is shown in Fig. 11(c), and dynamic results are shown in Visualization 2. When the pressure increased to its maximum, the compression of the insole reached its maximum, the lateral strain spread horizontally, and the longitudinal strain concentrated at the sole contact point. This experiment demonstrates that the proposed method offers an alternative approach for analyzing and validating the correlation between fine structures and forces.


Fig. 11. Measurement result of damping structure. (a) Actual scenario and information analysis. (b) Measurement result of shape. (c) Strain maps at corresponding moments. (Visualization 2)


4. Conclusions

In this paper, a general information separation method is proposed to realize multi-dimensional information sensing, including speckle-free 2D texture (T), accurate 4D dynamic shape (x, y, z, t), and analytical-dimension mechanical parameters (deformation (d), strain (s)), by combining the advantages of FPP and DIC in their respective measurement dimensions. The proposed method satisfies both the FPP and DIC requirements on surface reflectivity, thus upgrading the measuring dimensions of the conventional FPP system. Compared with existing methods, the proposed method stands out in the following aspects:

  • General and superior performance in information separation. Deformation and strain measurements require high-contrast, dense speckles for accurate deformation analysis, which usually spoils the final texture imaging. In contrast, the proposed method obtains both a high-quality speckle-free original texture and high-contrast speckle maps, as if no speckles had been fabricated for the 2D texture and 4D shape. In actual experiments, even for objects with complex color textures, the relative positions of the fabricated speckles and textures remain constant during deformation, so we can divide the areas according to the different colors and calculate the corresponding coefficients for each area to obtain the speckle-free and speckle-extraction information. This information can be calculated for any object and speckle color by applying the separation equations, except that the signal-to-noise ratio will be lower than with the optimal speckle color. The method thoroughly overcomes the inherent contradiction between FPP and DIC and makes it possible to take full advantage of these two techniques.
  • Multi-dimensional information perception capability. Our measurement method realizes the acquisition of 2D texture and 3D geometry and the analysis of deformation and strain on complex surfaces based on an off-the-shelf FPP system without any additional hardware cost. By providing these sensing results, the proposed method offers an alternative way of bridging the 2D texture, 3D geometry, and mechanical state of complex specimens.

Funding

National Natural Science Foundation of China (62205226, 62075143); National Postdoctoral Program for Innovative Talents (BX2021199); Key Research and Development Program of Jiangxi Province (20224AAC01011); China Postdoctoral Science Foundation (2022M722290); Fundamental Research Funds for the Central Universities (2022SCU12010).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. S. Heist, C. Zhang, K. Reichwald, et al., “5D hyperspectral imaging: fast and accurate measurement of surface shape and spectral characteristics using structured light,” Opt. Express 26(18), 23366–23379 (2018). [CrossRef]  

2. K. R. Ford, G. D. Myer, and T. E. Hewett, “Reliability of landing 3D motion analysis: implications for longitudinal analyses,” Med. Sci. Sports Exerc. 39(11), 2021–2028 (2007). [CrossRef]

3. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128–160 (2011). [CrossRef]  

4. J. Xu and S. Zhang, “Status, challenges, and future perspectives of fringe projection profilometry,” Opt. Lasers Eng. 135, 106193 (2020). [CrossRef]  

5. F. Patrona, A. Chatzitofis, D. Zarpalas, et al., “Motion analysis: Action detection, recognition and evaluation based on motion capture data,” Pattern Recognit. 76, 612–622 (2018). [CrossRef]

6. S. Zhang, “High-speed 3D shape measurement with structured light methods: A review,” Opt. Lasers Eng. 106, 119–131 (2018). [CrossRef]  

7. C. Zhou and J. G. Chase, “Low-cost structured light imaging of regional volume changes for use in assessing mechanical ventilation,” Comput. Methods Programs Biomed. 226, 107176 (2022). [CrossRef]

8. F. Grytten, E. Fagerholt, T. Auestad, et al., “Out-of-plane deformation measurements of an aluminium plate during quasi-static perforation using structured light and close-range photogrammetry,” Int. J. Solids Struct. 44(17), 5752–5773 (2007). [CrossRef]  

9. Q. Zhang, L. Huang, Y.-W. Chin, et al., “4D metrology of flapping-wing micro air vehicle based on fringe projection,” Proc. SPIE 8769, 87692Y–760 (2013). [CrossRef]  

10. C. Jiang, P. Kilcullen, Y. Lai, et al., “High-speed dual-view band-limited illumination profilometry using temporally interlaced acquisition,” Photonics Res. 8(11), 1808–1817 (2020). [CrossRef]  

11. J. Cheng, L. Zhang, Q. Chen, et al., “A review of visual SLAM methods for autonomous driving vehicles,” Eng. Appl. Artif. Intell. 114, 104992 (2022). [CrossRef]

12. J. Jiao, “Machine learning assisted high-definition map creation,” in 2018 IEEE 42nd Annual Computer Software and Applications Conference (IEEE, 2018), pp. 367–373.

13. S. Lei and S. Zhang, “Flexible 3-D shape measurement using projector defocusing,” Opt. Lett. 34(20), 3080–3082 (2009). [CrossRef]  

14. P. Wissmann, F. Forster, and R. Schmitt, “Fast and low-cost structured light pattern sequence projection,” Opt. Express 19(24), 24657–24671 (2011). [CrossRef]  

15. S. Heist, A. Mann, P. Kühmstedt, et al., “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53(11), 112208 (2014). [CrossRef]

16. Y. Wu, G. Wu, L. Li, et al., “Inner shifting-phase method for high-speed high-resolution 3-D measurement,” IEEE Trans. Instrum. Meas. 69(9), 7233–7239 (2020). [CrossRef]  

17. X. He, D. Zheng, Q. Kemao, et al., “Quaternary gray-code phase unwrapping for binary fringe projection profilometry,” Opt. Lasers Eng. 121, 358–368 (2019). [CrossRef]  

18. Y. Wang, J. I. Laughner, I. R. Efimov, et al., “3D absolute shape measurement of live rabbit hearts with a superfast two-frequency phase-shifting technique,” Opt. Express 21(5), 5822–5832 (2013). [CrossRef]  

19. S. Heist, P. Lutzke, I. Schmidt, et al., “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016). [CrossRef]  

20. C. Zuo, T. Tao, S. Feng, et al., “Micro Fourier transform profilometry (µFTP): 3D shape measurement at 10,000 frames per second,” Opt. Lasers Eng. 102, 70–91 (2018). [CrossRef]  

21. Z. Wu, W. Guo, Y. Li, et al., “High-speed and high-efficiency three-dimensional shape measurement based on Gray-coded light,” Photonics Res. 8(6), 819–829 (2020). [CrossRef]  

22. Z. Wu, W. Guo, Q. Zhang, et al., “Time-overlapping structured-light projection: high performance on 3D shape measurement for complex dynamic scenes,” Opt. Express 30(13), 22467–22486 (2022). [CrossRef]  

23. B. Pan, “Digital image correlation for surface deformation measurement: historical developments, recent advances and future goals,” Meas. Sci. Technol. 29(8), 082001 (2018). [CrossRef]  

24. M. Palanca, G. Tozzi, and L. Cristofolini, “The use of digital image correlation in the biomechanical area: a review,” Int. Biomech. 3(1), 1–21 (2016). [CrossRef]  

25. L. Yang, C. Hou, W. Zhu, et al., “Monitoring the failure process of cemented paste backfill at different curing times by using a digital image correlation technique,” Constr. Build. Mater. 346, 128487 (2022). [CrossRef]

26. L. B. Andraju and G. Raju, “Damage characterization of CFRP laminates using acoustic emission and digital image correlation: Clustering, damage identification and classification,” Eng. Fract. Mech. 277, 108993 (2023). [CrossRef]  

27. C. J. Tay, C. Quan, T. Wu, et al., “Integrated method for 3-D rigid-body displacement measurement using fringe projection,” Opt. Eng. 43(5), 1152–1159 (2004). [CrossRef]

28. L. Felipe-Sesé and F. A. Díaz, “Damage methodology approach on a composite panel based on a combination of Fringe Projection and 2D Digital Image Correlation,” Mech. Syst. Signal Process. 101, 467–479 (2018). [CrossRef]

29. H. Shi, H. Ji, G. Yang, et al., “Shape and deformation measurement system by combining fringe projection and digital image correlation,” Opt. Lasers Eng. 51(1), 47–53 (2013). [CrossRef]  

30. T. N. Nguyen, J. M. Huntley, R. L. Burguete, et al., “Shape and displacement measurement of discontinuous surfaces by combining fringe projection and digital image correlation,” Opt. Eng. 50(10), 101505 (2011). [CrossRef]

31. Z. Wu, W. Guo, B. Pan, et al., “A DIC-assisted fringe projection profilometry for high-speed 3D shape, displacement and deformation measurement of textured surfaces,” Opt. Lasers Eng. 142, 106614 (2021). [CrossRef]  

32. P. Siegmann, V. Álvarez-Fernández, F. Díaz-Garrido, et al., “A simultaneous in- and out-of-plane displacement measurement method,” Opt. Lett. 36(1), 10–12 (2011). [CrossRef]

33. L. Felipe-Sese, P. Siegmann, F. A. Diaz, et al., “Simultaneous in-and-out-of-plane displacement measurements using fringe projection and digital image correlation,” Opt. Lasers Eng. 52, 66–74 (2014). [CrossRef]  

34. Z. Wu, W. Guo, Z. Chen, et al., “Three-dimensional shape and deformation measurement on complex structure parts,” Sci. Rep. 12(1), 7760 (2022). [CrossRef]

35. Z. Wu, Z. Chen, Z. Chen, et al., “Chromatic DIC-assisted fringe projection profilometry for shape, deformation and strain measurement with intensity-chroma space analysis,” IEEE Trans. Instrum. Meas. 72, 1–13 (2023). [CrossRef]

36. J. Li, L. G. Hassebrook, and C. Guan, “Optimized two-frequency phase-measuring-profilometry light-sensor temporal-noise sensitivity,” J. Opt. Soc. Am. A 20(1), 106–115 (2003). [CrossRef]  

37. N. Lyu, H. Yu, X. Xu, et al., “Structured light 3-D sensing for scenes with discontinuous reflectivity: error removal based on scene reconstruction and normalization,” Opt. Express 31(12), 20134–20149 (2023). [CrossRef]  

38. B. Pan, H. Xie, Z. Wang, et al., “Study on subset size selection in digital image correlation for speckle patterns,” Opt. Express 16(10), 7037–7048 (2008). [CrossRef]  

39. Z. Hu, T. Xu, X. Wang, et al., “Fluorescent digital image correlation techniques in experimental mechanics,” Sci. China Technol. Sci. 61(1), 21–36 (2018). [CrossRef]  

40. Y. Dong and B. Pan, “A review of speckle pattern fabrication and assessment for digital image correlation,” Exp. Mech. 57(8), 1161–1181 (2017). [CrossRef]

41. N. H. El-Farra and P. D. Christofides, “Bounded robust control of constrained multivariable nonlinear processes,” Chem. Eng. Sci. 58(13), 3025–3047 (2003). [CrossRef]  

42. B. Bayer, “Color imaging array,” U.S. patent 3,971,065 (1976).

43. B. Pan, H. Xie, and Z. Wang, “Equivalence of digital image correlation criteria for pattern matching,” Appl. Opt. 49(28), 5501–5509 (2010). [CrossRef]  

44. H. Bruck, S. McNeill, M. A. Sutton, et al., “Digital image correlation using Newton-Raphson method of partial differential correction,” Exp. Mech. 29(3), 261–267 (1989). [CrossRef]  

45. H. Hou and H. Andrews, “Cubic splines for image interpolation and digital filtering,” IEEE Trans. Acoust., Speech, Signal Process. 26(6), 508–517 (1978). [CrossRef]  

46. B. Pan, H. Xie, Z. Guo, et al., “Full-field strain measurement using a two-dimensional Savitzky-Golay digital differentiator in digital image correlation,” Opt. Eng. 46(3), 033601 (2007). [CrossRef]

47. Z. Wu, W. Guo, and Q. Zhang, “Two-frequency phase-shifting method vs. Gray-coded-based method in dynamic fringe projection profilometry: A comparative review,” Opt. Lasers Eng. 153, 106995 (2022). [CrossRef]  

48. B. Pan, L. Yu, Y. Yang, et al., “Full-field transient 3D deformation measurement of 3D braided composite panels during ballistic impact using single-camera high-speed stereo-digital image correlation,” Compos. Struct. 157, 25–32 (2016). [CrossRef]  

Supplementary Material (2)

Visualization 1: Dynamic multi-dimensional measurement of the composite braided structure.
Visualization 2: Dynamic multi-dimensional measurement of the damping structure.

