Optica Publishing Group

Local orientation coherence based segmentation and boundary-aware diffusion for discontinuous fringe patterns

Open Access

Abstract

Fringe patterns with noise and discontinuity are often encountered but difficult to analyze, so discontinuity-detectable and boundary-aware processing techniques are needed. A local orientation coherence based fringe segmentation (LOCS) method and its cooperation with boundary-aware coherence enhancing diffusion (BCED) for discontinuous fringe pattern denoising are proposed in this paper. The LOCS method has three steps. First, since orientation coherence indicated by structure tensors is informative for describing fringe structures, it is selected for discontinuity recognition. Due to the complexity of the discontinuity problem, the detected boundary often has missing parts and limited accuracy. Boundary completion by cubic splines and boundary refinement based on partial structure tensors are therefore performed as the second and third steps, respectively. Subsequently, the BCED method is developed to adapt the original CED to fringe segments with irregular boundaries. Simulated and experimental fringe patterns are tested and successful results have been obtained.

© 2016 Optical Society of America

Corrections

3 August 2016: A correction was made to the author affiliations.

1. Introduction

Optical interferometric techniques are well-known for their non-contact, highly sensitive and full-field measurement capabilities, and are thus attractive in both research and engineering fields, such as mechanical engineering and material engineering [1]. Fringe patterns are common results of optical interferometric techniques, from which phases need to be retrieved. Phase retrieval from a single closed fringe pattern, especially with the presence of noise, either additive or speckle, is difficult [2]. Meanwhile, with the fast development of manufacturing techniques, a specimen to be tested often consists of multiple parts. Dealing with discontinuity becomes a demanding and urgent task [3–9].

The difficulty of discontinuity identification is worth highlighting. Discontinuity in a general image is usually caused by a sudden change of image intensity. In a fringe pattern such as the one in Fig. 1(a), the intensities on both sides of the discontinuity are waving (this waving is why the image is called a fringe pattern), and thus the intensities on the two sides of a discontinuity may be the same. This is not surprising because the discontinuity in fact results from phase rather than intensity. Current popular and powerful image segmentation techniques, such as graph cut [10] and level set [11] methods, cannot work well directly on fringe intensities. Texture segmentation methods [12] seem applicable to fringe patterns; however, texture is often structured and repeated, while a fringe pattern has irregular structures.


Fig. 1 LOCS method. (a) A noisy fringe pattern; (b) the histogram of λ2; (c) the discontinuous region; (d) the discontinuous boundary; (e) the complete boundary shown in white and the ground truth shown in red for comparison; (f) the refined boundary shown in white and the ground truth shown in red for comparison.


The analysis of fringe patterns with both noise and discontinuity problems is thus challenging. To deal with noise, various noise suppression techniques [13–15] and noise-robust phase retrieval and unwrapping techniques [16–18] have been developed, most of which, however, do not consider the discontinuity problem and are not directly applicable to discontinuous fringe patterns. Recently, discontinuity has attracted attention in fringe analysis. For denoising, windowed Fourier filtering (WFF) and block-matching and 3D filtering (BM3D) have been used to treat the continuous and discontinuous regions of a fringe pattern, respectively, which best utilizes the capabilities of WFF and BM3D [3]. However, with increasing noise level, the simple thresholding used to perform the segmentation soon becomes unreliable. For phase retrieval, a second-order robust regularization cost function [4] has been used to estimate the phase pixel by pixel by minimizing an edge-preserving regularized cost. The Hilbert-assisted wavelet transform [5] and the shearlet transform [6] have been used for better localization in the spatial domain to reduce the phase extraction error at phase-discontinuous points. For phase unwrapping, a half-quadratic cost function has been proposed that considers both convex and nonconvex cases [7]. A residue check principle and a Laplacian phase derivative variance quality map have been combined to unwrap the phase of discontinuous objects [8]. Snake-assisted quality-guided phase unwrapping [9] incorporates the GVF snake model to help the piecewise unwrapping process. Among the above works, some techniques do not take noise into account [4–6, 8], so the processing may be influenced or even fail under heavy noise. A clear boundary is not discussed except in [9], which however requires human interaction to obtain better results.

We aim to develop a method that solves both the noise and discontinuity problems in fringe pattern analysis. The method is expected to have the following features. First, it is generic, so that state-of-the-art denoising, phase extraction and phase unwrapping methods can fit into it. Second, denoising is selected as an instantiation to examine its feasibility, since denoising is usually the first and a critical step in fringe analysis. Third, the method is robust and accurate, making the fringe analysis reliable and automatic.

A local orientation coherence based fringe segmentation (LOCS) method and a boundary-aware coherence enhancing diffusion (BCED) method [13] for discontinuous fringe patterns are proposed. Recognizing that most discontinuities are formed by multiple pieces in the measurement field, we choose to segment these pieces as a straightforward and effective approach for general fringe analysis tasks. Because fringe structures are complicated but informative, orientation coherence indicated by local structure tensors is used for discontinuity recognition, followed by boundary completion and refinement for better accuracy. The LOCS method is presented in Sect. 2. Subsequently, BCED is developed by adapting the original coherence enhancing diffusion (CED) to make it boundary-aware, as presented in Sect. 3. The performance of the proposed LOCS and BCED methods is demonstrated and discussed in Sect. 4. The paper is concluded in Sect. 5.

2. The local orientation coherence based fringe segmentation (LOCS)

A fringe pattern has a special structural characteristic: a flow-like structure that is continuous and interrupted only by discontinuous boundaries. An orientation field is often used to indicate the flow, and a sudden change of orientation indicates a discontinuous boundary. To suppress the influence of noise, the orientation is usually estimated in a small block, within which a single orientation is assumed to exist. However, this assumption fails when the block crosses a boundary, resulting in unreliable orientation estimation and affecting the subsequent segmentation. Compared with orientation values, the orientation coherence in the block better reflects the structure through eigenvalue analysis.

Given a fringe pattern f(x,y), a structure tensor for a block centered at pixel (x, y) is constructed as

S(x,y) = \begin{bmatrix} \sum_{u,v} \omega(u,v) f_{x\sigma}^2(u,v) & \sum_{u,v} \omega(u,v) f_{x\sigma}(u,v) f_{y\sigma}(u,v) \\ \sum_{u,v} \omega(u,v) f_{x\sigma}(u,v) f_{y\sigma}(u,v) & \sum_{u,v} \omega(u,v) f_{y\sigma}^2(u,v) \end{bmatrix},
where (u, v) denote the coordinates of neighboring pixels of (x, y); ω(u,v) is a fixed window function which is set to a unit value; f_{xσ} and f_{yσ} are first-order derivatives of f after smoothing by a Gaussian kernel of size σ. The eigenvalues of this matrix, λ1 and λ2 (λ1 ≥ λ2), can be easily calculated [19]. If the fringe in the block is continuous, there is only one dominant orientation, i.e., λ1 > 0, λ2 ≈ 0; if the block includes a discontinuity, there are at least two dominant orientations, i.e., λ1 > λ2 > 0. Thus λ2 can be used for discontinuity detection as follows,
M(x,y) = \left[ \lambda_2(x,y) > thr \right],
where 1 and 0 in M (x, y) indicate discontinuity and continuity, respectively; thr is a threshold whose value is yet to be determined.
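As a concrete illustration of Eqs. (1) and (2), the λ2 map can be sketched with NumPy/SciPy as below. The parameter names (sigma, block) and their default values are assumptions of this example, not settings from the paper:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def second_eigenvalue_map(f, sigma=1.0, block=7):
    """Smaller eigenvalue lambda_2 of the local structure tensor, Eq. (1).

    f: 2D fringe pattern; sigma: Gaussian pre-smoothing scale;
    block: side of the unit-weight window omega (an assumed default).
    """
    fs = gaussian_filter(f, sigma)
    fy, fx = np.gradient(fs)                      # first-order derivatives
    # Window sums of the tensor entries (omega = 1 inside the block)
    Jxx = uniform_filter(fx * fx, block) * block ** 2
    Jxy = uniform_filter(fx * fy, block) * block ** 2
    Jyy = uniform_filter(fy * fy, block) * block ** 2
    # Closed-form eigenvalues of the symmetric 2x2 tensor
    tr = Jxx + Jyy
    disc = np.sqrt((Jxx - Jyy) ** 2 + 4.0 * Jxy ** 2)
    return 0.5 * (tr - disc)                      # lambda_2 (the smaller one)
```

In a block with one dominant orientation λ2 stays near zero; where two orientations meet it grows, which is what Eq. (2) thresholds.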

Automatic determination of the thr value is important and can be achieved by analyzing the histogram of λ2. As an example, for the fringe pattern with additive noise shown in Fig. 1(a), the histogram of λ2 is shown in Fig. 1(b). We express the histogram as a function h(i), where i is the bin index and N is the total number of bins. As observed from Fig. 1(b), there is a concentration of small values corresponding to the continuous pixels and a wide spread of large values corresponding to discontinuities. The threshold thr can thus be set at the downward turning bin with the largest variation rate, which can be expressed in terms of a second-order derivative

thr_0 = \arg\max_t \left[ \delta h_f(t) - \delta h_b(t) \right],
where δh_f(t) = h(t+1) − h(t) and δh_b(t) = h(t) − h(t−1) are the forward and backward differences, respectively. However, though the overall distribution of the histogram is clear, there are unavoidable deviations in individual bins. Thus h(t+1) is replaced by the average of the h values over t+1 ≤ i ≤ N, and h(t−1) by the average over 1 ≤ i ≤ t−1, to reflect the overall histogram distribution. The threshold thr0 works well, but some noisy pixels may still survive the thresholding. To increase the robustness, we create a series of 10 thresholds thr0 < thr1 < … < thr9 with an increment empirically set to 0.004·max(λ2) between consecutive thresholds. The largest threshold thr9 is applied first, and its result is taken as the true discontinuity. Then, starting from thr8, boundary pixels newly discovered by thr_j are accepted as part of the discontinuous region only if they are connected to the true discontinuity previously identified by thr_{j+1}. The final result contains discontinuous regions rather than discontinuous boundaries, as shown in Fig. 1(c); it is thinned by morphological thinning, as shown in Fig. 1(d).
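The threshold selection and the cascaded thresholding described above can be sketched as follows. The bin count nbins and the use of scipy.ndimage.label for the connectivity check are assumptions of this illustration, not the paper's implementation:

```python
import numpy as np
from scipy.ndimage import label

def auto_threshold(lam2, nbins=100):
    """Pick thr0 at the bin maximizing the averaged second difference,
    i.e. mean(h[t+1:]) - 2*h[t] + mean(h[:t])."""
    h, edges = np.histogram(lam2.ravel(), bins=nbins)
    h = h.astype(float)
    best_t, best_score = 1, -np.inf
    for t in range(1, nbins - 1):
        fwd = h[t + 1:].mean() - h[t]   # averaged forward difference
        bwd = h[t] - h[:t].mean()       # averaged backward difference
        if fwd - bwd > best_score:
            best_score, best_t = fwd - bwd, t
    return edges[best_t]

def cascade_mask(lam2, thr0, n=10):
    """Hysteresis-style cascade: seeds from thr9, grown through lower
    thresholds only where connected to already-accepted discontinuity."""
    thrs = thr0 + 0.004 * lam2.max() * np.arange(n)
    mask = lam2 > thrs[-1]              # thr9 result: true discontinuity
    for thr in thrs[-2::-1]:            # thr8 down to thr0
        lab, _ = label(lam2 > thr)      # connected components above thr_j
        keep = np.unique(lab[mask & (lab > 0)])
        mask = np.isin(lab, keep[keep > 0])
    return mask
```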

As highlighted by the blue circles in Fig. 1, when the fringes along the discontinuous boundaries have similar flow orientations, the discontinuities cannot be identified. To predict the missing curves, which are assumed to be smooth, interpolation using a cubic spline [20] based on the already estimated boundary pixels is proposed. Taking the x coordinates x(k) as an example, the cubic spline used to represent a boundary is written as

x(k) = \sum_{l=1}^{L} \varpi_l(k) P_{xl},
where k ∈ [0,1] is a variable indicating the location along the curve, with 0 for the start point and 1 for the end point; P_{xl} are the x coordinates of the control points; ϖ_l(k) is a known piecewise polynomial function [20]. The basic task is to find proper control points, which is achieved by fitting the curve in Eq. (4) to the already determined partial boundaries in the cubic Hermite form [20]. Denote all known boundary pixels in order as {(x_i, y_i) : i ∈ [0, m]}. Before fitting, the k value of a boundary pixel (x_i, y_i) is computed as k_i = \sum_{t=1}^{i} d_t / \sum_{t=1}^{m} d_t, where d_0 = 0 and d_i = [(x_i − x_{i−1})² + (y_i − y_{i−1})²]^{1/2}. After fitting, k is uniformly and densely sampled from 0 to 1 to form the whole spline curve. The same fitting technique is then applied to the y coordinates. The interpolated spline curve is shown in Fig. 1(e), which naturally divides the image into two segments: the inner one denoted as A and the outer one denoted as B. The boundary pixels of A are represented by the above spline curve, while the boundary pixels of B are estimated along the spline curve.
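The chord-length parameterization and dense resampling can be sketched with SciPy's interpolating CubicSpline. The paper instead fits control points of a spline in cubic Hermite form, so this is a simplified stand-in; it assumes the boundary pixels are ordered and distinct:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def complete_boundary(xs, ys, n_out=200):
    """Fit x(k), y(k) over chord-length parameters k_i in [0, 1] and
    resample k densely to obtain the whole boundary curve."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    d = np.hypot(np.diff(xs), np.diff(ys))        # chord lengths d_i
    k = np.concatenate(([0.0], np.cumsum(d))) / d.sum()
    sx, sy = CubicSpline(k, xs), CubicSpline(k, ys)
    kk = np.linspace(0.0, 1.0, n_out)             # dense, uniform sampling
    return sx(kk), sy(kk)
```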

Both the structure tensor analysis and the morphological thinning are, to some extent, qualitative. As a consequence, the boundary obtained so far is not optimal and a further improvement is required. To achieve this goal, the imperfect boundary pixels should be detected and then refined. We propose to use a partial structure tensor S_p(x,y), obtained by setting ω(u,v) in Eq. (1) to 0 if (u, v) and (x, y) belong to different segments. For true boundary pixels, the second eigenvalues λ_{2p} are near zero because the pixels involved in the partial structure tensor mainly belong to one segment; for pixels deviating from true boundaries, the second eigenvalues become larger because the pixels involved come from both segments. These two cases can be captured by re-evaluating the boundary mask M_p according to Eq. (2), based on λ_{2p} and thr9. M_p is then used to push/pull the control points to refine the spline curve. For every point x_A on the spline curve belonging to segment A and its nearest boundary point x_B belonging to segment B, we push/pull the control points as

P_{xl}' = P_{xl} + 0.15\,\mathrm{sign}(x_A - x_B)\left[ M_p(x_A, y_A) - M_p(x_B, y_B) \right] \varpi_l(k_{x_A}),
where ϖ_l(k_{x_A}) is the polynomial function evaluated at point x_A as in Eq. (4); sign(·) is 1 for positive input and −1 for negative input; sign(x_A − x_B) indicates the push/pull direction; the factor 0.15 is chosen so that the adjustment is at most 1 boundary pixel. P_{yl} are adjusted similarly. A new boundary is then generated. This refinement is repeated until the control points no longer change, at which point the boundary is stabilized and optimized. The refined result of Fig. 1(e) is shown in Fig. 1(f), with the most significant refinements indicated by the arrows. The LOCS method is generally insensitive to noise; if the noise is extremely heavy, however, denoising using a method such as CED is performed beforehand.

3. The boundary-aware coherence enhancing diffusion (BCED)

After segmentation, the fringe pattern is divided into segments and denoising can be performed within each segment. The denoising algorithm of particular interest is CED due to its excellent performance [13]. However, CED does not perform well for a segment with irregular boundaries and thus needs to be modified. CED is an iterative, oriented denoising method, expressed as

f(x,y;t+1) = f(x,y;t) + \varepsilon \times f_t(x,y;t), \qquad f(x,y;0) = f_0(x,y),
where f(x,y;t) denotes the intensity of the fringe pattern at pixel (x,y) and iteration t; ε is a positive constant; the symbol × denotes multiplication; f_0(x,y) is the original noisy fringe pattern; f_t(x,y;t) is defined as
f_t = \nabla \cdot (D \nabla f)
with
D = \chi_1 \begin{bmatrix} \sin^2\theta & -\sin\theta\cos\theta \\ -\sin\theta\cos\theta & \cos^2\theta \end{bmatrix} + \chi_2 \begin{bmatrix} \cos^2\theta & \sin\theta\cos\theta \\ \sin\theta\cos\theta & \sin^2\theta \end{bmatrix},
where ∇· is the divergence operator; ∇ is the gradient operator; θ ∈ (−π/2, π/2] is the fringe orientation; D is a diffusion tensor which controls both the orientations and the strengths of the diffusion. By changing the values of χ1 and χ2, the diffusion strengths in the directions perpendicular and parallel to the fringe orientation are adjusted, respectively.
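Eq. (8) expresses D through its eigenvectors perpendicular and parallel to the fringe orientation. A direct construction might look like the following sketch, which takes the parameter values used later in the paper (χ1 = 0.005, χ2 = 1) as assumed defaults:

```python
import numpy as np

def diffusion_tensor(theta, chi1=0.005, chi2=1.0):
    """D = chi1 * v_perp v_perp^T + chi2 * v_par v_par^T, where
    v_par = (cos(theta), sin(theta)) lies along the fringe orientation."""
    s, c = np.sin(theta), np.cos(theta)
    Dperp = np.array([[s * s, -s * c], [-s * c, c * c]])  # across-fringe part
    Dpar = np.array([[c * c, s * c], [s * c, s * s]])     # along-fringe part
    return chi1 * Dperp + chi2 * Dpar
```

The eigenstructure can be checked directly: D scales the along-fringe direction by χ2 and the perpendicular direction by χ1.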

The discretization scheme of Eq. (7) can be represented as [21]

f_t(x,y) = \sum_{i=-1}^{1} \sum_{j=-1}^{1} a(i,j) f(x+i, y+j),
where a(i,j) can be computed according to Eqs. (7) and (8) and are thus known. It can be seen from Eq. (9) that, to perform CED on a pixel, its eight surrounding neighbors must exist, which is unfortunately not true for boundary pixels of a segment. An oriented boundary padding method has been used to solve the boundary problem of iterative processing [22], but it is time consuming due to matrix inverse operations.

CED diffuses along two fringe orientations, θ and θ+π/2, as indicated in Fig. 2(a), which means that the diffusion of CED can be equivalently divided into four directions, ϑ, ϑ+π/2, ϑ+π and ϑ+3π/2, with ϑ ∈ [0, π/2), as shown in Fig. 2(b). Taking the ϑ direction as an example, we zero all coefficients in Eq. (9) except for the relevant ones in the first quadrant, a(0,1), a(1,0) and a(1,1), and modify Eq. (9) into

f_t^{(\vartheta)}(x,y) = a(1,0)\left[f(x+1,y) - f(x,y)\right] + a(0,1)\left[f(x,y+1) - f(x,y)\right] + 2a(1,1)\left[f(x+1,y+1) - f(x,y)\right].
The other directions can be estimated similarly.


Fig. 2 BCED principle. (a) Fringe orientations; (b) fringe directions.


Since the amount of diffusion along the perpendicular orientation is very small, to diffuse a boundary pixel, only the direction among the four that is along the fringe and has the required data available is chosen. The summation of the four directions is twice f_t in Eq. (9), so that the amount of diffusion from one side of the boundary is the same as for non-boundary pixels. BCED works by executing Eq. (6), where f_t is determined by Eq. (9) for non-boundary pixels and by Eq. (10), or similar modifications depending on the boundary structure, for boundary pixels. BCED together with LOCS can be applied to denoise discontinuous fringe patterns. As shown in Fig. 3, the boundary in the BCED denoising result is well preserved compared to the CED result.
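The quadrant-wise selection for a boundary pixel can be sketched as follows. The stencil coefficients a(i,j) are taken as given (here a plain dict), and picking the available quadrant with the largest diffusion amount is a simplified stand-in for the paper's along-fringe selection rule:

```python
import numpy as np

# Neighbor offsets (dx, dy) needed by each of the four diffusion
# directions (one per quadrant), as in Eq. (10) and its rotations.
QUADRANTS = [
    [(1, 0), (0, 1), (1, 1)],        # toward theta
    [(-1, 0), (0, 1), (-1, 1)],      # theta + pi/2
    [(-1, 0), (0, -1), (-1, -1)],    # theta + pi
    [(1, 0), (0, -1), (1, -1)],      # theta + 3*pi/2
]

def boundary_ft(f, inside, a, x, y):
    """Directional diffusion increment at boundary pixel (x, y).

    inside: boolean mask of the pixel's segment; a: dict (dx, dy) -> a(i, j).
    Quadrants whose data would cross the boundary are skipped; among the
    rest, the one with the largest diffusion amount is used.
    """
    best = 0.0
    for offs in QUADRANTS:
        if not all(inside[y + dy, x + dx] for dx, dy in offs):
            continue                                  # required data missing
        ft = 0.0
        for dx, dy in offs:
            w = 2.0 if dx != 0 and dy != 0 else 1.0   # diagonal doubled, Eq. (10)
            ft += w * a.get((dx, dy), 0.0) * (f[y + dy, x + dx] - f[y, x])
        if abs(ft) > abs(best):
            best = ft
    return best
```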


Fig. 3 Denoising results of Fig. 1(a). (a) CED denoising result; (b) BCED denoising result.


4. Results and discussions

A fringe pattern (256 × 256) containing both linear and circular fringe segments is simulated with an irregular circular boundary, as shown in Fig. 4(a). The amplitude of the simulated fringe pattern is set to [−1, 1]. Additive noise with a mean of 0 and standard deviation (STD) values of 0, 0.5, 1.0 and 1.5, as well as speckle noise, is then added; the latter two cases are shown in Figs. 4(b) and 4(c), respectively. For STDs larger than 0.5 and for speckle noise, CED needs to be performed prior to segmentation. Figure 5 shows the segmentation results of Fig. 4, with white lines as the identified boundaries and red lines as the ground truth. The results are satisfactory. Quantitatively, for fringe patterns with additive noise with STDs of 0, 0.5, 1.0, 1.5 and with speckle noise, the mean absolute errors are 0.40, 0.93, 0.95, 0.99 and 0.99 pixels, respectively, and the largest errors are 3, 5, 6, 6 and 6 pixels, respectively.


Fig. 4 Simulated fringe patterns. (a) Noiseless; (b) additive noise (STD 1.5); (c) speckle noise.



Fig. 5 Segmentation results. (a) Noiseless; (b) additive noise (STD 1.5); (c) speckle noise.


Another fringe pattern (256 × 256) containing both linear and circular fringe segments is simulated with a regular linear boundary, as shown in Fig. 6(a). The same levels of additive noise and speckle noise as in the previous example are added; the fringe patterns with STD = 1.5 and with speckle noise are shown in Figs. 6(b) and 6(c), respectively. Figure 7 shows the segmentation results of Fig. 6, with white lines as the identified boundaries and red lines as the ground truth, which are consistent with the examples in Figs. 4 and 5. Quantitatively, for fringe patterns with additive noise with STDs of 0, 0.5, 1.0, 1.5 and with speckle noise, the mean absolute errors are 0.30, 0.46, 0.80, 1.08 and 1.01 pixels, respectively, and the largest errors are 1, 4, 5, 5 and 6 pixels, respectively.


Fig. 6 Simulated fringe patterns. (a) Noiseless; (b) additive noise (STD 1.5); (c) speckle noise.



Fig. 7 Segmentation results. (a) Noiseless; (b) additive noise (STD 1.5); (c) speckle noise.


The same fringe patterns with additive or speckle noise in Figs. 4 and 6 are then denoised using BCED and BM3D, respectively. BM3D is chosen for comparison due to its non-local filtering strategy and its capability to preserve discontinuities. It groups similar 2D image fragments into 3D data arrays and applies collaborative filtering to these 3D groups. By attenuating the noise, the collaborative filtering reveals details shared by the grouped blocks and at the same time preserves the essential unique features of each individual block [3, 23]. The parameters of BCED are set as ε = 0.3, χ1 = 0.005 and χ2 = 1 for all examples. For fringe patterns with additive noise with STDs of 0.5, 1.0, 1.5 and with speckle noise, 100, 150, 200 and 200 BCED iterations are performed, respectively. The results for Figs. 4 and 6 with the proposed BCED and the BM3D methods are shown in Figs. 8 and 9, respectively. BCED produces smoother results. The mean absolute errors (MAE) between the denoising results, after linear scaling into [−1, 1], and the ground truth are listed in Table 1. Compared with BM3D, BCED has similar performance when the noise is light and better performance when the noise is severe. For the LOCS method, the computation time using MATLAB on a Dell Precision Tower 7910 with an Intel Xeon CPU E5-2630 @ 2.4 GHz and 32.0 GB RAM is around 0.5 s for all examples. For the BCED method, the computation time is proportional to the iteration number, about 6 s for 100 iterations.


Fig. 8 Denoising results. (a) Result of Fig. 4(b) with BCED; (b) result of Fig. 4(b) with BM3D; (c) result of Fig. 4(c) with BCED; (d) result of Fig. 4(c) with BM3D.



Fig. 9 Denoising results. (a) Result of Fig. 6(b) with BCED; (b) result of Fig. 6(b) with BM3D; (c) result of Fig. 6(c) with BCED; (d) result of Fig. 6(c) with BM3D.



Table 1. Mean absolute errors of denoising results

Experimental fringe patterns are tested to verify the proposed LOCS and BCED methods. A fringe pattern from fringe projection profilometry is shown in Fig. 10(a), where similar fringe patterns with a slight displacement are present. Another fringe pattern, from a phase-shifting electronic speckle pattern shearing interferometer, is shown in Fig. 11(a), where about half of the fringe pattern is occupied by random values. The identified discontinuous boundaries of Figs. 10(a) and 11(a) are shown in Figs. 10(b) and 11(b), respectively. The denoising results are shown in Figs. 10(c) and 11(c), where no blurring is introduced in the boundary regions. The experimental results are consistent with the simulated results.


Fig. 10 Results of an experimental fringe pattern from fringe projection profilometry. (a) An experimental fringe pattern; (b) LOCS result; (c) BCED result.



Fig. 11 Results of an experimental fringe pattern from phase-shifting electronic speckle pattern shearing interferometer. (a) An experimental fringe pattern; (b) LOCS result; (c) BCED result.


5. Conclusion

In this paper, a local orientation coherence based segmentation (LOCS) method to identify discontinuous boundaries, and a boundary-aware coherence enhancing diffusion (BCED) method that adapts CED to denoise fringe segments with irregular boundaries, are proposed. Both LOCS and BCED work efficiently and automatically even for very noisy fringe patterns. Furthermore, this segmentation-based approach provides a general framework into which other phase retrieval and phase unwrapping techniques can be incorporated. Both simulated and experimental results verify the proposed methods.

6. Funding

National Natural Science Foundation of China (NSFC) (61527808), Natural Science Foundation of Zhejiang Province (LY14F020014); Singapore Academic Research Fund Tier 1 (RG28/15).

References and Links

1. D. W. Robinson and G. T. Reid, Interferogram Analysis: Digital Fringe Pattern Measurement Techniques (Institute of Physics, 1993).

2. Q. Kemao, Windowed Fringe Pattern Analysis (SPIE, 2013).

3. M. Zhao and K. Qian, “WFF-BM3D: a hybrid denoising scheme for fringe patterns,” Proc. SPIE 9524, 952423 (2015). [CrossRef]  

4. C. Galvan and M. Rivera, “Second-order robust regularization cost function for detecting and reconstructing phase discontinuities,” Appl. Opt. 45(2), 353–359 (2006). [CrossRef]   [PubMed]  

5. S. Li, X. Su, and W. Chen, “Hilbert assisted wavelet transform method of optical fringe pattern phase reconstruction for optical profilometry and interferometry,” Optik (Stuttg.) 123(1), 6–10 (2012). [CrossRef]  

6. B. Li, C. Tang, X. Zhu, Y. Su, and W. Xu, “Shearlet transform for phase extraction in fringe projection profilometry with edges discontinuity,” Opt. Lasers Eng. 78, 91–98 (2016). [CrossRef]  

7. M. Rivera and J. L. Marroquin, “Half-quadratic cost functions for phase unwrapping,” Opt. Lett. 29(5), 504–506 (2004). [CrossRef]   [PubMed]  

8. H. Cui, W. Liao, N. Dai, and X. Cheng, “Reliability-guided phase-unwrapping algorithm for the measurement of discontinuous three-dimensional objects,” Opt. Eng. 50(6), 063602 (2011). [CrossRef]  

9. M. Zhao, H. Wang, and Q. Kemao, “Snake-assisted quality-guided phase unwrapping for discontinuous phase fields,” Appl. Opt. 54(24), 7462–7470 (2015). [CrossRef]   [PubMed]  

10. W. Ju, D. Xiang, B. Zhang, L. Wang, I. Kopriva, and X. Chen, “Random walk and graph cut for co-segmentation of lung tumor on PET-CT images,” IEEE Trans. Image Process. 24(12), 5854–5867 (2015). [CrossRef]   [PubMed]  

11. V. Estellers, D. Zosso, R. Lai, S. Osher, J.-P. Thiran, and X. Bresson, “Efficient algorithm for level set method preserving distance function,” IEEE Trans. Image Process. 21(12), 4722–4734 (2012). [CrossRef]   [PubMed]  

12. J. Chen, G. Zhao, M. Salo, E. Rahtu, and M. Pietikäinen, “Automatic dynamic texture segmentation using local descriptors and optical flow,” IEEE Trans. Image Process. 22(1), 326–339 (2013). [CrossRef]   [PubMed]  

13. H. Wang, Q. Kemao, W. Gao, F. Lin, and H. S. Seah, “Fringe pattern denoising using coherence-enhancing diffusion,” Opt. Lett. 34(8), 1141–1143 (2009). [CrossRef]   [PubMed]  

14. C. Tang, L. Wang, H. Yan, and C. Li, “Comparison on performance of some representative and recent filtering methods in electronic speckle pattern interferometry,” Opt. Lasers Eng. 50(8), 1036–1051 (2012). [CrossRef]  

15. S. Fu and C. Zhang, “Fringe pattern denoising via image decomposition,” Opt. Lett. 37(3), 422–424 (2012). [CrossRef]   [PubMed]  

16. J. C. Estrada, M. Servin, and J. A. Quiroga, “Noise robust linear dynamic system for phase unwrapping and smoothing,” Opt. Express 19(6), 5126–5133 (2011). [CrossRef]   [PubMed]  

17. I. Iglesias, “Phase estimation from digital holograms without unwrapping,” Opt. Express 22(18), 21340–21346 (2014). [CrossRef]   [PubMed]  

18. Z. Cheng, D. Liu, Y. Yang, T. Ling, X. Chen, L. Zhang, J. Bai, Y. Shen, L. Miao, and W. Huang, “Practical phase unwrapping of interferometric fringes based on unscented Kalman filter technique,” Opt. Express 23(25), 32337–32349 (2015). [CrossRef]   [PubMed]  

19. B. Jahne, Spatio-Temporal Image Processing: Theory and Scientific Applications (Springer-Verlag, 1993).

20. L. Piegl and W. Tiller, The NURBS Book (Springer-Verlag, 1997).

21. J. Weickert, Anisotropic Diffusion in Image Processing (Teubner-Verlag, 1998).

22. H. Wang, Q. Kemao, R. Liang, H. Wang, and X. He, “Oriented boundary padding for iterative and oriented fringe pattern denoising techniques,” Signal Process. 102, 112–121 (2014). [CrossRef]  

23. K. Dabov, A. Foi, V. Katkovnik, and K. Egiazarian, “Image denoising by sparse 3-D transform-domain collaborative filtering,” IEEE Trans. Image Process. 16(8), 2080–2095 (2007). [CrossRef]   [PubMed]  
