
Superior techniques for eliminating ring artifacts in X-ray micro-tomography


Abstract

Synchrotron-based X-ray micro-tomography systems often suffer from severe ring artifacts in reconstructed images. In sinograms these artifacts appear as straight lines, i.e. stripe artifacts. They are caused by the irregular response of the detecting system, giving rise to a variety of observed stripe types: full stripes, partial stripes, fluctuating stripes, and unresponsive stripes. The use of pre-processing techniques such as distortion correction or phase retrieval blurs and enlarges these stripes. It is impossible for a single approach to remove all types of stripe artifacts. Here, we propose three techniques for tackling all of them. The proposed techniques are easy to implement; do not generate extra stripe artifacts or void-center artifacts; and give superior quality on challenging data sets in comparison with other techniques. Implementations in Python and a challenging data set are available for download.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

In a parallel-beam tomography system, 2D projections (images) of a sample, formed by the penetration of parallel X-rays, are recorded at different angles over 180° using a 2D detector. Combining the same detector row from all projections in angular order forms a sinogram. Applying a reconstruction method [1] to this sinogram yields a reconstructed 2D slice of the sample. Any defects in the recording system that cannot be removed by the flat-field correction technique [2] result in stripe artifacts in the sinogram (Fig. 1(a)), which appear as ring artifacts in the reconstructed image (Fig. 1(b)). A detector usually consists of an X-ray fluorescent scintillator, a visible-light optics system, and a digital camera chip. Defects can come from different parts of the detector, including the non-linear response of the sensor chip, dust on the optics components (lenses, mirrors, etc.), and especially the scintillator, which is the major contributor. It is very challenging to fabricate scintillators with high quality, high resolution, and sufficient radiation tolerance for use at synchrotron facilities. A high X-ray flux sustained over a long period can damage the micro-structure of the scintillator and release particles from the surface of optics components (mirrors, anodized layers, etc.), which can be deposited on the surface of the scintillator and degrade its quality over time, as shown in Fig. 2. Although frequent replacement and effective cleaning are feasible, damage can still occur during a time-resolved experiment that cannot be interrupted.

Fig. 1 Impact of defects in a detector system: (a) Stripe artifacts in the sinogram; (b) Ring artifacts in the reconstructed image.

Fig. 2 Degradation of a scintillator during an experiment: (a) At the beginning of the experiment; (b) After a few days of use.

A number of methods for removing stripe artifacts have been proposed. However, the large data volumes collected within the limited beam time available at synchrotron facilities dictate that only a few approaches with low computational cost are used in tomographic software packages [3–8]. These approaches can be classified into two categories: real-space methods and Fourier-space methods. The normalization-based methods [9] and regularization-based methods [10], both working in real space, are easy to use but only suitable for suppressing certain types of stripes. The FFT-based methods [11] and wavelet-FFT-based methods [12] can handle more types of stripes, but are difficult to use, with many parameters to adjust. Moreover, they can significantly degrade the image quality if the parameters are chosen incorrectly. All of the mentioned methods share two disadvantages: they produce extra stripe artifacts if there are high-frequency edges in the sinogram image, and increasing their strength may erase the center area of the reconstructed image (void-center artifacts). Other post-processing methods [13], which work on the reconstructed image, are not considered here because they belong to a different class of methods and may not be suitable for removing ring artifacts that produce streak artifacts (Fig. 1(b)).

It is impossible for a single approach to remove all types of stripe artifacts. By investigating the brightness profiles of stripes we classify them into four types: partial stripes, full stripes, fluctuating stripes, and unresponsive stripes. They require different ways of cleaning, in which large stripes need a separate treatment to reduce the side effects of the cleaning methods. In this report, we propose three techniques to work on distinctly different types of stripe artifacts: one tackles the small-to-medium partial and full stripes; one removes large stripes; and one removes unresponsive and fluctuating stripes. In the first approach, stripe artifacts are removed based on the idea of equalizing the intensity profiles of neighboring pixels. We introduce different versions of this approach depending on the way of retrieving the underlying response curves. In the second approach, large stripes are located first, and then a strong removal tool is applied; however, only the information inside the stripes is replaced by the corrected sinogram, which reduces the side effects of the strong removal tool. In this approach we introduce a reliable method of detecting stripes based on the combination of four basic processing tools: normalizing, sorting, fitting, and thresholding. In the last approach, unresponsive and fluctuating stripes are detected in a similar way to the second approach, and the correction is then done by interpolation. Details of each approach are presented in section 2, together with a subsection classifying the different types of stripe artifacts. Section 3 shows the performance of our proposed methods on challenging data and in comparison with other methods. All data were collected over time at the I12-JEEP beamline, Diamond Light Source, using 53 keV X-rays, 1800 projections, a 3.2 µm pixel size, and a 1000 mm sample-detector distance [14]. They were processed using I12 in-house Python codes [15–17] in which the flat-field correction was applied before stripe artifacts were removed. The filtered back-projection (FBP) method [18,19] was used for reconstruction.

2. Method

2.1 Classifying types of stripe artifacts

By comparing the intensity profile of a defective pixel with that of an adjacent good pixel, we classify stripe artifacts into different types, as described in Table 1. Here the intensity profile refers to the plot of the measured intensities against projection angle, i.e. the grayscale profile of a column of the sinogram. Figure 3 shows the characteristic intensity profiles of the different types of stripes, which are extracted from the sinograms in Fig. 1(a) and Fig. 4.

Table 1. Classification of stripes

Fig. 3 Characteristic intensity profile of a defective pixel (red color) in comparison with an adjacent good pixel (blue color) for distinguishing the type of a stripe: (a) Unresponsive stripe; (b) Full stripe; (c) Partial stripe; (d) Fluctuating stripe.

Fig. 4 (a) Visual demonstration of the S2, S3, and S4 stripes. (b) Magnified view from the blue frame in (a) showing the fluctuating stripe. (c), (d), and (e) Ring artifacts caused by the S2, S3, and S4 stripes, respectively.

The S1 stripes may come from dead pixels of the sensor chip, light-blocking dust (dark blobs in Fig. 2(a)), or damaged areas of the scintillator (bright blobs in Figs. 2(a) and (b)). They do not respond to the variation of the incoming X-ray intensity, giving rise to stripes of constant brightness in the sinogram, as clearly visible in Fig. 1(a). The missing information in these stripes impacts the reconstructed image: the constant intensity gives rise to a prominent half-ring (in 180-degree tomography), and the high-frequency edges of the stripes are enhanced by the ramp filter in the reconstruction process [19] and cause severe streak artifacts (Fig. 1(b)).

The S2, S3 and S4 stripes (Figs. 4(a) and (b)) originate from partially defective regions of the detector and are difficult to detect in the radiographs. Still, they yield ring artifacts in the reconstruction (Figs. 4(c)-4(e)) that hamper the analysis of the images. Figures 5(a) and (d) show a flat-field image obtained using a scintillator that has been significantly degraded after a long period of use. Defective regions, called blobs, are clearly visible. The flat-field correction of a single projection of a high-contrast object appears visually to be free of these blobs, as seen in Figs. 5(b) and (e). However, a deeper analysis shows that this correction, which assumes a simple linear relationship between images with and without the sample, is insufficient. Here we implemented a simple technique to analyze the response of each pixel, which helps to reveal a hidden map of defective regions of the detector. We acquired projections of an X-ray absorber at different thicknesses by using a glass plate rotated in the range of [0°, 90°], then applied a linear fit to the logarithm of the measured intensities against thickness for all pixels (following the Beer-Lambert law [20]). Analyzing the deviation from the linear fit of each pixel provides a map of defective pixels, as shown in Figs. 5(c) and (f). For a good detector, one expects to obtain a uniform map.
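The per-pixel linear fit can be sketched in a few lines. The following is a minimal illustration under simplifying assumptions, not the exact beamline code: it assumes a stack of normalized projections of the absorber recorded at known effective thicknesses, and the function name and arguments are hypothetical.

```python
import numpy as np

def defect_map_from_absorber_scan(projections, thicknesses):
    """Fit -log(I) against absorber thickness for every pixel (Beer-Lambert law)
    and return the intercept map and the fit-residual map.

    projections : (n, height, width) array of normalized intensities.
    thicknesses : (n,) array of effective absorber thicknesses.
    """
    n, height, width = projections.shape
    thicknesses = np.asarray(thicknesses, dtype=float)
    # Beer-Lambert law: -log(I/I0) = mu * t, i.e. a straight line in t.
    absorption = -np.log(np.clip(projections, 1.0e-6, None)).reshape(n, -1)
    design = np.vstack([thicknesses, np.ones(n)]).T          # (n, 2)
    coeffs, _, _, _ = np.linalg.lstsq(design, absorption, rcond=None)
    slope, intercept = coeffs                                 # each (height*width,)
    residual = np.linalg.norm(absorption - design @ coeffs, axis=0)
    # A good pixel follows the fit closely; defective pixels stand out in the
    # intercept map (as in Fig. 5(c)) or in the residual map.
    return intercept.reshape(height, width), residual.reshape(height, width)
```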

Fig. 5 Demonstration of the difference between the flat-field image and the defective pixel map. (a) Flat-field image; (b) Flat-field correction of a sample projection; (c) Defective map showing the intercept values of the linear fit; (d) Magnified view from the rectangular box in (a); (e) Magnified view from the rectangular box in (b); (f) Magnified view from the rectangular box in (c) showing defective regions not visible in (d).

This simple technique is useful for characterizing a detector system. Unfortunately, the obtained defective map cannot be used directly to correct the irregular response of each pixel. In practice, the responses of defective pixels depend on the dynamic range of the received intensities and on the influence of adjacent areas (i.e. light scattering in the scintillator), which are dictated by the shape and absorption characteristics of a sample. Figure 6 shows tomographic slices of two samples, scanned under identical conditions and at the same row of the detector, where the first sample (Fig. 6(a)) produces a much lower dynamic range of transmitted intensities than the second sample (Fig. 6(b)). As can be seen, there are ring artifacts in the second sample which do not occur in the first one. The presence of the extremely high-absorption object in the second sample creates very dark areas on the scintillator. At certain angles, the dark X-ray shadow lies over the scintillator defects, and the difference between the defective response and the normal response becomes significant. Thus the ring artifacts are enhanced over that range of angles, giving rise to partial rings.

Fig. 6 Occurrence of the ring artifacts depends on the dynamic range of the incoming intensities. (a) Reconstructed image from the sample giving a low dynamic range of transmitted intensities; (b) Reconstructed image of the same detector row from the sample giving a high dynamic range of transmitted intensities, which shows the partial rings (arrowed).

2.2 Removing small-to-medium partial stripes and full stripes

From the characteristic intensity profiles of the S2 and S3 stripes, we found that the difference in the low-frequency components of the intensity profiles between adjacent pixels is the main cause of type S2 and S3 stripe artifacts. Exploiting this difference to remove stripes is the key idea of our approaches. To equalize the responses of adjacent pixels we could apply some type of smoothing filter along the horizontal direction of the sinogram. This approach, however, introduces void-center artifacts and blurs the reconstructed images. To avoid this problem, we need to extract the underlying response curve of each pixel, which reveals the map of differences, before a smoothing filter or a correction method is applied. Table 2 presents three approaches based on three different ways of retrieving the underlying response curves of the sinogram.

Table 2. Algorithms for removing stripes of type S2 and S3

Although their implementations are different, the underlying ideas of algorithms 1 and 2 are similar: they both equalize the low-pass components of the intensity profiles of adjacent pixels. The choice of the polynomial order in step 1 of algorithm 1, and of the size and type of window in step 1 of algorithm 2, is flexible and depends on the complexity of the sinogram features. We found that algorithm 1 should be used for a sinogram having a low dynamic range of intensities, i.e. one whose low-pass components can be represented by low-order polynomial fits (order 1-5). The strength of the smoothing filter in step 2 is controlled as a trade-off between removing stripe artifacts and avoiding extra artifacts. There are a number of possible choices for the parameters and filters used in algorithms 1 and 2, so we do not go into the details of these choices; readers can refer to our implementation using the link provided in section 4.
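As a concrete illustration, the sketch below captures the shared equalization idea of algorithms 1 and 2; it is not the published implementation, and the function name and default parameters are ours. The low-pass component of each column is retrieved with a polynomial fit along the angular direction, in the flavor of algorithm 1; replacing that step with a moving-window filter would give the flavor of algorithm 2.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def remove_stripe_equalization(sinogram, poly_order=3, smooth_sigma=5):
    """Rough sketch: equalize the low-pass components of the intensity
    profiles (columns) of a sinogram, as in algorithms 1 and 2."""
    nrow, ncol = sinogram.shape
    angles = np.arange(nrow, dtype=float)
    # Step 1: low-pass component of every column via a polynomial fit along
    # the angular direction (one fit per column, done in a single call).
    coeffs = np.polynomial.polynomial.polyfit(angles, sinogram, poly_order)
    low_pass = np.polynomial.polynomial.polyval(angles, coeffs).T   # (nrow, ncol)
    # Step 2: equalize adjacent pixels by smoothing the low-pass components
    # along the horizontal (pixel) direction only.
    equalized = gaussian_filter1d(low_pass, smooth_sigma, axis=1)
    # Step 3: rescale each column so its low-pass component matches the
    # equalized one, preserving the high-frequency sample features.
    return sinogram * equalized / np.clip(low_pass, 1.0e-6, None)
```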

To our surprise, algorithm 3 is the simplest approach yet very efficient. It works particularly well on partial stripes, which the other published methods mentioned above struggle to deal with. Figure 7 illustrates the use of algorithm 3 on a sinogram, which results in a clean reconstructed image as shown in Fig. 8. The effectiveness of algorithm 3 can be understood by considering the following approximating assumptions: adjacent areas of the detector system sample the same range of incoming flux because the sample has continuous variation in intensity, and the irregular areas are small compared with the representative level of detail in the sample. If the detector system were ideal, the total range of brightness observed at neighboring pixels would be the same. This means that if the measured intensities are sorted in order of brightness, we should expect the same distribution in neighboring pixels. Under the assumption of small irregular areas, we can use the brightness-sorted values to identify, compare, and correct the irregular areas. Applying a smoothing filter is the easiest way of correction, but the presence of sharp edges in the sample may yield artifacts (bright partial stripes in Fig. 7(e)). These extra artifacts cause streak artifacts in the reconstructed image; however, they are much less problematic for further processing than the artifacts generated by other ring suppression methods, as will be shown in section 3.
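The following is a minimal sketch of the sorting idea behind algorithm 3 (sort each column, smooth horizontally, un-sort); the tuned implementation is available via the link in section 4, and the function name and default window size here are illustrative.

```python
import numpy as np
from scipy.ndimage import median_filter

def remove_stripe_based_sorting(sinogram, size=31):
    """Sketch of algorithm 3: sort the intensities of each column, smooth the
    sorted sinogram along the horizontal direction, then restore the original
    order of each column."""
    nrow, ncol = sinogram.shape
    # Indices that sort every column by brightness (along the angular axis).
    sort_idx = np.argsort(sinogram, axis=0)
    cols = np.arange(ncol)[np.newaxis, :]
    sino_sorted = sinogram[sort_idx, cols]
    # Equalize neighboring pixels by smoothing across columns only.
    sino_smoothed = median_filter(sino_sorted, size=(1, size))
    # Scatter the smoothed values back to their original angular positions.
    sino_corrected = np.empty_like(sinogram)
    sino_corrected[sort_idx, cols] = sino_smoothed
    return sino_corrected
```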

Fig. 7 Demonstration of algorithm 3 in which the image contrast is inverted to improve the visibility of stripes. (a) Original sinogram; (b) Sorted sinogram; (c) Smoothed sorted sinogram using the median filter; (d) Corrected sinogram; (e) Difference between (a) and (d).

Fig. 8 Comparison of reconstructed images before and after using algorithm 3: (a) Reconstructed image from the sinogram in Fig. 7(a); (b) Reconstructed image from the corrected sinogram in Fig. 7(d); (c) Magnified view from the red frame in (a); (d) Magnified view from the red frame in (b).

Algorithm 3 is interesting not only for correcting stripe artifacts by itself but also as a component of other algorithms; for example, it can replace step 2 of algorithm 1 or algorithm 2. Due to the risk of introducing extra artifacts caused by the smoothing filter, the algorithms in Table 2 should be used for removing the small-to-medium stripes. Treatment for large stripes is presented in the next section, where algorithm 3 also plays a crucial role.

2.3 Removing the large stripes

The above algorithms are able to correct small or medium defects without significantly affecting other areas. In principle they could be applied selectively, only on defective pixels, but in practice the benefit is not detectable when a low-strength smoothing filter is used. For large defects (wider than 20 pixels), however, applying a stronger filter everywhere degrades the final image. Furthermore, the large defects are few in number. In this case, selective application is beneficial but requires an extra step of defect detection. Large defects which cause type S2 and S3 stripes are not always visible in the radiograph and may come from the areas adjacent to the damaged scintillator which receive extra scattered light, i.e. the so-called halo effect, as shown in Fig. 9. To detect them efficiently from the sinogram, we introduce a segmentation algorithm working on a 1D array to separate positive defects (brighter than the background) and negative defects (darker than the background) from the background, as presented in Table 3 and shown in Fig. 10.

Fig. 9 Causes and impact of large stripes. (a) Large defects (vertical arrow) and the areas around the over-exposed blob (horizontal arrows) cause large stripes in the sinogram; (b) Large stripes in the sinogram (arrowed); (c) Large ring artifacts in the reconstructed image come from the large stripes in (b).

Table 3. Steps of the algorithm for detecting defects (called algorithm 4)

Fig. 10 Demonstration of algorithm 4. (a) Normalized 1D array, i.e. non-uniform background is corrected. (b) Sorted array and fitted array using the middle part of the sorted array.

From steps 1 and 2 of algorithm 4 the following values are known: I0 and I1 are the minimum and maximum values of the sorted array; F0 and F1 are the fitted values at the first and last index of the array, as shown in Fig. 10(b). Combining these values with a user-input value, R, the lower threshold (TL) and upper threshold (TU) are calculated as follows:

If (F0 − I0)/(F1 − F0) > R,

TL = F0 − (F1 − F0) × R/2

If (I1 − F1)/(F1 − F0) > R,

TU = F1 + (F1 − F0) × R/2

R can be understood as the ratio between a defective value and the background value, which users can modify to adjust the sensitivity of the algorithm. A reasonable choice of R is around 3.0 or above. Algorithm 4 can also be used as a binarization method for other applications. It is convenient for users because one does not need to know the absolute values of the array.
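A minimal sketch of algorithm 4 is given below, assuming the 1D array has already been normalized (step 1); the fraction of the sorted array used for the linear fit and the function name are our choices for illustration.

```python
import numpy as np

def detect_stripe(profile, ratio=3.0):
    """Sketch of algorithm 4: locate defective pixels in a normalized 1D
    profile by sorting it, fitting a line to the middle part of the sorted
    values, and thresholding with the sensitivity R (ratio).
    Returns a boolean mask of detected defects."""
    npoint = len(profile)
    sorted_profile = np.sort(profile)
    # Fit a straight line to the central part of the sorted array, ignoring
    # possible defects at both ends (here the central 60% is used).
    start, stop = int(0.2 * npoint), int(0.8 * npoint)
    xmid = np.arange(start, stop)
    slope, intercept = np.polyfit(xmid, sorted_profile[start:stop], 1)
    f0 = intercept                          # fitted value at the first index
    f1 = intercept + slope * (npoint - 1)   # fitted value at the last index
    i0, i1 = sorted_profile[0], sorted_profile[-1]
    background = max(f1 - f0, 1.0e-6)
    # Thresholds from the equations above; when a condition is not met the
    # threshold collapses to the array extreme, so no defect is flagged there.
    t_low = f0 - background * ratio / 2 if (f0 - i0) / background > ratio else i0
    t_up = f1 + background * ratio / 2 if (i1 - f1) / background > ratio else i1
    return (profile < t_low) | (profile > t_up)
```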

Using the above supplementary method, the algorithm for removing large stripes is proposed as described in Table 4.

Table 4. Algorithm for removing large stripes (called algorithm 5)

In step 2 of algorithm 5, dropping a small percentage of pixels at the top and bottom of the sorted sinogram helps to reduce the possibility of false detection of stripes caused by high-frequency edges, as can be seen in Fig. 11. Steps 1, 2, and 3 help to suppress part of the large stripes and some small full stripes (Fig. 12(a)); in principle, this is a different implementation of the normalization-based approach. The remaining partial stripes are removed by the last three steps of algorithm 5, as shown in Fig. 12(b).
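Combining the detection with a selective correction, a sketch of algorithm 5 might look as follows. It reuses detect_stripe() (algorithm 4) and remove_stripe_based_sorting() (algorithm 3) from the sketches above; the normalization details may differ from the exact steps of Table 4, and the function name and defaults are illustrative.

```python
import numpy as np
from scipy.ndimage import median_filter

def remove_large_stripe(sinogram, ratio=3.0, size=81, drop_ratio=0.05):
    """Sketch of algorithm 5: detect large stripes and apply a strong
    correction only to the detected columns."""
    nrow, ncol = sinogram.shape
    ndrop = int(0.5 * drop_ratio * nrow)   # pixels dropped at top and bottom
    # Sort each column and smooth the sorted sinogram horizontally.
    sino_sorted = np.sort(sinogram, axis=0)
    sino_smoothed = median_filter(sino_sorted, size=(1, size))
    # Per-column normalization factor from the central (non-dropped) part.
    factor = (np.mean(sino_sorted[ndrop:nrow - ndrop], axis=0)
              / np.mean(sino_smoothed[ndrop:nrow - ndrop], axis=0))
    sino_corrected = sinogram / factor               # normalization step
    # Locate the large stripes (algorithm 4) from the per-column factors.
    stripe_mask = detect_stripe(factor, ratio)
    # Replace only the detected columns with the values obtained from the
    # sorting-based correction (algorithm 3).
    sino_sorted_corr = remove_stripe_based_sorting(sino_corrected, size)
    sino_corrected[:, stripe_mask] = sino_sorted_corr[:, stripe_mask]
    return sino_corrected
```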

Fig. 11 Explanation of step 2 in Table 4. (a) Sorted sinogram where high-frequency edges (indicated by the arrows) can cause false detection of stripes; (b) Horizontal median filter of sinogram (a).

Fig. 12 Results of removing large ring artifacts in the reconstructed image shown in Fig. 9(c). (a) Corrected image after using steps 1, 2, and 3, where some partial rings still remain (arrows); (b) Final corrected image; (c) Difference between the image in (b) and the image in Fig. 9(c).

2.4 Removing unresponsive stripes and fluctuating stripes

Stripes of type S1 and S4 give rise to both ring artifacts and streak artifacts in the reconstructed image (Fig. 13). Unfortunately, they cannot be removed using the sorting approach of the above algorithms because the rankings of the grayscales differ significantly between pixels inside and outside the stripes. The characteristic intensity profiles of types S1 and S4 are different: one shows very little variation (Fig. 3(a)) while the other shows excessive variation (Fig. 3(d)) compared with its low-pass component. These features can be used to detect them by: averaging the absolute difference between the intensity profiles and their low-pass components (Fig. 14(a)); normalizing the result (Fig. 14(b)); and using algorithm 4 to locate the stripe positions. Once detected, the S1 and S4 stripes are removed by interpolation. Table 5 shows the steps of the algorithm for removing stripes of type S1 and S4, which is called algorithm 6.
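A sketch of algorithm 6 under the same caveats (illustrative names and defaults, reusing detect_stripe() from above) follows; the exact filters and window sizes of Table 5 may differ.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d, median_filter

def remove_unresponsive_and_fluctuating_stripe(sinogram, ratio=3.0, size=81):
    """Sketch of algorithm 6: detect unresponsive (S1) and fluctuating (S4)
    stripes from the residual between each intensity profile and its low-pass
    component, then replace the detected columns by linear interpolation."""
    nrow, ncol = sinogram.shape
    # Step 1: low-pass component of each column along the angular direction,
    # then the mean absolute residual per column (blue profile in Fig. 14(a)).
    low_pass = uniform_filter1d(sinogram, size=61, axis=0)
    residual = np.mean(np.abs(sinogram - low_pass), axis=0)
    # Step 2: normalize by a median-filtered copy to flatten the background.
    background = median_filter(residual, size=size)
    normalized = residual / np.clip(background, 1.0e-6, None)
    # Step 3: locate the stripes with algorithm 4.
    stripe_mask = detect_stripe(normalized, ratio)
    # Step 4: replace defective columns by linear interpolation between the
    # nearest good columns.
    cols = np.arange(ncol)
    good = ~stripe_mask
    sino_corrected = sinogram.copy()
    for i in range(nrow):
        sino_corrected[i, stripe_mask] = np.interp(
            cols[stripe_mask], cols[good], sinogram[i, good])
    return sino_corrected
```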

Fig. 13 Impact of unresponsive stripes and fluctuating stripes. (a) Sinogram having both types of stripes, S1 (arrow) and S4 (red frame); (b) Ring artifacts in the reconstructed image caused by the stripes in sinogram (a); (c) Over-exposed blob on the detector resulting in the S1 stripe in (a); (d) Magnified view of the red frame in image (a) showing the S4 stripe; (e) Magnified view of the red frame in (b) showing the streak artifacts (arrows) caused by the S4 stripes.

Fig. 14 Explanation of steps 1 and 2 of algorithm 6. (a) Average of the absolute difference between the intensity profiles and their low-pass components (blue profile), and the same after filtering with the median filter (red profile); (b) Division of the two profiles in (a).

Table 5. Steps of algorithm 6 for removing unresponsive stripes and fluctuating stripes

Figure 15 shows the results of applying algorithm 6 to the sinogram in Fig. 13(a). As stated in section 2.3, bright blobs may cause halo effects around them, which result in large stripes as shown in Fig. 15(a). Therefore, algorithm 5 needs to be applied after algorithm 6 (Fig. 15(c)). In addition, the detection steps of algorithm 6 may not work well on sinograms of samples giving intensity profiles similar to a type S1 stripe, e.g. cylindrical samples of a single material. In this case, algorithm 5 is certainly needed.

Fig. 15 Results of correcting type S1 and S4 stripes. (a) Corrected sinogram after using algorithm 6, showing a remaining large stripe; (b) Reconstructed image from the sinogram in (a); (c) Reconstructed image after algorithm 5 is used; (d) Difference between the image in (c) and the image in Fig. 13(b).

3. Results

3.1 Comparison of methods for removing partial stripes and full stripes

Algorithms 1, 2, and 3 are comparable with other methods for removing the small-to-medium partial and full stripes, so we compare the quality of our algorithms with four other well-known methods: the normalization-based method [9], the regularization-based method [10], the FFT-based method [11], and the wavelet-FFT-based method [12]. For notational simplicity, we call them M1, M2, M3, and M4, respectively.

In the first comparison, we apply methods M1, M2, M3, M4 and algorithm 3 to the sinograms of two samples scanned under identical conditions. They have similar structures and materials (limestone rocks with embedded fossils) but different shapes, and are compared at the same row of the detector. The different shapes of the samples result in different dynamic ranges of transmitted intensities (Fig. 16), which helps to reveal the existence of partial stripes. In addition, the high-frequency edges in the sinograms (arrows in Fig. 16), caused by X-ray refraction [21], help to reveal the side effects of each method.

Fig. 16 (a) Sinogram of sample 1 having low dynamic range of transmitted intensities with high-frequency edges (arrows); (b) Sinogram of sample 2 having higher dynamic range of intensities than the sinogram in (a) with a high-absorption area (oval) and some high-frequency edges (arrows).

On sample 1, which has a quite round shape, all methods perform well (Fig. 17 and Fig. 18). The parameters used for each method are: 1) Gaussian filter, σ = 11 (method M1); 2) α = 0.005 (method M2); 3) u = 30, v = 0, n = 10 (method M3); 4) Daubechies wavelet, order = 10, level = 5, σ = 1 (method M4); 5) median filter, size = 31 (algorithm 3). Nevertheless, extra partial ring artifacts are produced by the other methods, as indicated by the red arrows in Figs. 17(b)-17(e). Our approach does not give extra ring artifacts, only some insignificant streak artifacts, as shown in Fig. 17(f).

Fig. 17 Reconstructed images of the sinogram in Fig. 16(a) where the red arrows indicate extra artifacts: (a) After flat-field correction; (b) using method M1; (c) using method M2; (d) using method M3; (e) using method M4; (f) using algorithm 3.

Fig. 18 Magnified views of the center part of the reconstructed images in Fig. 17: (a) from Fig. 17(a); (b) from Fig. 17(b); (c) from Fig. 17(c); (d) from Fig. 17(d); (e) from Fig. 17(e); (f) from Fig. 17(f).

On sample 2, which has a higher-aspect-ratio shape, methods M1-M4 do not perform well (Figs. 19(b)-19(e)) using the same parameters as for sample 1. Not only do they fail to remove all rings, leaving many partial rings (Fig. 20), but they also give rise to extra strong ring artifacts (indicated by the red arrows in Fig. 19). The results might be improved using a different set of parameters; however, this is inefficient for processing the large number of data sets collected at synchrotron facilities. As can be seen by comparing Fig. 19(f) and Fig. 17(f), our approach performs equally well for both samples with the same parameters. However, streak artifacts caused by fluctuating stripes still remain in the reconstructed images (Fig. 18 and Fig. 20).

Fig. 19 Reconstructed images of the sinogram in Fig. 16(b) where the red arrows indicate extra artifacts: (a) After flat-field correction; (b) using method M1; (c) using method M2; (d) using method M3; (e) using method M4; (f) using algorithm 3.

Fig. 20 Magnified views of the center part of the reconstructed images in Fig. 19: (a) from Fig. 19(a); (b) from Fig. 19(b); (c) from Fig. 19(c); (d) from Fig. 19(d); (e) from Fig. 19(e); (f) from Fig. 19(f).

The results of the first test show that, for the M1-M4 methods, it is impossible to use the same parameters for different samples, or even for different slices of the same sample, and still suppress stripe artifacts efficiently. Our approach uses the same parameters for both samples, enabling efficient processing of a large number of data sets.

In the second comparison, we specifically test the capability of removing partial stripes. The sample used has a rectangular shape extending beyond the field of view [22], which results in strong absorption at angles nearly parallel to the long axis of the sample. This condition produces partial stripes in the sinogram (Figs. 21(a)-21(c)), causing partial ring artifacts in the reconstructed image (Figs. 21(d) and (e)). For the M1-M4 methods, using the same parameters as in the first test has no impact on the partial ring artifacts (Figs. 22(a.1)-22(d.1)). We increased the strength of these methods by changing the parameters to: 1) σ = 17, c = 30 (M1); 2) α = 0.001, c = 30 (M2); 3) u = 30, v = 10, n = 10 (M3); 4) order = 10, level = 5, σ = 10 (M4); where the c (chunk) parameter increases the strength of methods M1 and M2 by dividing the sinogram into many chunks of rows [23]. These settings remove the partial ring artifacts but also erase the details in the center area of the reconstructed image (void-center artifacts), as shown in the second row, Figs. 22(a.2)-22(d.2). Our approach, however, returns a clean image without extra artifacts in both cases of using the median filter with window sizes of 31 and 51 (Figs. 22(e.1) and (e.2), respectively).

Fig. 21 Partial stripes caused by the high dynamic range of the incoming intensities. (a) Sinogram of a sample with a rectangular shape; (b)-(c) Magnified views of the red frames at the top and bottom of image (a) showing partial stripes; (d) Reconstructed image from sinogram (a); (e) Magnified view of the red frame in image (d).

Fig. 22 Magnified views of the reconstructed images from methods M1-M4 and algorithm 3 using different parameters. The first row shows the results of method M1 (a.1), method M2 (b.1), method M3 (c.1), method M4 (d.1), and algorithm 3 (e.1) in which the same parameters as in the first test are used. The second row shows the results of method M1 (a.2), method M2 (b.2), method M3 (c.2), method M4 (d.2), and algorithm 3 (e.2) using different parameters.

In the last comparison, we test the methods on a very challenging data set where stripe artifacts are produced, blurred, and enlarged by the use of a pre-processing method. Figure 23(a) shows a sinogram of a sample that has very low contrast and contains features outside the field of view [24]. A strong low-pass filter [25] is applied to the 2D projections to improve the contrast. It enlarges and blurs the detector defects, which introduces blurry stripe artifacts into the sinogram, as shown in Fig. 23(b). Here, we found that algorithm 1 provides the best result, compared with algorithms 2 and 3, in removing blurry stripes. The processing technique used in step 1 of algorithm 1 may reduce the side effects of the strong smoothing filter used in step 2. Unfortunately, algorithm 1 can only be used for processing sinograms having a low dynamic range of intensities. As can be seen from Figs. 23(c) and (f), algorithm 1 using a polynomial order of 3 and a strong Gaussian-Fourier filter (σx = 2, σy = 30) returns a clean image without extra artifacts. The M1-M4 methods give extra ring artifacts and a non-uniform background in the reconstructed image (indicated by the arrows in Fig. 24).
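For reference, the sketch below shows a Gaussian low-pass filter applied in Fourier space, of the kind referred to as the Gaussian-Fourier filter above (no padding, illustrative function name and implementation). In algorithm 1 such a filter would be applied to the fitted sinogram obtained in step 1; a small σx smooths strongly along the horizontal (pixel) direction.

```python
import numpy as np

def fourier_gaussian_lowpass(image, sigma_x, sigma_y):
    """Multiply the centered 2D Fourier transform of an image by a Gaussian
    window; sigma_x and sigma_y are widths in frequency space, so smaller
    values give stronger smoothing in the corresponding direction."""
    height, width = image.shape
    # Frequency grids centered at zero (in index units).
    u = np.fft.fftshift(np.fft.fftfreq(width)) * width
    v = np.fft.fftshift(np.fft.fftfreq(height)) * height
    uu, vv = np.meshgrid(u, v)
    window = np.exp(-0.5 * ((uu / sigma_x) ** 2 + (vv / sigma_y) ** 2))
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * window)))
```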

Fig. 23 Blurry stripes caused by using a pre-processing method on the 2D projections and results of removing them using algorithm 1. (a)-(b) Sinogram at the same row of the detector before and after the pre-processing method was used; (c) Reconstructed image from the sinogram in (b); (d) Result of step 1 of algorithm 1 using the polynomial order of 3; (e) Corrected sinogram after a strong 2D low-pass filter is used; (f) Reconstructed image from the sinogram in (e).

Fig. 24 Reconstructed images using methods M1-M4 showing extra artifacts indicated by the arrows. (a) Method M1 (σ = 121); (b) method M2 (α = 0.00001); (c) method M3 (u = 2, v = 1, n = 10); (d) method M4 (order = 11, level = 3, σ = 10).

3.2 Combination of methods for removing all types of stripes

In the previous examples, only a few sinograms with certain types of stripes were used for demonstration. In practice, one needs to process all sinograms of a 3D sample, where different types of stripes can appear at different locations or all together in one place. It is impossible for a single method to remove all of them. For example, algorithm 3 and methods M1-M4 cannot remove fluctuating stripes, as shown in Fig. 18 and Fig. 20. They also cannot be used to remove unresponsive stripes, which require an interpolation method for filling in the missing values. Algorithm 6 cannot be used alone; it needs algorithm 5 to clean residual stripes. In general, we need to combine algorithm 6, algorithm 5, and one of the algorithms in Table 2, not only to remove all types of stripes but also to reduce the risk of introducing extra artifacts.

The order in which our algorithms are presented in this report reflects their dependency on each other; in practice, however, they should be used in the reverse order (6→5→1, 6→5→2, or 6→5→3). Algorithm 5 needs to be used after algorithm 6, as explained above. Algorithm 1, 2, or 3 should be used after algorithm 5; otherwise they may enlarge the large stripes.

Figure 25 shows the results of combining algorithm 6, algorithm 5, and algorithm 3 to process a full tomographic data set, using the same set of parameters for all slices. The reconstruction results without any ring removal and with a very popular method, the wavelet-FFT-based approach, are also shown for comparison (see Visualization 1 and Visualization 2). As can be seen in Fig. 25(c) (see Visualization 3), ring artifacts are removed from nearly all reconstructed slices. A few very large but low-contrast ring artifacts still remain; they come from the detector areas affected by the halo effect and are very hard to detect in this instance. For algorithm 6, the low-pass components were retrieved in step 1 using a mean filter with a radius of 30, a median filter with a size of 81 was used in step 2, R = 3 was used in step 3, and a linear interpolation method was used in step 4. For algorithm 5, the size of the median filter and the value of R are the same as used in algorithm 6, and 5% of pixels were dropped in step 2. A median filter with a size of 31 was used for algorithm 3.
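Putting the pieces together, a hypothetical slice-by-slice driver using the sketched functions above and the parameters quoted here might look like this; in production the tuned implementations from the repository in section 4 would be used instead.

```python
def clean_sinogram(sinogram):
    # Recommended order: algorithm 6, then algorithm 5, then algorithm 3.
    sino = remove_unresponsive_and_fluctuating_stripe(sinogram, ratio=3.0, size=81)
    sino = remove_large_stripe(sino, ratio=3.0, size=81, drop_ratio=0.05)
    return remove_stripe_based_sorting(sino, size=31)
```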

Fig. 25 (a) Reconstructed slices with ring artifacts (see Visualization 1). (b) Reconstructed slices after using the wavelet-FFT-based method where the parameters are the same as used in Fig. 17(e) (see Visualization 2). (c) Reconstructed slices after using algorithm 6, algorithm 5, and algorithm 3 (see Visualization 3).

The combination of three algorithms appears to have many parameters. However, the parameters are largely determined in a straightforward way by the size and the brightness of the detector defects, which relate to the size of the smoothing filter and the value of R, respectively. These are features of the detector systems at the beamline, and we found that this set of parameters works for most of the tomographic data collected over time at I12-JEEP [26]. Other parameters, such as the size of the mean filter in step 1 of algorithm 6 or the percentage of dropped pixels in step 2 of algorithm 5, are insensitive and can be fixed. Algorithms 5 and 6 can share some parameters. In the end, only three crucial parameters need to be adjusted: the size of the smoothing filter and the value of R used for both algorithms 5 and 6, and the size of the smoothing filter used for algorithm 3.

4. Conclusion

Based on the characteristic intensity profiles of defective pixels of the detector system, we classify stripe artifacts into four types: partial stripes, full stripes, fluctuating stripes, and unresponsive stripes. They require different ways of treatment; here we propose the combination of three approaches, not only to eliminate all types of stripes but also to reduce the residual artifacts they may cause. These approaches are developed from our two main contributions: a class of correction methods based on equalizing or smoothing the response curves of adjacent pixels (equalization-based methods), and a method of defect detection based on the combination of sorting, fitting, and thresholding techniques.

The proposed techniques are easy to use and to implement. Their superior performance compared with other methods is demonstrated on challenging data having different dynamic ranges, partial stripes, and blurry stripes. The comparison results show that the parameters of our methods are consistent between different samples, which makes them easy to use; that our approaches work particularly well on partial stripes; and that they can handle a very challenging type of stripe which is blurred and enlarged by a pre-processing method. The ability to eliminate nearly all ring artifacts of a 3D data set by combining the three approaches, using the same parameters for all slices, has also been shown. Implementations in Python of our algorithms and of the M1-M4 methods are available at https://github.com/nghia-vo/sarepy. The data sets used in this report and our reconstruction results can be downloaded from Zenodo: https://doi.org/10.5281/zenodo.1443568.

The processing techniques we used can also improve the performance of other stripe removal methods, for example by sorting before filtering to avoid void-center artifacts, or by limiting the range of sorted values at the top and bottom to avoid false detection of stripes. The method of detecting stripes can be used as a binarization method in other applications. We also show a different implementation of the normalization-based approach, namely the combination of sorting, smoothing, and normalizing the intensity profiles.

Acknowledgments

This work was carried out with the support of the Diamond Light Source. We thank James Marrow, Matt Molteno, Richard Abel, and Baptiste Coudrillier for permission to present their tomographic data.

References

1. A. C. Kak and M. Slaney, Principles of Computerized Tomographic Imaging, (IEEE, 1988).

2. J. A. Seibert, J. M. Boone, and K. K. Lindfors, “Flat-field correction technique for digital detectors,” Proc. SPIE 3336, Medical Imaging: Physics of Medical Imaging (1998), pp. 348–354.

3. N. Wadeson and M. Basham, “Savu: a Python-based, MPI framework for simultaneous processing of multiple, N-dimensional, large tomography datasets,” (2016) https://arxiv.org/abs/1610.08015.

4. D. Gürsoy, F. De Carlo, X. Xiao, and C. Jacobsen, “Tomopy: a framework for the analysis of synchrotron tomographic data,” J. Synchrotron Radiat. 21(Pt 5), 1188–1193 (2014). [CrossRef]   [PubMed]  

5. F. Brun, L. Massimi, M. Fratini, D. Dreossi, F. Billé, A. Accardo, R. Pugliese, and A. Cedola, “SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows,” Adv. Struct. Chem. Imaging 3(1), 4 (2017). [CrossRef]   [PubMed]  

6. T. E. Gureyev, Y. Nesterets, D. Ternovski, D. Thompson, S. Wilkins, A. Stevenson, A. Sakellariou, and J. A. Taylor, “Toolbox for advanced X-ray image processing,” Proc. SPIE 8141, 81410B (2011).

7. A. Mirone, E. Brun, E. Gouillart, P. Tafforeau, and J. Kieffer, “The PyHST2 hybrid distributed code for high speed tomographic reconstruction with iterative reconstruction and a priori knowledge capabilities,” Nucl. Instrum. Methods Phys. Res. B 324, 41–48 (2014). [CrossRef]  

8. M. Vogelgesang, S. Chilingaryan, T. d. Rolo, and A. Kopmann, “UFO: a scalable GPU-based image processing framework for on-line monitoring,” in IEEE International Conference on High Performance Computing and Communication & Embedded Software and Systems (2012) pp. 824–829.

9. M. Rivers, “Tutorial Introduction to X-ray Computed Microtomography Data Processing,” http://www.mcs.anl.gov/research/projects/X-ray-cmt/rivers/tutorial.html.

10. S. Titarenko, P. J. Withers, and A. Yagola, “An analytical formula for ring artefact suppression in X-ray tomography,” Appl. Math. Lett. 23(12), 1489–1495 (2010). [CrossRef]  

11. C. Raven, “Numerical Removal of Ring Artifacts in Microtomography,” Rev. Sci. Instrum. 69(8), 2978–2980 (1998). [CrossRef]  

12. B. Münch, P. Trtik, F. Marone, and M. Stampanoni, “Stripe and ring artifact removal with combined wavelet--Fourier filtering,” Opt. Express 17(10), 8567–8591 (2009). [CrossRef]   [PubMed]  

13. J. Sijbers and A. Postnov, “Reduction of ring artefacts in high resolution micro-CT reconstructions,” Phys. Med. Biol. 49(14), N247–N253 (2004). [CrossRef]   [PubMed]  

14. M. Drakopoulos, T. Connolley, C. Reinhard, R. Atwood, O. Magdysyuk, N. Vo, M. Hart, L. Connor, B. Humphreys, G. Howell, S. Davies, T. Hill, G. Wilkin, U. Pedersen, A. Foster, N. De Maio, M. Basham, F. Yuan, and K. Wanelik, “I12: the Joint Engineering, Environment and Processing (JEEP) beamline at Diamond Light Source,” J. Synchrotron Radiat. 22(3), 828–838 (2015). [CrossRef]   [PubMed]  

15. N. T. Vo, M. Drakopoulos, R. C. Atwood, and C. Reinhard, “Reliable method for calculating the center of rotation in parallel-beam tomography,” Opt. Express 22(16), 19078–19086 (2014). [CrossRef]   [PubMed]  

16. N. T. Vo, R. C. Atwood, and M. Drakopoulos, “Radial lens distortion correction with sub-pixel accuracy for X-ray micro-tomography,” Opt. Express 23(25), 32859–32868 (2015). [CrossRef]   [PubMed]  

17. N. T. Vo, “Reconstruction scripts for time series tomography,” https://confluence.diamond.ac.uk/display/I12Tech/Reconstruction+scripts+for+time+series+tomography.

18. W. van Aarle, W. J. Palenstijn, J. Cant, E. Janssens, F. Bleichrodt, A. Dabravolski, J. De Beenhouwer, K. Joost Batenburg, and J. Sijbers, “Fast and flexible X-ray tomography using the ASTRA toolbox,” Opt. Express 24(22), 25129–25147 (2016). [CrossRef]   [PubMed]  

19. G. N. Ramachandran and A. V. Lakshminarayanan, “Three-dimensional reconstruction from radiographs and electron micrographs: application of convolutions instead of Fourier transforms,” Proc. Natl. Acad. Sci. U.S.A. 68(9), 2236–2240 (1971). [CrossRef]   [PubMed]  

20. D. F. Swinehart, “The Beer-Lambert Law,” J. Chem. Educ. 39(7), 333 (1962). [CrossRef]  

21. A. Snigirev, I. Snigireva, V. Kohn, S. Kuznetsov, and I. Schelokov, “On the possibilities of x-ray phase contrast microimaging by coherent high-energy synchrotron radiation,” Rev. Sci. Instrum. 66(12), 5486–5492 (1995). [CrossRef]  

22. M. Molteno, “Measuring fracture properties using digital image and volume correlation: decomposing the J-integral for mixed-mode parameters,” PhD thesis (2017).

23. S. Titarenko, V. Titarenko, A. Kyrieleis, P. J. Withers, and F. De Carlo, “Suppression of ring artefacts when tomographing anisotropically attenuating samples,” J. Synchrotron Radiat. 18(3), 427–435 (2011). [CrossRef]   [PubMed]  

24. B. Coudrillier, D. M. Geraldes, N. T. Vo, R. Atwood, C. Reinhard, I. C. Campbell, Y. Raji, J. Albon, R. L. Abel, and C. R. Ethier, “Phase-contrast micro-computed tomography measurements of the intraocular pressure-induced deformation of the porcine lamina cribrosa,” IEEE Trans. Med. Imaging 35(4), 988–999 (2016). [CrossRef]   [PubMed]  

25. D. Paganin, S. C. Mayo, T. E. Gureyev, P. R. Miller, and S. W. Wilkins, “Simultaneous phase and amplitude extraction from a single defocused image of a homogeneous object,” J. Microsc. 206(1), 33–40 (2002). [CrossRef]   [PubMed]  

26. M. Polacci, F. Arzilli, G. La Spina, N. Le Gall, B. Cai, M. E. Hartley, D. Di Genova, N. T. Vo, S. Nonni, R. C. Atwood, E. W. Llewellin, P. D. Lee, and M. R. Burton, “Crystallisation in basaltic magmas revealed via in situ 4D synchrotron X-ray microtomography,” Sci. Rep. 8(1), 8377 (2018). [CrossRef]   [PubMed]  

Supplementary Material (3)

Visualization 1: Reconstructed slices of a 3D tomographic data set without ring artifact suppression.
Visualization 2: Reconstructed slices of a 3D tomographic data set after ring artifact suppression using the wavelet-FFT-based method.
Visualization 3: Reconstructed slices of a 3D tomographic data set after ring artifact suppression using algorithm 6, algorithm 5, and algorithm 3 of our approaches.
