Optica Publishing Group

Radon single-pixel imaging with projective sampling

Open Access

Abstract

A novel technique for Radon single-pixel imaging with projective sampling, based on the theorem of the Radon transform, is proposed. In contrast to the patterns used in conventional single-pixel imaging systems, candy-striped patterns called Radon basis patterns, produced by projecting 1D Hadamard functions along different angles, are employed in our technique. The patterns are loaded into a projection system and illuminated onto an object, and the light reflected from the object is detected by a single-pixel detector. An iterative reconstruction method restores the object's 1D projection functions by summing the 1D Hadamard functions weighted by the detected intensities. The Radon spectrum of the object is then recovered by arranging the 1D projection functions along the projection angle, and finally the image of the object is recovered by applying a filtered back-projection algorithm to the Radon spectrum. Experiments demonstrate that the proposed technique can obtain both the Radon spectrum and the image of the object. Recognition directly in the Radon spectrum domain, rather than in the image domain, is fast and yields robust, high classification rates. A recognition experiment detects the lines in a scene by searching for singular peaks in the Radon spectrum domain; according to the results, the lines in the scene can be easily detected there. Other shapes can also be detected through their characteristic signatures in the Radon spectrum domain.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Recently, an imaging system called single-pixel imaging (SPI) [1–7], which utilizes a non-scanning single-pixel detector, has attracted increasing attention. Although a single-pixel detector is used in this system, its mechanism differs from that of a traditional scanning system: in SPI, many pixels are measured simultaneously instead of scanning pixel by pixel over the entire object, which in practice improves the measurement signal-to-noise ratio (SNR) thanks to the higher detectable light intensity. A series of structured speckles produced by a spatial light modulator (SLM) illuminates the object, and the reflected or transmitted intensity from the object is detected by a single detector. The known series of structured speckles and the measured intensities can then be combined and inverted using a variety of algorithms to yield a good estimate of the object image. Despite the low frame rates of single-pixel techniques compared with matrix detectors, SPI has recently achieved success in various applications, such as terahertz imaging [8,9], microscope imaging [10,11], fluorescence imaging [12–14], gas imaging [15] and others [16–20].

Initially, speckles with random patterns were widely employed in SPI. However, such speckles require a great number of measurements and a long data-acquisition time for recording signals. For instance, Sun et al. used 10^6 random patterns to reconstruct an image with a resolution of 256 × 192 pixels [1]. Even with so many measurements, the quality of images reconstructed by SPI is hardly comparable to that of images obtained by conventional matrix detectors. Although compressed sensing (CS) [2] can be employed to reduce the total number of illumination speckles, it requires more computational time than calculating a correlation, and the time consumed grows rapidly with the data volume, which is directly related to the size of the signal to be recovered. In practice, CS algorithms are very difficult to apply to large signals with high pixel counts and may even fail under specific circumstances. Recently proposed techniques [3–6] address several of the problems associated with long data-acquisition times and low reconstruction quality by using deterministic orthogonal basis patterns instead of non-orthogonal random patterns for illumination. Fourier SPI (FSI) [3] and Hadamard SPI (HSI) [4–6] are two representative SPI techniques that use deterministic patterns: HSI uses Hadamard basis patterns for illumination and acquires the Hadamard spectrum of an object, while FSI uses Fourier basis patterns and acquires its Fourier spectrum. Applying the inverse Hadamard or Fourier transform to the obtained spectrum then yields the desired image. Such patterns are becoming widely used in SPI.

In this paper, we demonstrate an SPI technique called Radon SPI that can extract the Radon spectrum of an object with projective sampling. The Radon spectrum [21,22] of a given object is obtained by summing its line integrals along all given angles via the Radon transform (RT), a non-invasive imaging technique that allows the visualization of the internal structures of an object without the superposition of over- and underlying structures that usually plagues conventional projection images. The RT has been widely applied in many areas of image processing, such as tomographic reconstruction [23,24], pattern recognition [25,26], shape and motion detection [27,28], and classification [29,30]. In the traditional setup, the optical RT of a given object is performed by illuminating the object with a linear light spot. The reflected light, which gives a signal proportional to the line integral of the reflectivity along the linear light spot, is captured by a single-pixel detector, and the detected intensity represents a single point in the RT domain of the object. The complete RT is achieved by varying the orientation of the linear light spot over different angles and scanning the area of the object; a mechanical movement device is typically used to perform this rotation and scanning. In recent years, matrix detectors, such as CCDs or photosensor arrays, have also been used to obtain the information of an object in the RT domain. The authors of [22], for example, used a simple 4f setup and a single optical element to retrieve the RT of an object. In this case, the RT information obtained from the CCD is given in Cartesian coordinates, which can subsequently be transformed to polar coordinates using a computer. The filtered back-projection (FBP) algorithm is employed as the inverse RT to recover the image of the object.
Furthermore, classification and recognition directly in the RT domain, rather than in the image domain, are fast and yield robust, high classification rates [29]. An optical RT photosensor array [29,30] was recently implemented to achieve classification rates of 98%–99% for complex hand gesture and motion detection tasks in the RT domain. However, to our knowledge, no SPI technique has so far been able to obtain the Radon spectrum of an object. Here, the proposed Radon SPI technique is based on the theorem of the Radon transform: a Radon spectrum is obtained by illuminating a given object with a series of candy-striped patterns of different slopes and using a single-pixel detector to collect the reflected light. In Section 2, we introduce the imaging principles and methods. In Section 3, quantitative simulations and experiments are carried out to evaluate the proposed methods. In Section 4, the discussion and conclusions of this work are presented.

2. Principles and methods

2.1 Radon transform

For a continuous 2D object function f(x,y), the RT is formed by integrating the image intensity f(x,y), sampled with N × N pixels, along a line at distance L from the origin and at projection angle θ' to the x-axis. All points on this line satisfy the following equation:

$$L = x\cos\theta' + y\sin\theta'. \tag{1}$$

The Radon transform of a distribution f(x,y) is given [21] by

$$F(L,\theta') = \sum_{x,y} f(x,y)\,\delta(x\cos\theta' + y\sin\theta' - L). \tag{2}$$
The collection of F(L, θ') over all angles θ' and all distances L is called the RT of the input image f(x,y), where δ is the Dirac delta function. In our proposed method, many lines are measured simultaneously instead of scanning a single line over the whole object; this distinguishes the proposed method from the traditional scanning approach to the RT. One line of F(L, θ') at a certain angle is the 1D projection function (1DPF) F_θ'(L) of the object at that angle over all distances L. The Radon spectrum of an object is often referred to as a sinogram because the RT of an off-center point source is a sinusoid.
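As a numerical illustration of Eq. (2), a minimal discrete Radon transform can be sketched by rotating the image and summing along one axis. This is an illustrative sketch (the function and variable names are ours, not the authors' code):

```python
import numpy as np
from scipy.ndimage import rotate

def radon_transform(image, angles_deg):
    """Discrete Radon transform: for each projection angle, rotate the
    image and sum along one axis, approximating the line integrals
    F(L, theta') of Eq. (2)."""
    sinogram = np.zeros((image.shape[0], len(angles_deg)))
    for i, theta in enumerate(angles_deg):
        rotated = rotate(image, theta, reshape=False, order=1)
        sinogram[:, i] = rotated.sum(axis=0)  # integrate along parallel lines
    return sinogram

# An off-center point source traces a sinusoid in the spectrum,
# which is why the Radon spectrum is called a sinogram.
img = np.zeros((64, 64))
img[20, 40] = 1.0
sino = radon_transform(img, np.arange(0, 180, 1))
```

At angle 0° the projection is simply the column sums of the image, so the single bright pixel appears at detector position 40; as the angle varies, its position traces a sinusoid.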

2.2 Radon spectrum obtained by SPI

Assuming that the n-th pair of complementary illumination patterns with projection angle θ' is employed to illuminate the object, the reflected light from the object is detected by the single-pixel detector. This process can be expressed mathematically as follows:

$$I_{\theta'}^{+}(n) = \sum_{x,y} f(x,y)\,p_n^{+}(x,y;L,\theta'), \tag{3}$$
$$I_{\theta'}^{-}(n) = \sum_{x,y} f(x,y)\,p_n^{-}(x,y;L,\theta'). \tag{4}$$
Here, p_n^+(x,y;L,θ') and p_n^-(x,y;L,θ') represent the complementary distributions of the illumination patterns in the object space, which contain many straight lines with the same values along projection angle θ' to the x-axis, and I_θ'^±(n) represents the corresponding detected intensity. Subtracting Eq. (4) from Eq. (3) yields the following equation:
$$I_{\theta'}(n) = I_{\theta'}^{+}(n) - I_{\theta'}^{-}(n) = \sum_{x,y} f(x,y)\left[p_n^{+}(x,y;L,\theta') - p_n^{-}(x,y;L,\theta')\right] = \sum_{x,y} f(x,y)\,p_n(x,y;L,\theta'). \tag{5}$$
The 2D patterns p_n(x,y;L,θ') can be described as follows:
$$p_n(x,y;L,\theta') = C(R)\,T_{n,\theta'}(L)\,\delta(x\cos\theta' + y\sin\theta' - L). \tag{6}$$
Here, T_{n,θ'}(L) is the n-th 1D Hadamard function (1DHF) at projection angle θ' to the x-axis, and the circular-area function C(R) takes the value one inside the circle of diameter R and zero otherwise. The above expression shows that, over the whole sampling area, the values of the lines in the patterns p_n(x,y;L,θ') at a given angle are determined by the value of T_{n,θ'}(L) at the corresponding distance L. Substituting Eq. (6) into Eq. (5) gives
$$I_{\theta'}(n) = \sum_{x,y} C(R)\,T_{n,\theta'}(L)\,f(x,y)\,\delta(x\cos\theta' + y\sin\theta' - L). \tag{7}$$
Supposing that the object lies within the circular area, once the straight lines are long enough to cover the object, a further change in their length does not affect the detected result. The expression can therefore be simplified as follows:
$$I_{\theta'}(n) = \sum_{L} T_{n,\theta'}(L)\,F_{\theta'}(L). \tag{8}$$
This equation indicates that the detected intensity is the inner product of the 1D Hadamard function (1DHF) and the 1D projection function (1DPF) of the object at projection angle θ'. Therefore, according to the SPI algorithm for a one-dimensional signal, the 1DPF at angle θ' can be recovered by the following equation:
$$F_{\theta'}(L) = \sum_{n} T_{n,\theta'}(L)\,I_{\theta'}(n). \tag{9}$$
When the 1DPFs at all angles have been recovered, the sinogram is obtained by arranging them along the different angles. The foregoing analysis demonstrates that the correlation algorithm of SPI can realize the RT of the object.
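Equations (8) and (9) amount to projecting the unknown 1DPF onto the Hadamard basis and summing back; since the rows of a Sylvester Hadamard matrix are orthogonal, the correlation sum recovers the 1DPF exactly up to a factor of N. A minimal numerical check (illustrative, not the authors' code):

```python
import numpy as np
from scipy.linalg import hadamard

N = 16                       # length of the 1D projection function
H = hadamard(N)              # rows are the 1D Hadamard functions T_n(L), values +/-1
F_true = np.random.rand(N)   # the (unknown) 1DPF F_theta'(L) at one angle

# Eq. (8): each single-pixel intensity is the inner product of one 1DHF
# with the 1DPF.
I = H @ F_true

# Eq. (9): summing the 1DHFs weighted by the detected intensities
# recovers the 1DPF, up to the normalization 1/N (since H @ H.T = N*I).
F_rec = (H.T @ I) / N
```

Repeating this per angle and stacking the recovered 1DPFs column by column yields the sinogram.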

2.3 Reconstruction by filtered back-projection algorithm

While obtaining an image of the object by back projection (BP) is conceptually simple, a back-projected image is very blurry [31]: the point spread function of back projection is circularly symmetric and decreases as the reciprocal of the radius. FBP is a technique for correcting the blurring encountered in simple back projection and is employed here to recover the image of the object [31]. We define the back projection [31], Bo, at a point (x,y) as

$$Bo(x,y) = \frac{1}{\pi}\int_{0}^{\pi} o(x\cos\theta' + y\sin\theta',\,\theta')\,d\theta'. \tag{10}$$
where o denotes a Radon spectrum. Applying this back-projection formula to the Radon spectrum obtained above, we get the following:

$$BF_{\theta'}(L) = \frac{1}{\pi}\int_{0}^{\pi} F_{\theta'}(L)\,d\theta'. \tag{11}$$

The equation for the filtered back-projection [31] can be expressed as

$$f(x,y) = \frac{1}{2}\,B\!\left(\mathcal{F}_t^{-1}S * F_{\theta'}(L)\right)(x,y). \tag{12}$$
Here, S is a low-pass filter in the frequency domain, $\mathcal{F}_t^{-1}$ is the inverse Fourier transform, and * is the convolution operation. The filtered back-projection algorithm with the Ram-Lak filter S [31] is computed with the iradon function of MATLAB 2015.
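The iradon operation can be sketched compactly in numpy: filter each 1D projection with a Ram-Lak (ramp) filter in the Fourier domain, then smear the filtered projections back over the image grid and average over angles. This is a simplified sketch of the standard FBP scheme, not the MATLAB implementation:

```python
import numpy as np

def fbp(sinogram, angles_deg):
    """Filtered back projection: Ram-Lak filtering along the detector
    axis followed by nearest-neighbor back projection over the grid."""
    n_L, n_ang = sinogram.shape
    ramp = np.abs(np.fft.fftfreq(n_L))                 # Ram-Lak filter |f|
    filtered = np.real(np.fft.ifft(
        np.fft.fft(sinogram, axis=0) * ramp[:, None], axis=0))
    c = (n_L - 1) / 2.0
    y, x = np.mgrid[0:n_L, 0:n_L]
    xc, yc = x - c, y - c
    recon = np.zeros((n_L, n_L))
    for i, th in enumerate(np.deg2rad(angles_deg)):
        L = xc * np.cos(th) + yc * np.sin(th) + c      # detector coordinate
        idx = np.clip(np.round(L).astype(int), 0, n_L - 1)
        recon += filtered[idx, i]                      # back-project
    return recon * np.pi / (2.0 * len(angles_deg))
```

For the sinogram of a centered point source (a single bright row), the reconstruction peaks at the image center, as expected.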

The 1D projection function (1DPF) is one line of the sinogram image, and a 2D projection function (2DPF) is obtained by projecting the 1DPF along the integral angle in 2D coordinates. Figure 1(A) shows a schematic diagram of the RT, and the entire procedure of the proposed technique is illustrated in Fig. 1(B). First, the computer program produces a 2D Hadamard matrix (HM). Then, the 1D Hadamard functions (1DHF) T(L), which form the basis of the 2D Hadamard matrix, are extracted line by line from the matrix. Next, 2D patterns with binary candy stripes, called Radon basis patterns, are produced by projecting the 1DHFs along different integral angles θ (0° ≤ θ < 180°). The 2D projective patterns at three integral angles (0°, 45° and 90°) are shown in Fig. 1(A) and denoted 2DS-0, 2DS-45 and 2DS-90; the black and white pixels in the images take the values −1 and 1, respectively. The patterns are loaded into the projection system and illuminated onto the object, a step described as projective illumination (PI). The reflected light from the object is detected by a single-pixel detector. The 1DPF, which is the intensity distribution of the object along an integral angle θ, can be restored using the correlation between the 1DHFs and the intensities detected at that angle. Finally, the sinogram of the object, i.e., its Radon spectrum, is obtained by arranging the 1DPFs along the projection angles, and the image of the object is recovered by applying the filtered back-projection algorithm to the sinogram.
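The Radon basis patterns of Eq. (6) can be generated as follows: each pixel's signed distance L to the central line at angle θ indexes into the 1DHF, and the result is limited to the circular area C(R). This is a hypothetical sketch of the pattern-generation step, not the authors' implementation:

```python
import numpy as np
from scipy.linalg import hadamard

def radon_basis_pattern(T, theta_deg, size):
    """Project a 1D Hadamard function T along angle theta to obtain a
    binary candy-striped pattern p_n(x,y; L, theta') of Eq. (6),
    limited to the circular area C(R)."""
    N = len(T)
    c = (size - 1) / 2.0
    y, x = np.mgrid[0:size, 0:size]
    xc, yc = x - c, y - c
    th = np.deg2rad(theta_deg)
    L = xc * np.cos(th) + yc * np.sin(th)           # distance to the center line
    idx = np.clip(np.floor(L + N / 2).astype(int), 0, N - 1)
    pattern = T[idx].astype(float)                  # stripes take values +/-1
    pattern[xc ** 2 + yc ** 2 > c ** 2] = 0.0       # circular limitative area
    return pattern

H = hadamard(16)
p0 = radon_basis_pattern(H[5], 0, 16)    # vertical stripes (like 2DS-0)
p90 = radon_basis_pattern(H[5], 90, 16)  # horizontal stripes (like 2DS-90)
```

Rotating the integral angle by 90° simply transposes the stripes, matching the 2DS-0 / 2DS-90 pair in Fig. 1(A).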

Fig. 1 (A) Radon transform diagram. The sinogram is the Radon spectrum; the 1D projection function (1DPF) is one line of the sinogram image; a 2D projection function (2DPF) is obtained by projecting the 1DPF along the integral angle in 2D coordinates. (B) Procedure of the proposed method: the letter C in the circle represents the correlation operation; HM: Hadamard matrix; 1DH: 1D Hadamard functions; PI: projective illumination. The 2D projective patterns at three integral angles (0°, 45° and 90°) are denoted 2DS-0, 2DS-45 and 2DS-90.

It is worth noting that various types of 2D original matrices can be used in our system. A random matrix, which gives each line of the 2D patterns a random distribution of binary or gray values, can be employed; however, this requires a great number of measurements and a long imaging time with a correlation algorithm [18]. The basis of the Hadamard matrix, on the contrary, is composed of orthogonal discrete square waves with values of either +1 or −1, and the 2D patterns with binary candy stripes projected from this basis also take only the two values +1 and −1. Because of the projective transformation, the 2D patterns are no longer orthogonal; nevertheless, the information can still be efficiently reconstructed with a linearly iterative algorithm [4,5] of very low computational complexity. In our experimental system, a digital micromirror device (DMD) is employed as the SLM, as it can create binary patterns at high projection rates; the Hadamard matrix is therefore chosen for our experiments. Nonetheless, the illumination patterns produced by the DMD are binary, and negative values cannot be directly displayed. This issue is circumvented by performing a double illumination for each measurement: because the detected intensity is linear in the patterns, the coefficient of a 2D pattern P can be calculated by subtracting the two detected intensities. This subtraction also removes background disturbances from the environment, improving the SNR of the measurements [4–6].
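The double-illumination scheme can be checked numerically: a bipolar ±1 pattern is split into two binary (0/1) DMD patterns, and subtracting the two detected intensities recovers the bipolar coefficient while cancelling a constant background. The names and the ambient value below are illustrative:

```python
import numpy as np

def differential_measurement(obj, pattern, ambient=0.3):
    """Measure the coefficient of a +/-1 pattern with a binary DMD via
    double illumination: I = I_plus - I_minus. The ambient term models
    environmental background light, which cancels in the subtraction."""
    p_plus = (1 + pattern) / 2.0        # binary pattern, 1 where pattern = +1
    p_minus = (1 - pattern) / 2.0       # complementary binary pattern
    I_plus = np.sum(obj * p_plus) + ambient
    I_minus = np.sum(obj * p_minus) + ambient
    return I_plus - I_minus             # equals sum(obj * pattern)

rng = np.random.default_rng(0)
obj = rng.random((16, 16))
pattern = rng.choice([-1.0, 1.0], size=(16, 16))
```

Since p_plus − p_minus equals the original ±1 pattern, the differential intensity is exactly the bipolar inner product, independent of the ambient level.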

3. Experimental results

3.1 Computational simulations

Two different types of images (binary and grayscale) are used to evaluate our proposed method. The images, of 128 × 128 pixels, and their Radon spectra are shown in Fig. 2. The percentage root mean squared error (RMSE) is used to quantitatively evaluate the quality of the recovered results and is calculated by

$$\mathrm{RMSE} = \sqrt{\sum_{x=1}^{M}\sum_{y=1}^{N}\left[f_r(x,y) - f_o(x,y)\right]^2 \Big/ (M \times N)}, \tag{13}$$
where f_r and f_o are the values of the (x,y)-th pixel in the reconstructed and original images, respectively, and M and N are the dimensions of the image. All images are normalized to unity. The smaller the RMSE, the better the recovered quality.
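The RMSE metric translates directly to numpy (an illustrative helper):

```python
import numpy as np

def rmse_percent(f_r, f_o):
    """Percentage root mean squared error between the reconstructed
    image f_r and the original image f_o (both normalized to unity)."""
    return 100.0 * np.sqrt(np.mean((f_r - f_o) ** 2))
```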

Fig. 2 Original information for the simulations: A1 and A2 are the original images; B1 and B2 are the corresponding Radon spectra.

The 2D original matrices take random and Hadamard forms in the simulation studies. When the random form is employed, a 128 × K matrix is produced, where K is the number of columns, and K × 180 2D projective patterns at integral angles θ (0° ≤ θ < 180°, interval of 1°) are generated. When the Hadamard form is employed, a 128 × 128 HM is produced, and 128 × 180 pairs of complementary 2D projective patterns with values of 0 or 1 are generated at the integral angles θ according to the above-mentioned method.

Figure 3 shows the restoration results for the binary image. The first and second rows respectively show the Radon spectra and recovered images under random-form 2D original matrices, with 51200 × 180, 204800 × 180, 819200 × 180 and 3276800 × 180 2D projective patterns for A to D, respectively. As the number of samples increases, the quality of the restoration improves. The RMSEs of the recovered Radon spectra in the first row are 24%, 15%, 8%, and 5%, and the RMSEs of the recovered images in the second row are 33%, 23%, 14%, and 8%, respectively. The third and fourth rows of Fig. 3 respectively show the Radon spectra and restored images under Hadamard-form 2D original matrices, where we select a certain proportion of the projection patterns and the corresponding measured intensities for the reconstruction; A to D give the results recovered with 32 × 180, 64 × 180, 96 × 180 and 128 × 180 projective patterns. Taking the 32 × 180 case as an example, 32 patterns at every integral angle are employed, chosen as those with the most significant intensities. Again, the quality of the restoration improves as the number of projective patterns increases. The RMSEs of the recovered Radon spectra in the third row are 17%, 8%, 2% and 0.3%, and the RMSEs of the recovered images in the fourth row are 38%, 26%, 9% and 7%, respectively.

Fig. 3 Restoration results for the binary image. The first and second rows show the Radon spectra and recovered images under random-form original matrices, with 51200 × 180, 204800 × 180, 819200 × 180 and 3276800 × 180 projective patterns for A to D, respectively. The third and fourth rows show the Radon spectra and recovered images under Hadamard-form original matrices, with 32 × 180, 64 × 180, 96 × 180 and 128 × 180 projective patterns for A to D, respectively.

Figure 4 shows the restoration results for the grayscale image. The first and second rows show the Radon spectra and images restored under the random-form 2D original matrices, with 51200 × 180, 204800 × 180, 819200 × 180 and 3276800 × 180 2D projective patterns, respectively. The RMSEs of the first row are 22%, 18%, 13% and 8%, and those of the second row are 39%, 36%, 29% and 19%, respectively. The third and fourth rows show the Radon spectra and images restored with the Hadamard-form 2D original matrices, using 32 × 180, 64 × 180, 96 × 180 and 128 × 180 projective patterns, respectively. The RMSEs of the third row are 6%, 5%, 1% and 0.4%, and those of the fourth row are 19%, 17%, 8% and 7%, respectively. As the number of samples increases, the quality of the recovered results improves; however, when the Hadamard-form 2D original matrix is used, high-quality results can be obtained with far fewer projective patterns. The following experiments therefore use patterns generated from the Hadamard matrix.

Fig. 4 Restoration results for the grayscale image. The first and second rows show the Radon spectra and recovered images under random-form original matrices, with 51200 × 180, 204800 × 180, 819200 × 180 and 3276800 × 180 projective patterns for A to D, respectively. The third and fourth rows show the Radon spectra and recovered images under Hadamard-form original matrices, with 32 × 180, 64 × 180, 96 × 180 and 128 × 180 projective patterns for A to D, respectively.

Finally, we also carried out simulation experiments with different projection angle intervals; the results are shown in Fig. 5. In the first row, A to C correspond to projection angle intervals of 8, 6, and 4 degrees, with 128 × 22, 128 × 30, and 128 × 45 projective patterns, respectively. In the second row, A to C correspond to intervals of 2, 1, and 0.5 degrees, with 128 × 90, 128 × 180, and 128 × 360 projective patterns, respectively. The RMSEs of the corresponding recovered results are 27%, 24%, 18%, 9%, 8%, and 7.6%, reading from the top left. As the projection angle interval decreases, the quality of the restoration improves; however, below an interval of 1 degree the recovered result is not obviously improved, while smaller intervals require more projective patterns. The following laboratory experiments therefore use a 1-degree angle interval.

Fig. 5 Grayscale images of the object reconstructed with different projection angle intervals. A1 to C2 (from top left) show the recovered results for projection angle intervals of 8, 6, 4, 2, 1, and 0.5 degrees, respectively.

3.2 Laboratory experiments

The proposed technique is also studied with an experimental system, described as follows. A LASEVER LSR532NL-100 532-nm continuous-wave (CW) laser serves as the light source. A DMD system (Texas Instruments Discovery V7001 with 1024 × 768 micro-mirrors) generates the illumination patterns. A single-pixel detector (Thorlabs PMT-PMM02) and a data-acquisition system (NI USB-6211) are employed for light detection and data acquisition, respectively. The laser beam enters a beam expander (BE) and then illuminates the DMD, which provides the 2D patterns; these are projected onto the object by a projection lens (PL) with a focal length of 125 mm. The light reflected from the object is collected by a collecting lens (CL) with a focal length of 100 mm and detected by the single-pixel detector (SD). The intensity values are sent to a computer through the data-acquisition system (DAS). The computer runs Microsoft Windows 7 with an Intel Core i5 CPU and 4 GB of RAM, using MATLAB 2015. The object (OB) is located ~1.9 m from the imaging system. The experimental setup is shown in Fig. 6. To compare image acquisitions, we use the correlation coefficient between the undersampled images and a reference image; this coefficient ranges from zero to one, depending on the resemblance of the two images. Our reference image is always the image acquired without undersampling (i.e., measured at the Nyquist–Shannon criterion). The correlation coefficient is calculated with the following function:

$$r = \frac{\sum_{m}\sum_{n}\left(A_{mn} - \bar{A}\right)\left(B_{mn} - \bar{B}\right)}{\sqrt{\left(\sum_{m}\sum_{n}\left(A_{mn} - \bar{A}\right)^{2}\right)\left(\sum_{m}\sum_{n}\left(B_{mn} - \bar{B}\right)^{2}\right)}}, \tag{14}$$
where A and B are the image matrices with indices m and n, and $\bar{A}$ and $\bar{B}$ represent the mean values of their elements.
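Equation (14) is the standard 2D correlation coefficient and translates to numpy as follows (an illustrative helper):

```python
import numpy as np

def corr2(A, B):
    """2D correlation coefficient of Eq. (14): 1 for identical (or
    affinely rescaled) images, lower values for dissimilar ones."""
    a = A - A.mean()
    b = B - B.mean()
    return np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))
```

Note that r is invariant under brightness and contrast changes: comparing an image with a rescaled copy of itself still gives r = 1.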

Fig. 6 The experimental system (A) and the process of projective patterning (B). Lim. Are. is the circular limitative area; Pro. Patt. are the projective patterns.

The central 768 × 768 mirrors of the DMD are utilized in the experiment, and 6 × 6 mirrors are combined into one pattern cell corresponding to an image pixel; the resolution of the entire illuminated area is thus 128 × 128 pixels. The pre-processing is as follows. First, a 128 × 128 HM (one pixel corresponding to 6 × 6 mirrors of the DMD) is produced, and 128 × 180 pairs of complementary 2D patterns with values of 0 or 1 are generated at integral angles θ (0° ≤ θ < 180°, interval of 1°) according to the above-mentioned method. For some integral angles, this process produces projected patterns larger than the DMD area, so we crop the projected patterns to a common size of 768 × 768 mirrors. Furthermore, all the projection patterns are limited to a circular area with a diameter of 768 mirrors, so the sampled area at the object position is circular. An example of three patterns at different angles derived from the same 1DHF is shown in the right part of Fig. 6(B). All of the patterns are produced by this method and loaded into the DMD system. Finally, the illumination patterns produced by the DMD are employed to illuminate the object, and the reflected light is detected by the single-pixel detector.
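Mapping each image pixel onto a 6 × 6 block of micro-mirrors is a Kronecker upsampling (an illustrative sketch of this binning step):

```python
import numpy as np

def to_dmd_mirrors(pattern, cell=6):
    """Expand a 128 x 128 binary pattern so that each pixel drives a
    cell x cell block of DMD micro-mirrors (128 * 6 = 768 mirrors)."""
    return np.kron(pattern, np.ones((cell, cell), dtype=pattern.dtype))
```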

A plastic binary cartoon animal, Pig, is employed as the object in the first experiment. The sinograms represent the RT spectrum of the object, and the FBP images are recovered by applying the filtered back-projection algorithm to the RT spectrum. Sinograms gathered at different compression ratios are shown in the first row of Fig. 7. The results show that high-quality images can be recovered; however, radial streak artifacts appear in the recovered images due to insufficient radial sampling and angular quantization error. The compression ratio (CR) is defined as the ratio of the number of projection patterns used to the total number of patterns. Taking a CR of 25% as an example, 25% of the patterns at every integral angle are employed to recover the information, selected as the patterns with the most significant intensities measured by the single-pixel detector. The differences among the images recovered with CRs ranging from 50% to 100% are nearly negligible: the correlation coefficients between the recovered images and the reconstruction from the complete patterns are 0.970, 0.995, 0.999 and 1.000, while those between the recovered sinograms and the reconstruction from the complete patterns are 0.995, 0.997, 0.999 and 1.000, respectively.

Fig. 7 Information of the object reconstructed at different compression ratios. The sinogram is the RT information of the object, and FBP is the image recovered with the filtered back-projection method. Scale bar, 2 cm.

The above recovery results were produced with an integral angle interval of one degree. Next, we study the recovered images, shown in Fig. 8, for different integral angle intervals. Each row shows the restored images at the same angle interval and different compression ratios (CRs). Here, T2, T4, T6 and T8 indicate angle intervals of two, four, six and eight degrees, respectively, for integral angles over the range 0° ≤ θ < 180°. At a given CR, the recovered images gradually degrade as the interval angle increases. At a given interval angle, the quality of the recovered images improves with increasing CR, although it does not improve significantly for CRs between 50% and 100%. The correlation coefficients in Fig. 8 confirm this conclusion.

Fig. 8 Images of the object reconstructed with different projection angle intervals and compression ratios. The blue numbers indicate the correlation coefficients between the recovered images and the reconstruction using the complete patterns at a projection angle interval of one degree. T2, T4, T6 and T8 indicate angle intervals of two, four, six and eight degrees, respectively. Scale bar, 2 cm.

A straight line is stable and invariant under translation, rotation and scaling, and the correct, effective extraction of linear features is valuable in applications such as pattern recognition and classification. In this section, we provide an application example of our proposed technique: detecting straight lines in a scene. Figure 9 shows the original scene and the experimental results. The scene contains three lines with different slopes and lengths, together with one circular, one heart-shaped and one oval-shaped pattern. To study the immunity of the method to noise, we add Gaussian white noise of mean 0 and variance 0.1 to the scene, which is then printed on a sheet of A4 paper and used as the object for the line-detection experiment. The object image recovered by the filtered back-projection algorithm is shown on the right of Fig. 9. Due to deviations between the coordinates of the projection-system and object spaces, there are some differences between the recovered and original images. To preserve the recovered image, no secondary processing is applied to eliminate the significant edges that appear at the boundaries between the sampling and non-sampling regions due to noise and angular quantization error; nonetheless, these edges could be removed using prior knowledge of the limiting circle. The sinogram of the object is shown in the middle of Fig. 9. Three singular peaks, marked by arrows, appear in the sinogram image and carry the locational information of the straight lines: the number of singular peaks equals the number of lines, so there are three straight lines in the scene. The abscissas of the singular peaks give the inclinations of the lines, which are 49, 98 and 159 degrees, relative to the coordinate system of the projection-system space.
The ordinates of the singular peaks give the distances between the center and the lines, which are 24, 7 and −5 pixels, respectively; when a line is located to the left of the center, the distance is positive, and vice versa. From these parameters (inclinations and distances), the lines can be marked, as shown in Fig. 10(A). In addition, a 2D projection function (2DPF) can be produced by projecting the 1D projection function (1DPF) along the integral angle. In Fig. 10, the 2DPFs at 49, 98 and 159 degrees contain lines of singular values and are limited to the circular area. Threshold processing (TP) is then used to detect the lines and obtain their intercepts. Finally, the three lines are marked in different colors in the recovered image, shown as MK_OB in Fig. 10(B). Because the integral angle interval is one degree, the detection accuracy of the inclination is one degree; when a more accurate value is required, the integral angle interval must be further reduced.
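The peak-search step can be sketched as a greedy search over the sinogram: find the strongest peak, record its angle (abscissa) and signed distance from the center (ordinate), suppress its neighborhood, and repeat. This is a hypothetical sketch of the scheme described above, not the authors' code:

```python
import numpy as np

def detect_lines(sinogram, n_lines, angles_deg, suppress=3):
    """Detect straight lines as singular peaks in the Radon spectrum:
    each peak's abscissa gives the line's inclination and its ordinate
    the signed distance from the center (in pixels)."""
    sino = sinogram.copy()
    n_L = sino.shape[0]
    lines = []
    for _ in range(n_lines):
        L_idx, a_idx = np.unravel_index(np.argmax(sino), sino.shape)
        lines.append((angles_deg[a_idx], L_idx - n_L // 2))
        # zero a neighborhood so the next-strongest peak can be found
        sino[max(0, L_idx - suppress):L_idx + suppress + 1,
             max(0, a_idx - suppress):a_idx + suppress + 1] = 0.0
    return lines
```

On a synthetic sinogram with two isolated peaks, the function returns their (angle, distance) pairs in order of decreasing strength.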

 figure: Fig. 9

Fig. 9 Experimental results. The left image is the original noise image, the middle image is the sinogram information, and the right image is recovered by the filtered back-projection algorithm. Scale bar, 2 cm.



Fig. 10 Straight line marking process: TP is threshold processing; MK_B, MK_G and MK_R represent the lines marked in blue, green and red, respectively. 2DPF-49, 2DPF-98 and 2DPF-159 are the 2D projection functions along angles of 49, 98 and 159 degrees, respectively. According to the inclinations and distances, the lines are marked, as shown in (A). According to the 2DPFs, the lines are marked, as shown in (B).


4. Discussion and conclusion

The proposed technique performs structured illumination with candy-striped speckles, called Radon basis patterns, to acquire the Radon spectrum of an object. We have experimentally validated this approach for obtaining both the Radon spectrum and the spatial image information. The experiments in this paper were carried out on static scenes. In the experiments, the projection frequency of the DMD was set to 4 kHz; for 128 × 180 pairs of complementary 2D patterns, the measurement time is about 11.5 seconds. The computational time of the iradon function for a 128 × 180 Radon spectrum, measured with the tic and toc functions in MATLAB 2015, is approximately 0.02 seconds; similarly, the iterative algorithm recovers the 128 × 180 Radon spectrum in about 0.065 seconds. The total computational time is therefore approximately 0.085 seconds, which is small compared with the measurement time. This does not mean, however, that the proposed method cannot achieve real-time imaging: in the future, compressed sensing and advanced computing devices can be used to reduce the elapsed time. For example, in [32], the authors use compressed sensing to achieve Radon reconstruction from sparse projections, and high-quality 256 × 256 images can be recovered from 11 projection angles. That is, for an image of 256 × 256 pixels, the number of patterns required is 11 × 256 = 2816, a compression ratio of about 4.3%. With a DMD system with a modulation rate of up to 22 kHz, real-time Radon single-pixel imaging could then be achieved. Other technologies, such as machine learning, could also be used to realize real-time Radon single-pixel imaging.
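
The timing and compression figures quoted above can be verified with a few lines of arithmetic; this sketch simply reproduces them (pattern counts and rates are taken from the text):

```python
# Measurement time for the full acquisition described above.
pairs = 128 * 180                 # complementary pattern pairs
patterns = 2 * pairs              # each pair contributes two DMD frames
rate_hz = 4000                    # DMD projection frequency (4 kHz)
measurement_time = patterns / rate_hz   # 11.52 s, i.e. about 11.5 s

# Sparse-projection compressed-sensing case from [32].
cs_patterns = 11 * 256            # 11 projection angles x 256 bins
compression_ratio = cs_patterns / (256 * 256)   # about 4.3%
```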

In the proposed technique, the number of samples is related not only to the number of pixels but also to the projection angle interval: the smaller the projection angle interval, the greater the number of samples needed. Two points must be noted. First, the superiority of the proposed technique is embodied more in the transform (sinogram) domain than in the image domain, because it is more convenient to extract and identify the characteristic parameters of an object in this domain. Second, when a single-pixel imaging system is employed in certain applications, such as computerized tomography, the RT is the technology that must be used; in other words, our proposed technique can expand the applications of the SPI system beyond traditional methods. Line detection is one of the most important aspects of image processing and is the basis of image segmentation. Line features are often used in high-level processing, so the recognition and positioning of lines are of great significance; for example, many objects have straight contours, and detecting and locating these contours provides the conditions for further pattern recognition and classification. According to the analysis, the lines in the scene can easily be detected in the RT domain. Radon SPI will pave the way toward the use of SPI in several fields, including tomographic reconstruction, pattern recognition, shape and motion detection, and classification. However, the current method cannot obtain the length of a straight line, which requires further improving the algorithm through threshold processing of the restored image. Not only linear but also quasi-linear and non-linear patterns can be exploited effectively in the Radon spectrum using a similar method.
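
One way the threshold-processing step could recover line length is sketched below: keep restored-image pixels above a threshold that lie near a detected line x cosθ + y sinθ = L, and count them. The image, threshold and tolerance values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def line_length_by_threshold(img, theta_deg, dist, thresh, tol=1.0):
    """Count pixels above `thresh` lying within `tol` pixels of the line
    x*cos(theta) + y*sin(theta) = dist (center-of-image coordinates)."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x, y = xs - (w - 1) / 2.0, ys - (h - 1) / 2.0
    t = np.deg2rad(theta_deg)
    near_line = np.abs(x * np.cos(t) + y * np.sin(t) - dist) <= tol
    return int(np.count_nonzero((img > thresh) & near_line))

# Hypothetical restored image containing a 40-pixel vertical segment.
restored = np.zeros((64, 64))
restored[10:50, 32] = 0.9
length = line_length_by_threshold(restored, 0, 0.5, 0.5)
```

Here the line's inclination and distance come from the sinogram peak, while the segment's extent comes from the image domain, which is why both representations are needed.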

The advantages of the proposed method and its performance relative to that of techniques reported in previous studies are discussed as follows.

First, unlike the point-by-point sinogram detection carried out in a traditional scanning RT system, in our system many points of the sinogram are measured simultaneously, i.e., many lines are measured simultaneously instead of scanning line by line over the entire object. In practice, this improves the SNR due to the higher detectable light intensity; hence, the accuracy of the detection results is higher. In addition, there is no scanning device in our system, so the setup is simpler. More importantly, our proposed method can exploit the sparse characteristics of an object to reduce the number of sampling points and improve performance.

Second, matrix detectors are usually applied when non-scanning technology is employed to achieve the RT of an object [22,29,30]. In this study, a single-pixel detector is employed to achieve the same objective; hence, there is potential for establishing lower-cost and more compact systems, especially in the infrared and terahertz regions of the spectrum, where matrix detectors do not perform as well as they do in the visible spectrum. Moreover, in a weak-light environment, an SPI system can obtain images with higher SNRs than those recorded by matrix detectors [17].

Third, classification and recognition directly in the Radon spectrum domain, rather than in image space, are fast and yield robust and high classification rates [29]. The proposed technique, which incorporates the RT of an object, ensures improved feature detection and classification relative to conventional SPI methods. The lines in the scene can be easily detected through our proposed method for further pattern recognition and classification.

Fourth, the illumination patterns are straightforward and easy to generate by projecting the 1D Hadamard basis. Complementary patterns remove background disturbances and improve the SNR of the measurements; the only shortcoming is that this doubles the number of measurements required. Another important advantage is that these patterns can easily be produced by a DMD system with projection rates of up to 22 kHz, which allows for the real-time RT of an object.
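
The complementary-pattern measurement can be sketched in a few lines: splitting each 1D Hadamard function T into binary patterns P± = (1 ± T)/2 and differencing the two detector readings cancels a constant background, after which correlating with the Hadamard functions recovers the 1D projection. The 16-element projection and the background level below are illustrative assumptions.

```python
import numpy as np

def hadamard(n):
    # Sylvester construction; n must be a power of two.
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

rng = np.random.default_rng(0)
f = rng.random(16)                     # one 1D projection of the object
background = 0.37                      # constant stray light (hypothetical)

T = hadamard(16)                       # rows: 1D Hadamard functions (+1/-1)
P_pos, P_neg = (1 + T) / 2, (1 - T) / 2   # binary complementary patterns
I_pos = P_pos @ f + background         # detector readings, positive patterns
I_neg = P_neg @ f + background         # detector readings, negative patterns

# Differencing removes the background exactly: I+ - I- = T f.
f_hat = T.T @ (I_pos - I_neg) / 16     # Hadamard inversion recovers f
```

Because P⁺ − P⁻ = T, the background term cancels in the difference, and the orthogonality T Tᵀ = nI makes the inversion a single matrix product; this is what doubles the measurement count while improving the SNR.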

However, the proposed technique has two limitations. One is the low frame rates exhibited by single-pixel techniques compared with those of matrix image detectors. The other is that the image information of an object cannot be perfectly recovered due to non-uniform sampling.

Radon SPI has remarkable advantages in several fields, including tomographic reconstruction, pattern recognition, shape and motion detection, and classification. We believe that this approach will pave the way toward the use of single-pixel imaging in these application areas. At the same time, the fields of machine and computer vision offer a wealth of more advanced approaches, such as pattern recognition and machine learning [33,34], for realizing those applications.

Funding

National Natural Science Foundation of China (Nos. 11404344, 41505019 and 41475001), the CAS Innovation Fund Project (Nos. CXJJ-17S029 and CXJJ-17S063), the Open Research Fund of Key Laboratory of Optical Engineering, Chinese Academy of Sciences (No. 2017LBC007) and Open Research Fund of State Key Laboratory of Pulsed Power Laser Technology (SKL2018ZR10).

Acknowledgments

We thank Professor Marco Peccianti and Dr Juan Sebastian Totero Gongora for their valuable discussions and help.

References

1. B. Sun, M. P. Edgar, R. Bowman, L. E. Vittert, S. Welsh, A. Bowman, and M. J. Padgett, “3D computational imaging with single-pixel detectors,” Science 340(6134), 844–847 (2013). [CrossRef]   [PubMed]  

2. M. F. Duarte, M. A. Davenport, D. Takhar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, “Single-pixel imaging via compressive sampling,” IEEE Signal Process. Mag. 25(2), 83–91 (2008). [CrossRef]  

3. Z. Zhang, X. Ma, and J. Zhong, “Single-pixel imaging by means of Fourier spectrum acquisition,” Nat. Commun. 6(1), 6225 (2015). [CrossRef]   [PubMed]  

4. M. J. Sun, M. P. Edgar, G. M. Gibson, B. Sun, N. Radwell, R. Lamb, and M. J. Padgett, “Single-pixel three-dimensional imaging with time-based depth resolution,” Nat. Commun. 7(1), 12010 (2016). [CrossRef]   [PubMed]  

5. M. P. Edgar, G. M. Gibson, R. W. Bowman, B. Sun, N. Radwell, K. J. Mitchell, S. S. Welsh, and M. J. Padgett, “Simultaneous real-time visible and infrared video with single-pixel detectors,” Sci. Rep. 5(1), 10669 (2015). [CrossRef]   [PubMed]  

6. R. I. Stantchev, B. Sun, S. M. Hornett, P. A. Hobson, G. M. Gibson, M. J. Padgett, and E. Hendry, “Noninvasive, near-field terahertz imaging of hidden objects using a single-pixel detector,” Sci. Adv. 2(6), e1600190 (2016). [CrossRef]   [PubMed]  

7. C. M. Watts, D. Shrekenhamer, J. Montoya, G. Lipworth, J. Hunt, T. Sleasman, S. Krishna, D. R. Smith, and W. J. Padilla, “Terahertz compressive imaging with metamaterial spatial light modulators,” Nat. Photonics 8(8), 605–609 (2014). [CrossRef]  

8. L. Olivieri, J. S. Totero Gongora, A. Pasquazi, and M. Peccianti, “Time-Resolved Nonlinear Ghost Imaging,” ACS Photonics 5(8), 3379–3388 (2018). [CrossRef]  

9. W. L. Chan, K. Charan, D. Takhar, K. F. Kelly, R. G. Baraniuk, and D. M. Mittleman, “A single-pixel terahertz imaging system based on compressed sensing,” Appl. Phys. Lett. 93(12), 121105 (2008). [CrossRef]  

10. N. Radwell, K. J. Mitchell, G. M. Gibson, M. P. Edgar, R. Bowman, and M. J. Padgett, “Single-pixel infrared and visible microscope,” Optica 1(5), 285–289 (2014). [CrossRef]  

11. R. S. Aspden, N. R. Gemmell, P. A. Morris, D. S. Tasca, L. Mertens, M. G. Tanner, R. A. Kirkwood, A. Ruggeri, A. Tosi, R. W. Boyd, G. S. Buller, R. H. Hadfield, and M. J. Padgett, “Photon-sparse microscopy: visible light imaging using infrared illumination,” Optica 2(12), 1049–1052 (2015). [CrossRef]  

12. T. Cižmár and K. Dholakia, “Exploiting multimode waveguides for pure fibre-based imaging,” Nat. Commun. 3(1), 1027 (2012). [CrossRef]   [PubMed]  

13. M. Plöschner, T. Tyc, and T. Cizmar, “Seeing through chaos in multimode fibres,” Nat. Photonics 9(8), 529–535 (2015). [CrossRef]  

14. N. Tian, Q. Guo, A. Wang, D. Xu, and L. Fu, “Fluorescence ghost imaging with pseudothermal light,” Opt. Lett. 36(16), 3302–3304 (2011). [CrossRef]   [PubMed]  

15. G. M. Gibson, B. Sun, M. P. Edgar, D. B. Phillips, N. Hempler, G. T. Maker, G. P. A. Malcolm, and M. J. Padgett, “Real-time imaging of methane gas leaks using a single-pixel camera,” Opt. Express 25(4), 2998–3005 (2017). [CrossRef]   [PubMed]  

16. H. Yu, R. Lu, S. Han, H. Xie, G. Du, T. Xiao, and D. Zhu, “Fourier-Transform Ghost Imaging with Hard X Rays,” Phys. Rev. Lett. 117(11), 113901 (2016). [CrossRef]   [PubMed]  

17. P. A. Morris, R. S. Aspden, J. E. C. Bell, R. W. Boyd, and M. J. Padgett, “Imaging with a small number of photons,” Nat. Commun. 6(1), 5913 (2015). [CrossRef]   [PubMed]  

18. N. Huynh, E. Zhang, M. Betcke, S. Arridge, P. Beard, and B. Cox, “Single-pixel optical camera for video rate ultrasonic imaging,” Optica 3(1), 26–29 (2016). [CrossRef]  

19. B. Lochocki, A. Gambin, S. Manzanera, E. Irles, E. Tajahuerce, J. Lancis, and P. Artal, “Single pixel camera ophthalmoscope,” Optica 3(10), 1056–1059 (2016). [CrossRef]  

20. L. Martínez-León, P. Clemente, Y. Mori, V. Climent, J. Lancis, and E. Tajahuerce, “Single-pixel digital holography with phase-encoded illumination,” Opt. Express 25(5), 4975–4984 (2017). [CrossRef]   [PubMed]  

21. S. R. Deans, The Radon Transform and Some of Its Applications (Wiley, 1983).

22. T. Ilovitsh, A. Ilovitsh, J. Sheridan, and Z. Zalevsky, “Optical realization of the radon transform,” Opt. Express 22(26), 32301–32307 (2014). [CrossRef]   [PubMed]  

23. A. C. Kak and M. Slaney, Principles of Computerized Tomographic Imaging (SIAM, 2001).

24. G. T. Herman, Fundamentals of Computerized Tomography: Image Reconstruction from Projections, 2nd edn. (Springer, 2010).

25. J. S. Seo, J. Haitsma, T. Kalker, and C. D. Yoo, “A robust image fingerprinting system using the Radon transform,” Signal Process. Image Commun. 19(4), 325–339 (2004). [CrossRef]

26. J. D. Wu and S. H. Ye, “Driver identification using finger-vein patterns with Radon transform and neural network,” Expert Syst. Appl. 36(3), 5793–5799 (2009). [CrossRef]  

27. S. Tabbone and L. Wendling, “Technical symbols recognition using the two-dimensional Radon transform,” in Proc. Int. Conf. Pattern Recognition, 200–203 (2002). [CrossRef]

28. Y. Kashter, O. Levi, and A. Stern, “Optical compressive change and motion detection,” Appl. Opt. 51(13), 2491–2496 (2012). [CrossRef]   [PubMed]  

29. A. Koppelhuber and O. Bimber, “A classification sensor based on compressed optical Radon transform,” Opt. Express 23(7), 9397–9406 (2015). [CrossRef]   [PubMed]  

30. O. Bimber, “An image sensor based on optical Radon transform,” Comput. Graph. 53, 37–43 (2015). [CrossRef]

31. J. Beatty, “The Radon Transform and the Mathematics of Medical Imaging,” Honors Thesis, Paper 646 (2012).

32. K. Egiazarian, A. Foi, and V. Katkovnik, “Compressed sensing image reconstruction via recursive spatially adaptive filtering,” in Proc. IEEE Int. Conf. Image Processing (ICIP), San Antonio, TX, USA, 549–552 (2007). [CrossRef]

33. C. M. Bishop, Pattern Recognition and Machine Learning (Springer-Verlag, 2006).

34. C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning (Massachusetts Institute of Technology, 2006).


Equations (14)

$$L = x\cos\theta' + y\sin\theta',$$
$$F(L,\theta') = \sum_{x,y} f(x,y)\,\delta(x\cos\theta' + y\sin\theta' - L),$$
$$I_{\theta'}^{+}(n) = \sum_{x,y} f(x,y)\, p_n^{+}(x,y;L,\theta'),$$
$$I_{\theta'}^{-}(n) = \sum_{x,y} f(x,y)\, p_n^{-}(x,y;L,\theta'),$$
$$I_{\theta'}(n) = I_{\theta'}^{+}(n) - I_{\theta'}^{-}(n) = \sum_{x,y} f(x,y)\left[p_n^{+}(x,y;L,\theta') - p_n^{-}(x,y;L,\theta')\right] = \sum_{x,y} f(x,y)\, p_n(x,y;L,\theta'),$$
$$p_n(x,y;L,\theta') = C(R)\, T_{n,\theta'}(L)\,\delta(x\cos\theta' + y\sin\theta' - L).$$
$$I_{\theta'}(n) = \sum_{x,y} C(R)\, T_{n,\theta'}(L)\, f(x,y)\,\delta(x\cos\theta' + y\sin\theta' - L).$$
$$I_{\theta'}(n) = \sum_{x,y} T_{n,\theta'}(L)\, F_{\theta'}(L).$$
$$F_{\theta'}(L) = \sum_{n} T_{n,\theta'}(L)\, I_{\theta'}(n).$$
$$Bo(x,y) = \frac{1}{\pi}\int_{0}^{\pi} o(x\cos\theta' + y\sin\theta',\, \theta')\, d\theta'.$$
$$B F_{\theta'}(L) = \frac{1}{\pi}\int_{0}^{\pi} F_{\theta'}(L)\, d\theta'.$$
$$f(x,y) = \frac{1}{2}\, B\!\left(F_t^{-1} S F_{\theta'}(L)\right)(x,y).$$
$$\mathrm{RMSE} = \sqrt{\sum_{x,y=1}^{M,N} \left[f_r(x,y) - f_o(x,y)\right]^2 \Big/ (M\times N)},$$
$$r = \frac{\sum_{m}\sum_{n}\left(A_{mn} - \bar{A}\right)\left(B_{mn} - \bar{B}\right)}{\sqrt{\left[\sum_{m}\sum_{n}\left(A_{mn} - \bar{A}\right)^2\right]\left[\sum_{m}\sum_{n}\left(B_{mn} - \bar{B}\right)^2\right]}},$$