## Abstract

In this paper, we propose a ghost imaging scheme with the fast Walsh–Hadamard transform, named GIFWHT. In the scheme, Walsh–Hadamard pattern pairs are used to illuminate an object to generate pairs of detection results, and the corresponding differential detection result is used in place of the result from a conventional bucket detector. By performing the fast Walsh–Hadamard transform on ${2}^{k}$ ($k$ is a positive integer) differential detection results, the image of the object can be recovered. The experimental and numerical simulation results show that the reconstruction time of GIFWHT is greatly reduced, and the quality of the recovered image is noticeably improved. In addition, GIFWHT is robust against interference from environmental illumination and saves memory.

© 2016 Chinese Laser Press

## 1. INTRODUCTION

Ghost imaging (GI), also referred to as single-pixel imaging, is an intriguing optical imaging technique that allows the imaging of objects located in optically harsh or noisy environments [1], and presents the possibility of imaging with special wavelengths, such as x ray and infrared [2]. Commonly, there are two optical beams in a GI system. One beam, called the signal beam, crosses an object and is detected by a bucket detector without any spatial resolution. The other beam, called the reference beam, is detected by a spatially resolving detector. The image is retrieved at the reference beam by correlating the signals from the two detectors.

Since the characteristics originally obtained with quantum entangled photons [3,4] were found to be reproducible with a classical pseudo-thermal light source [5], GI has been explored in the classical domain [6–13]. Later, a new configuration, called computational ghost imaging (CGI) [14], was introduced. It requires only a single detector, and it enlarges the range of applications of GI [15,16]. However, shorter reconstruction time and higher imaging quality remain objectives of GI research.

Recently, a novel GI scheme with the Fourier transform, called GIFT, was proposed to significantly decrease reconstruction time and to achieve imaging even in the presence of noisy environmental illumination [17]. Notably, sinusoidal patterns, instead of random speckle patterns, are used to illuminate the object, and the Fourier transform, instead of the second-order correlation algorithm or a compressed sensing algorithm, is employed to retrieve the image of the object. However, commonly used digital devices inevitably introduce quantization errors into the sinusoidal patterns. Furthermore, only approximate Fourier coefficients can be obtained in the realization of GIFT.

On the other hand, Walsh–Hadamard patterns can be generated by digital devices without any quantization errors. More importantly, Walsh–Hadamard transform (WHT) has a fast implementation algorithm, called fast Walsh–Hadamard transform (FWHT) [18,19], that can significantly reduce computational complexity. In addition, inverse WHT is equal to WHT due to the orthogonal and symmetric properties of the natural order WHT, and that makes the reconstruction procedure exact.

In this paper, we propose a GI scheme with FWHT, called GIFWHT, which realizes fast reconstruction and high-quality imaging. In the scheme, Walsh–Hadamard pattern pairs, derived from a WHT matrix, are used to illuminate the object to generate detection pairs, and the differential detection result, instead of the result of a conventional bucket detector, is recorded to efficiently remove interference caused by environmental illumination. After ${2}^{k}$ ($k$ is a positive integer) differential detections are acquired, FWHT is performed to exactly recover the image of the object. The advantage of the proposed scheme is that it greatly reduces reconstruction time compared to the GIFT and CGI schemes. In addition, unlike GIFT and CGI, the image is exactly recovered by GIFWHT, and the quality of the recovered image is noticeably improved. Moreover, GIFWHT significantly reduces the memory required to record illumination patterns. This is because the image of the object can be recovered from the differential signals alone, without the illumination patterns, whereas both the differential signals and the illumination patterns must be recorded and used in the reconstruction process of existing CGI schemes.

The organization of the paper is as follows. In Section 2, GIFWHT is presented. In Section 3, the performance of GIFWHT is discussed and compared to CGI with random speckle patterns [15], CGI with Walsh–Hadamard patterns [16], and GIFT [17]. Finally, Section 4 concludes the paper.

## 2. GI USING FWHT

In this section, we first present details of the proposed scheme, along with the natural order WHT matrix and the computational complexity analysis of the proposed scheme.

#### A. Scheme Description

Figure 1 shows the schematic diagram of the proposed GIFWHT scheme. The light is modulated by a digital micromirror device (DMD) or other digital light projector, which is controlled by a computer to produce pairs of Walsh–Hadamard patterns, consisting of a Walsh–Hadamard pattern ${I}_{i}(x,y)$ and its inverse pattern ${\tilde{I}}_{i}(x,y)$, where the value of ${I}_{i}(x,y)$ is either black (0) or white ($+1$) for each coordinate $(x,y)$, and ${\tilde{I}}_{i}(x,y)=1-{I}_{i}(x,y)$. The $i$th Walsh–Hadamard pattern ${I}_{i}(x,y)$ is obtained by reshaping the $i$th row of the natural order WHT matrix ${\mathbf{H}}_{k}$. Thus a set of pairs of Walsh–Hadamard patterns, $\cdots ,{I}_{i}(x,y),{\tilde{I}}_{i}(x,y)$, ${I}_{i+1}(x,y),{\tilde{I}}_{i+1}(x,y),\cdots $, is obtained and displayed on the DMD to modulate the light in turn. After beam expansion with a projector lens, the patterned light interacts with an object and is detected by a bucket detector after a converging lens. The pair of detection results ${B}_{i}$ and ${\tilde{B}}_{i}$ is obtained for each pair of Walsh–Hadamard patterns, and the differential signal ${D}_{i}={B}_{i}-{\tilde{B}}_{i}$ is recorded. After $N={2}^{k}$ ($k$ is a positive integer) differential signals ${\{{D}_{i}\}}_{i=1}^{N}$ are acquired, the image can be recovered using FWHT.
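As an illustration of how the pattern pairs can be derived from ${\mathbf{H}}_{k}$, the following minimal sketch builds the natural order WHT matrix and reshapes one of its rows into a binary pattern and its inverse. The function names `hadamard` and `pattern_pair` are ours, not from the paper:

```python
import numpy as np

def hadamard(k):
    """Natural (Hadamard) order WHT matrix H_k of size 2^k x 2^k,
    built by the recursive Kronecker construction H_k = H_1 (x) H_{k-1}."""
    H1 = np.array([[1, 1], [1, -1]])
    Hk = np.array([[1]])
    for _ in range(k):
        Hk = np.kron(H1, Hk)
    return Hk

def pattern_pair(Hk, i, shape):
    """Binary illumination pattern I_i (values 0/1) and its inverse,
    obtained by reshaping row i of H_k: +1 -> white (1), -1 -> black (0)."""
    row = Hk[i].reshape(shape)
    I = (row + 1) // 2
    return I, 1 - I
```

For example, `hadamard(2)` yields the $4\times 4$ matrix whose rows, reshaped to $2\times 2$, give the four pattern pairs displayed on the DMD.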

The differential acquisition method [16,20] can efficiently remove the interference of environmental illumination. The system displays each pair of a Walsh–Hadamard pattern ${I}_{i}(x,y)$ and its inverse pattern ${\tilde{I}}_{i}(x,y)$, and obtains a differential signal between the two corresponding detection results from the bucket detector. Corresponding to a pattern ${I}_{i}(x,y)$, a detection result ${B}_{i}$ could be expressed as [16]

$${B}_{i}=\eta \sum _{x,y}{I}_{i}(x,y)T(x,y)+n, \qquad (1)$$

where $\eta $ is the bucket detector responsivity, $T(x,y)$ is the distribution function of the object, and $n$ represents the environmental illumination. Similarly, corresponding to ${\tilde{I}}_{i}(x,y)$, a detection result ${\tilde{B}}_{i}$ could be expressed as

$${\tilde{B}}_{i}=\eta \sum _{x,y}{\tilde{I}}_{i}(x,y)T(x,y)+n. \qquad (2)$$

Therefore, a differential signal ${D}_{i}$ between the two corresponding detection results ${B}_{i}$ and ${\tilde{B}}_{i}$ is

$${D}_{i}={B}_{i}-{\tilde{B}}_{i}=\eta \sum _{x,y}[2{I}_{i}(x,y)-1]T(x,y). \qquad (3)$$

For $N={2}^{k}$ differential signals, Eq. (3) can be written in matrix form:

$$\mathbf{D}=\eta {\mathbf{H}}_{k}\mathbf{T}, \qquad (4)$$

where $\mathbf{D}={[{D}_{1},\cdots ,{D}_{N}]}^{\mathrm{T}}$ and $\mathbf{T}$ is the object distribution $T(x,y)$ reshaped into an $N$-element column vector.
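A toy numerical check of the differential acquisition (all values below are illustrative, not from the experiment): the constant background $n$ cancels in ${D}_{i}={B}_{i}-{\tilde{B}}_{i}$, leaving the $\pm 1$ pattern applied to the object:

```python
import numpy as np

# Hypothetical toy scene: each bucket value is eta * sum(pattern * T) + n,
# and the background n cancels exactly in the differential signal.
rng = np.random.default_rng(0)
T = rng.random((4, 4))                # object transmission function T(x,y)
eta, n = 0.8, 5.0                     # detector responsivity and background

I = rng.integers(0, 2, size=(4, 4))   # one binary illumination pattern
I_inv = 1 - I                         # its inverse pattern

B = eta * np.sum(I * T) + n           # detection for I
B_inv = eta * np.sum(I_inv * T) + n   # detection for the inverse pattern
D = B - B_inv                         # background-free differential signal

# D equals eta * sum((2*I - 1) * T): the +/-1 pattern applied to T
assert np.isclose(D, eta * np.sum((2 * I - 1) * T))
```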

Importantly, there exists a fast algorithm for the natural order WHT [18], i.e., FWHT, which recovers the image of the object with a shorter reconstruction time in GIFWHT. It is realized with a butterfly structure that recursively breaks an $N$-point natural order WHT down into two $\frac{N}{2}$-point natural order WHTs, and it needs only $N\,{\mathrm{log}}_{2}\,N$ real-number addition or subtraction operations, which greatly decreases the computation time of the natural order WHT [18,19].
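A compact sketch of the butterfly structure described above, assuming the natural order transform and a power-of-two input length (`fwht` is our naming, not the paper's code):

```python
import numpy as np

def fwht(a):
    """Fast Walsh–Hadamard transform, natural (Hadamard) order.

    Uses only additions and subtractions; len(a) must be a power of 2.
    Applying it twice returns len(a) times the input, so the inverse
    transform is fwht(a) / len(a)."""
    a = np.asarray(a, dtype=float).copy()
    n, h = a.size, 1
    while h < n:
        for i in range(0, n, 2 * h):          # blocks of the current stage
            for j in range(i, i + h):         # butterfly within each block
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2                                # double the block half-size
    return a
```

For example, `fwht([1, 0, 0, 0])` returns `[1, 1, 1, 1]`, the first column of ${\mathbf{H}}_{2}$.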

#### B. Natural Order WHT

It is important to obtain the pairs of Walsh–Hadamard patterns in the proposed scheme. Equation (5) shows that the Walsh–Hadamard patterns can be produced from the natural order WHT matrix ${\mathbf{H}}_{k}$: when the value of ${\mathbf{H}}_{k}$ at coordinate $(x,y)$ is $+1$, the corresponding intensity value of ${I}_{i}(x,y)$ is white ($+1$), and when the value of ${\mathbf{H}}_{k}$ at coordinate $(x,y)$ is $-1$, the corresponding intensity value of ${I}_{i}(x,y)$ is black (0). Because a general base-$p$ natural order WHT matrix can be obtained as a generalization of the base-2 natural order WHT matrix [19], the base-2 natural order WHT matrix is considered in the scheme for simplicity. The base-2 natural order WHT matrix of dimension 2, ${\mathbf{H}}_{1}$, is

$${\mathbf{H}}_{1}=\left[\begin{array}{rr}1 & 1\\ 1 & -1\end{array}\right].$$

The base-2 natural order WHT matrix of dimension ${2}^{k}$ ($k$ is a positive integer and $k\ge 2$) is given by the recursive formula

$${\mathbf{H}}_{k}=\left[\begin{array}{rr}{\mathbf{H}}_{k-1} & {\mathbf{H}}_{k-1}\\ {\mathbf{H}}_{k-1} & -{\mathbf{H}}_{k-1}\end{array}\right]={\mathbf{H}}_{1}\otimes {\mathbf{H}}_{k-1}.$$

#### C. Computational Complexity Analysis

In this subsection, the computational complexity of the proposed scheme is analyzed. Because FWHT is used in GIFWHT to recover the image of the object, the computational complexity of GIFWHT depends on that of FWHT, and only $N\text{\hspace{0.17em}}{\mathrm{log}}_{2}\text{\hspace{0.17em}}N$ real-number additions or subtractions are needed [18,19].
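As a quick sanity check of this operation count, the butterfly can be instrumented to count its additions and subtractions (an illustrative sketch in our own naming, not the paper's code):

```python
import numpy as np

def fwht_count(a):
    """Natural order FWHT butterfly that also counts its add/subtract ops."""
    a = np.asarray(a, dtype=float).copy()
    n, ops, h = a.size, 0, 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
                ops += 2              # one addition and one subtraction
        h *= 2
    return a, ops

_, ops = fwht_count(np.ones(1024))
assert ops == 1024 * 10               # N log2 N operations for N = 1024
```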

Since the fast Fourier transform (FFT) algorithm is used as the reconstruction method in GIFT [17], the computational complexity of GIFT is determined by that of the FFT, which needs $N\,{\mathrm{log}}_{2}\,N$ complex-number additions and $\frac{1}{2}N\,{\mathrm{log}}_{2}\,N$ complex-number multiplications [19]. Therefore, the computational complexity of GIFWHT is significantly lower than that of GIFT.

On the other hand, the image of the object ${T}^{\prime}(x,y)$ can be reconstructed by a second-order correlation algorithm in CGI [15,16], which can be expressed as

$${T}^{\prime}(x,y)=\frac{1}{N}\sum _{i=1}^{N}({D}_{i}-\langle D\rangle ){I}_{i}(x,y), \qquad (12)$$

where $\langle D\rangle =\frac{1}{N}\sum _{i=1}^{N}{D}_{i}$ and ${I}_{i}(x,y)$ has $N$ pixels. Based on Eq. (12), the reconstruction process requires ${N}^{2}+N-1$ real-number additions and ${N}^{2}$ real-number multiplications. This shows that the computational complexity of GIFWHT is much lower than that of CGI [15,16].

## 3. RESULTS DISCUSSION

In this section, we discuss the performance of the proposed scheme by experiments and numerical simulations.

The experimental system of GIFWHT is shown in Fig. 2. The DMD (TI DLPC350) is controlled by a computer to modulate the light from LEDs to generate binary Walsh–Hadamard pattern pairs ${I}_{i}(x,y)$ and ${\tilde{I}}_{i}(x,y)$. After expansion with a projector lens, the beams carrying the Walsh–Hadamard pattern pairs are projected onto an object. Here, transmission objects with frontal illumination are adopted, and images of the “NUPT” logo, a ghost, and a face ($18\text{\hspace{0.17em}}\mathrm{mm}\times 18\text{\hspace{0.17em}}\mathrm{mm}$) printed on a transparent plastic thin sheet are used as objects. The transmitted beam is collected by a lens (focal length 250 mm) and then detected by a bucket detector (Thorlabs S120C and PM100USB), positioned in the focal plane of the lens, to generate a pair of detection results ${B}_{i}$ and ${\tilde{B}}_{i}$. The operation is repeated $N={2}^{k}$ ($k=12$ in the experiment) times. Finally, FWHT is adopted to recover the image of the object. The recovery process is conducted on a Dell computer with an Intel i7-4790 CPU, 8 GB RAM, and a 64 bit Windows 7 system.

To compare the quality of the recovered images quantitatively, the mean square error (MSE) is used as an objective evaluation, and is defined as [15]

$$\mathrm{MSE}=\frac{1}{N}\sum _{x,y}{[T(x,y)-{T}^{\prime}(x,y)]}^{2},$$

where $T(x,y)$ and ${T}^{\prime}(x,y)$ denote the intensity values of the original and the recovered image, respectively, and $N$ is the number of pixels of the image.

At first, we verify the feasibility of GIFWHT by experiments and numerical simulations in Fig. 3. The results are compared with those obtained using CGI with random patterns, CGI with Walsh–Hadamard patterns, and GIFT. In the GIFWHT scheme, 4096 $64\times 64$ pixel binary Walsh–Hadamard pattern pairs (8192 patterns in total) are modulated by the DMD to illuminate the object in both the experiment and the numerical simulation. Simultaneously, CGI with random patterns is implemented with 8192 $64\times 64$ pixel random binary patterns (2 times the number of image pixels). CGI with Walsh–Hadamard patterns is realized with the same patterns as GIFWHT, and GIFT is implemented with 8192 $64\times 64$ pixel four-step phase-shifting sinusoidal patterns. The results show that the images of the objects can be recovered by GIFWHT. The recovered images are very clear, and the reconstruction time is the shortest. Compared with the results of CGI with random patterns, the images reconstructed by GIFWHT are much better and the corresponding MSEs are smaller, because the random patterns usually used in CGI [5,14] contain overlapping information, whereas the Walsh–Hadamard patterns are spatially orthogonal without any redundancy. The reconstruction time of GIFWHT decreases noticeably, to 2.5% of that of CGI with random patterns. Compared with the results of GIFT, GIFWHT recovers higher quality images without fringes and periodic noise, because GIFWHT avoids quantization errors in implementation. Additionally, the reconstruction time of GIFWHT is reduced to 16% of that of GIFT.
It is also shown that the reconstruction quality of GIFWHT is the same as that of CGI with Walsh–Hadamard patterns, whereas GIFWHT reduces the reconstruction time to 5.6% of that of CGI with Walsh–Hadamard patterns. In addition, the reconstruction quality of the experimental results is lower than that of the numerical simulation results because the projector cannot create perfectly binary patterns in the experiment. The imperfections of the projector, such as fluctuations of the light source, degrade the experimental results. Moreover, GIFWHT uses only 186 KB of memory to record the detection signals, whereas CGI not only uses 186 KB of memory to record the detection signals but also 194 MB of memory to record the illumination patterns. Thus GIFWHT greatly reduces the memory needed to record the illumination patterns compared with the CGI scheme.
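The whole GIFWHT pipeline can be sketched at toy scale as follows. This is a numerical illustration under the simplifying assumptions $\eta =1$ and a constant background, not the experimental code; it shows that the differential signals alone suffice for exact recovery:

```python
import numpy as np

def fwht(a):
    """Fast Walsh–Hadamard transform (natural order), add/subtract only."""
    a = np.asarray(a, dtype=float).copy()
    h = 1
    while h < a.size:
        for i in range(0, a.size, 2 * h):
            for j in range(i, i + h):
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

k, shape = 4, (4, 4)                     # toy object: N = 2^k = 16 pixels
N = 2 ** k
H = np.array([[1]])
for _ in range(k):                       # H_k by the recursive construction
    H = np.kron(np.array([[1, 1], [1, -1]]), H)

rng = np.random.default_rng(1)
T = rng.random(shape)                    # object distribution T(x,y)
background = 3.0                         # constant environmental illumination

D = np.empty(N)
for i in range(N):
    I = (H[i].reshape(shape) + 1) // 2   # binary pattern I_i from row i
    B = np.sum(I * T) + background       # bucket detection for I_i
    B_inv = np.sum((1 - I) * T) + background
    D[i] = B - B_inv                     # differential signal, background-free

T_rec = fwht(D).reshape(shape) / N       # inverse WHT equals WHT divided by N
mse = np.mean((T - T_rec) ** 2)          # essentially zero: exact recovery
```

No illumination pattern needs to be stored for the reconstruction step: only the $N$ differential signals enter `fwht`, which is the memory saving noted above.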

We then compare, in Fig. 4, the reconstruction time for objects of different sizes using GIFWHT, GIFT, CGI with random patterns, and CGI with Walsh–Hadamard patterns. The reconstruction process is identical for the experiment and the numerical simulation, since both are performed on a computer. “Lena” images with $M$ pixels ($M$ ranges from ${2}^{10}$ to ${2}^{14}$) are used as objects. GIFWHT and CGI with Walsh–Hadamard patterns are implemented with $M$ binary Walsh–Hadamard pattern pairs ($2M$ patterns in total). CGI with random patterns is realized with $2M$ binary random patterns, and GIFT is implemented with $2M$ four-step phase-shifting sinusoidal patterns. The results show that the reconstruction time of GIFWHT is the shortest. Moreover, the rate of increase of the reconstruction time of GIFWHT is much lower than that of CGI. At the same time, the reconstruction time of GIFWHT is approximately 16% of that of GIFT.

We further show in Fig. 5 the robustness of GIFWHT against environmental illumination by experiment. Here, four fluorescent lamps are introduced as environmental illumination. The other experimental conditions are the same as those in Fig. 3. The results show that the quality of the images recovered in the presence of environmental illumination is almost the same as that of the images recovered in its absence, which indicates that GIFWHT can efficiently overcome interference from environmental illumination.

## 4. CONCLUSION

We have proposed a GIFWHT scheme. In the scheme, Walsh–Hadamard pattern pairs are used to illuminate the object. The differential acquisition method is used to remove interference caused by environmental illumination. After ${2}^{k}$ differential signals have been obtained, FWHT is adopted to recover the image of the object. We have compared the performance of GIFWHT, GIFT, CGI with random patterns, and CGI with Walsh–Hadamard patterns by experiments and numerical simulations, and have analyzed the imaging quality of GIFWHT in the presence and absence of environmental illumination. The results have shown that the reconstruction time is greatly reduced by GIFWHT compared with that of the other schemes. Moreover, GIFWHT noticeably improves the quality of the reconstructed images in comparison with CGI with random patterns and GIFT, and obtains results similar to those of CGI with Walsh–Hadamard patterns. In addition, GIFWHT efficiently removes interference from environmental illumination, and greatly reduces the memory needed to record the patterns.

## Funding

National Natural Science Foundation of China (NSFC) (61271238, 61475075); Open research fund of Key Lab of Broadband Wireless Communication and Sensor Network Technology, Ministry of Education (NYKL2015011).

## REFERENCES

**1. **S. M. Zhao, H. Yang, Y. Q. Li, F. Cao, Y. B. Sheng, W. W. Cheng, and L. Y. Gong, “The influence of atmospheric turbulence on holographic ghost imaging using orbital angular momentum entanglement: simulation and experimental studies,” Opt. Commun. **294**, 223–228 (2013). [CrossRef]

**2. **N. Radwell, K. J. Mitchell, G. M. Gibson, M. P. Edgar, R. Bowman, and M. J. Padgett, “Single-pixel infrared and visible microscope,” Optica **1**, 285–289 (2014). [CrossRef]

**3. **D. V. Strekalov, A. V. Sergienko, D. N. Klyshko, and Y. H. Shih, “Observation of two-photon ‘ghost’ interference and diffraction,” Phys. Rev. Lett. **74**, 3600–3603 (1995). [CrossRef]

**4. **X. Gu and S. M. Zhao, “Nonorthogonal object identification based on ghost imaging,” Photon. Res. **3**, 238–242 (2015). [CrossRef]

**5. **R. S. Bennink, S. J. Bentley, and R. W. Boyd, “Two-photon coincidence imaging with a classical source,” Phys. Rev. Lett. **89**, 113601 (2002). [CrossRef]

**6. **R. Liu, A. Fang, Y. Zhou, P. Zhang, S. Gao, H. Li, H. Gao, and F. Li, “Enhanced visibility of ghost imaging and interference using squeezed thermal light,” Phys. Rev. A **93**, 013822 (2016). [CrossRef]

**7. **X. Li, C. Deng, M. Chen, W. Gong, and S. Han, “Ghost imaging for an axially moving target with an unknown constant speed,” Photon. Res. **3**, 153–157 (2015). [CrossRef]

**8. **D. J. Zhang, H. G. Li, Q. L. Zhao, S. Wang, H. B. Wang, J. Xiong, and K. Wang, “Wavelength-multiplexing ghost imaging,” Phys. Rev. A **92**, 013823 (2015). [CrossRef]

**9. **E. F. Zhang, W. T. Liu, and P. X. Chen, “Ghost imaging with non-negative exponential speckle patterns,” J. Opt. **17**, 085602 (2015). [CrossRef]

**10. **W. Gong, “High-resolution pseudo-inverse ghost imaging,” Photon. Res. **3**, 234–237 (2015). [CrossRef]

**11. **X. F. Liu, X. H. Chen, X. R. Yao, W. K. Yu, G. J. Zhai, and L. A. Wu, “Lensless ghost imaging with sunlight,” Opt. Lett. **39**, 2314–2317 (2014). [CrossRef]

**12. **L. Nie, Y. Bai, and X. Fu, “Ghost telescope imaging system from the perspective of coherent-mode representation,” Opt. Commun. **358**, 88–91 (2016). [CrossRef]

**13. **Y. Yang, J. Shi, F. Cao, J. Peng, and G. Zeng, “Computational imaging based on time-correlated single-photon-counting technique at low light level,” Appl. Opt. **54**, 9277–9283 (2015). [CrossRef]

**14. **J. H. Shapiro, “Computational ghost imaging,” Phys. Rev. A **78**, 061802(R) (2008). [CrossRef]

**15. **S. M. Zhao, L. Wang, W. Q. Liang, W. W. Cheng, and L. Y. Gong, “High performance optical encryption based on computational ghost imaging with QR code and compressive sensing technique,” Opt. Commun. **353**, 90–95 (2015). [CrossRef]

**16. **S. S. Welsh, M. P. Edgar, R. Bowman, B. Sun, and M. J. Padgett, “Near video-rate linear Stokes imaging with single-pixel detectors,” J. Opt. **17**, 025705 (2015). [CrossRef]

**17. **Z. Zhang, X. Ma, and J. S. Zhong, “Single-pixel imaging by means of Fourier spectrum acquisition,” Nat. Commun. **6**, 6225–6230 (2015). [CrossRef]

**18. **Y. A. Geadah and M. J. G. Corinthios, “Natural, dyadic, and sequency order algorithms and processors for the Walsh-Hadamard transform,” IEEE Trans. Comput. **C-26**, 435–442 (1977). [CrossRef]

**19. **M. Corinthios, *Signals, Systems, Transforms, and Digital Signal Processing with MATLAB* (CRC Press, 2009).

**20. **W. K. Yu, X. R. Yao, X. F. Liu, L. Z. Li, and G. J. Zhai, “Three-dimensional single-pixel compressive reflectivity imaging based on complementary modulation,” Appl. Opt. **54**, 363–367 (2015). [CrossRef]