
Single-photon imaging over 200 km


Abstract

Long-range active imaging has widespread applications in remote sensing and target recognition. Single-photon light detection and ranging (lidar) has been shown to have high sensitivity and temporal resolution. On the application front, however, the operating range of practical single-photon lidar systems is limited to about tens of kilometers over the Earth’s atmosphere, mainly due to the weak echo signal mixed with high background noise. Here, we present a compact coaxial single-photon lidar system capable of realizing 3D imaging at up to 201.5 km. It is achieved by using high-efficiency optical devices for collection and detection, and what we believe is a new noise-suppression technique that is efficient for long-range applications. We show that photon-efficient computational algorithms enable accurate 3D imaging over hundreds of kilometers with as few as 0.44 signal photons per pixel. The results represent a significant step toward practical, low-power lidar over extra-long ranges.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. INTRODUCTION

Single-photon lidar [1–3] based on time-correlated single-photon counting (TCSPC) [4] can provide single-photon sensitivity and picosecond resolution for time-of-flight measurements [5], and has seen significant experimental progress across a variety of applications [6–21]. (See [22] for a review.) Single-photon lidar was originally proposed for laser rangers and altimeters [1,2], which have been transformative for long-range laser ranging [23] and global topography [24]. Geiger-mode single-photon lidar systems, originally developed by MIT Lincoln Laboratory [3], have been widely adopted for airborne and spaceborne topographic measurements [25]. These techniques have been made commercially available (e.g., Sigma Space Corp., Lanham, MD, US, a unit of Hexagon AB, Stockholm) for fast measurements with single-photon detector arrays. Single-photon lidar installed on high-altitude airborne platforms can provide continuous topographic and bathymetric mapping [26]. For large-scale topography, satellite-based laser altimeters, such as ATLAS on ICESat-2 [24], have been adopted for long-term observations of the Earth’s surface.

In recent years, single-photon lidar for long-range imaging through the terrestrial (Earth’s) atmosphere has received considerable research attention [8,10–21], since it can provide high-resolution, three-dimensional (3D) imaging in both the transverse and longitudinal dimensions (at the centimeter scale). This capability plays an important role in target recognition and identification over long ranges. Meanwhile, advanced computational algorithms have permitted 3D imaging with a small number of photons (i.e., about one photon per pixel) [27–34]. The combination of single-photon lidar and photon-efficient imaging algorithms has led to the realization of long-range imaging at tens of kilometers [16,17]. Despite this remarkable progress, further extending the distance remains challenging. In long-range scenarios through the Earth’s atmosphere, the number of echo signal photons from the scene of interest decreases quadratically with the distance [35], while the background noise (mainly attributed to backscatter from the near-field atmosphere) scales linearly with the output laser power. Even with a high-power laser, the limited signal-to-background ratio (SBR) at long ranges prevents useful reconstructions. With state-of-the-art algorithms [31,32], previous single-photon imaging experiments through the terrestrial atmosphere [16,17] are typically limited to achievable ranges below 100 km, as shown in the table in Supplement 1.
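To make the range scaling concrete, the short Python sketch below estimates the expected number of echo-signal photons per pulse from an extended Lambertian target using a standard monostatic lidar link budget. Only the 1550 nm wavelength and the 280 mm aperture are taken from the system described below; the pulse energy, target reflectivity, atmospheric transmission, and overall system efficiency are illustrative assumptions, not reported values.

```python
# Minimal link-budget sketch (illustrative assumptions, not the paper's values):
# expected echo-signal photons per pulse for a monostatic lidar viewing a
# Lambertian target that is larger than the beam footprint.
import math

def signal_photons_per_pulse(pulse_energy_j, wavelength_m, target_reflectivity,
                             aperture_diameter_m, range_m, one_way_transmission,
                             system_efficiency):
    """Photons per pulse returned from an extended Lambertian target."""
    photon_energy = 6.626e-34 * 2.998e8 / wavelength_m        # E = h*c/lambda
    photons_tx = pulse_energy_j / photon_energy               # photons emitted
    rx_area = math.pi * (aperture_diameter_m / 2) ** 2        # collecting area
    # Lambertian backscatter: fraction collected is A_rx / (pi * R^2)
    geometric_factor = target_reflectivity * rx_area / (math.pi * range_m ** 2)
    return (photons_tx * geometric_factor
            * one_way_transmission ** 2 * system_efficiency)

# Illustrative numbers: 1.2 uJ pulses at 1550 nm, 280 mm aperture, 30% reflectivity
for R_km in (10, 100, 200):
    n = signal_photons_per_pulse(1.2e-6, 1550e-9, 0.3, 0.28, R_km * 1e3,
                                 one_way_transmission=0.5, system_efficiency=0.1)
    print(f"{R_km:>3} km: ~{n:.2e} signal photons per pulse")
```

Even under such optimistic assumptions, the $1/R^2$ geometric loss alone costs a factor of 400 between 10 km and 200 km, which is why collection efficiency and noise suppression become decisive at these ranges.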

Here we demonstrate a high-efficiency, low-noise coaxial single-photon lidar system that can realize 3D imaging at up to 201.5 km with as few as 0.44 signal photons per pixel (PPP). This is achieved by developing optimized transceiver optics, a low-noise InGaAs/InP single-photon avalanche diode (SPAD) detector, and an efficient noise-suppression technique. The validity of different photon-efficient imaging algorithms [30–33] is verified under this extremely low-light-flux condition. Furthermore, we demonstrate photon-efficient laser ranging over 100 kilometers, and show that the range of noncooperative targets can be accurately measured with only 4.02 signal photon counts. The demonstration of imaging and ranging with a small number of photons over a 200-km range is desirable for the general application of target recognition and satellite-based topography.


Fig. 1. Illustration of the long-range active imaging over 201.5 km. Satellite image of the experiment implemented near the city of Urumqi, China, where the single-photon lidar is placed at a temporary laboratory in the wild. (a) Visible-band photograph of the mountains taken by a standard astronomical camera equipped with a telescope. The elevation is approximately 4500 m. (b) Schematic diagram of the experimental setup: SM, scanning mirror; Cam, camera; M, mirror; PERM, 45° perforated mirror; SPAD, single-photon avalanche diode; MMF, multimode fiber; Col, collimator; AOM, acousto-optic modulator; F, spectral filter; FF, fiber spectral filter; and SMF, single-mode fiber. (c) Photograph of the setup hardware, including the optical system (top and bottom left) and the electronic control system (bottom right). (d) View of the temporary laboratory at an altitude of 1770 m. The elevation profile of the terrain between the setup and the targets is shown in Supplement 1, Fig. S1.


2. SINGLE-PHOTON LIDAR SETUP

As shown in Fig. 1, the single-photon lidar system primarily uses commercial off-the-shelf devices that operate at room temperature: the source is a standard fiber laser and the detector is a compact, low-noise InGaAs/InP SPAD. The transceiver consists of a commercial Cassegrain telescope with a modest aperture of 280 mm and a custom-built integrated optical platform mounted on a homemade two-axis rotating stage. To obtain high atmospheric transmittance and low solar background at eye-safe wavelengths, we choose a near-infrared (NIR) wavelength of 1550 nm. For the source, we employ an all-fiber pulsed erbium-doped fiber laser (600 ps pulse width and 500 kHz clock rate); microjoule pulses (with a maximum laser power of 600 mW) and an expanded beam (7 cm diameter at the telescope output) are adopted so that the system falls within laser eye-safety thresholds [36]. The laser is coupled into a single-mode fiber before transmission to control the divergence of the illumination beam.

To balance ambient-light rejection against coupling efficiency, the returned photons are coupled into a multimode fiber with an appropriate core diameter (62.5 µm) for high-efficiency collection. Similar to previous single-photon lidar systems [11,12,16], our setup uses a coaxial scanning design (see Fig. 1) for the transmit and receive optical paths, rather than a conventional dual-telescope configuration [24]. This configuration keeps the transmitted and received spots precisely aligned on the target over varying distances, and facilitates fine scanning for high-resolution imaging. High-precision scanning is implemented by a closed-loop, coplanar, dual-axis piezo tip-tilt platform along both the $x$ and $y$ axes.

3. OPTICAL TECHNIQUES TAILORED FOR LONG RANGES

To facilitate 3D imaging over ranges beyond tens of kilometers, we develop high-efficiency optical devices and an efficient noise-suppression method to improve the collection efficiency and reduce the background noise. To the best of our knowledge, the SBR of our system is much higher than that of previous works [16,17]. A detailed performance comparison is given in Table S1 and Fig. S4 of Supplement 1.

A. Improving the Collection Efficiency

First, we develop an InGaAs/InP SPAD [37,38] with a detection efficiency of 19.3%, a timing jitter of ${\sim}{180}\;{\rm ps}$, and a dark count rate (DCR) as low as 0.1 kHz. We use a lightweight (2 kg), low-power (60 W peak power) thermoacoustic cooler to cool the negative-feedback avalanche diode (NFAD) devices down to 173 K to achieve a low DCR [37]. Moreover, the NFAD is coupled to a multimode fiber with a core diameter of 62.5 µm to enhance the collection efficiency. In the experiment, the SPAD is optimized to a DCR of ${\sim}{100}$ cps at 19.3% photon detection efficiency (PDE). Owing to the low operating temperature of the SPAD, the DCR is about 20 times lower than that of previous long-range single-photon lidar systems [16,17]. Second, the telescope is coated for high transmission at 1550 nm. The coating materials, ${{\rm SiO}_2}$ and ${{\rm Ta}_2}{{\rm O}_5}$, are deposited on the lenses by ion-beam-assisted deposition. Compared with an ordinary commercial telescope coated for visible light, the total transmission is nearly doubled (from about 50% to 95%). Finally, the transmitting divergence angle and the receiving field of view (FoV) are designed to be 17.8 µrad and 11.2 µrad, respectively. These angles are roughly half of those used in previous works [16,17], and they are close to the diffraction limit of the telescope aperture. Note that the smaller FoV reduces the ambient light by about 75% and also provides increased resolution. Moreover, it yields narrow photon clusters in the temporal domain, thus mitigating the issue of multiple returns [17,33,39].
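As a rough consistency check on these design choices, the sketch below computes the diffraction-limited divergence of a 280 mm aperture at 1550 nm and the footprints that the quoted transmit divergence and receive FoV project onto a target at 200 km. The factor-of-two comparison with earlier systems is taken from the text above; the assumption that the sky background scales with the square of the FoV (its solid angle) is standard but not stated explicitly here.

```python
# Illustrative sketch: diffraction limit, target footprints, and the ambient
# background reduction expected from halving the receive FoV.
wavelength = 1550e-9        # m
aperture = 0.28             # m, telescope diameter
divergence_tx = 17.8e-6     # rad, transmit divergence (from the text)
fov_rx = 11.2e-6            # rad, receive field of view (from the text)
fov_previous = 2 * fov_rx   # earlier systems used roughly twice the FoV

diffraction_limit = 1.22 * wavelength / aperture    # Airy-disk angular radius
print(f"diffraction limit       ~ {diffraction_limit * 1e6:.1f} urad")

R = 200e3                                           # m, target range
print(f"transmit footprint      ~ {divergence_tx * R:.1f} m at 200 km")
print(f"receive FoV footprint   ~ {fov_rx * R:.1f} m at 200 km")

# Collected sky background scales with the FoV solid angle (~ angle squared),
# so halving the FoV removes about 75% of the ambient light.
reduction = 1 - (fov_rx / fov_previous) ** 2
print(f"ambient light reduction ~ {reduction:.0%}")
```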


Fig. 2. Schematic diagram of the noise-suppression technique. (a) The arbitrary function generator (AFG) triggers the laser and the delayer. An acousto-optic modulator (AOM) is used to block the ASE noise when the laser is triggered off. The delayer provides electronic signals (E-signal) to the SPAD and the AOM. (b) The timing diagram of one operational period $T$. During the emission mode $R$, the laser pulses are emitted, and the AOM is on to allow pulse emission. At the end of $R$, the AOM is switched off to block the ASE noise. After an isolation period $W$, the SPAD is gated on to detect the back-reflected single photons for a detection period $D$.


B. Noise-Suppression Technique

In long-range lidar, the number of echo signal photons is highly limited [27–32], which imposes a stringent requirement on the system’s background noise level. Using the low-noise SPAD, we characterize the background noise and find that it mainly comes from two types of backscattering. The first is backscatter of the laser pulse from the near-field atmosphere and the common transmit/receive optical elements; in particular, the atmospheric scattering persists for a long time after each pulse emission. The second is backscatter of the amplified spontaneous emission (ASE) noise. The ASE noise, which is unavoidable because of the required optical amplification, occurs over the full temporal span of the detection. (See Supplement 1 for details.) Neither type of noise can be removed by a conventional gating operation of the SPAD [11,12,16].

Instead, we develop an efficient temporal-filtering approach for noise suppression. In our approach, shown in Fig. 2, we temporally separate the emission mode ($R$) from the detection mode ($D$), and employ an additional high-extinction-ratio acousto-optic modulator (AOM) to switch rapidly between these two modes. In the emission mode, the laser pulses are triggered at a high repetition rate while the SPAD is turned off by electrical gating. In the detection mode, the laser stops emitting pulses and the SPAD is turned on. This eliminates the local noise produced during pulse emission (originally 2.4 nW coupled into the SPAD), reducing the noise by a factor of 100. Note that even when the laser pulses are not triggered during the detection mode, the optical amplifier still contributes ASE noise (${\sim}{10^9}$ photons/s coupled). Therefore, we use the AOM to isolate the ASE noise with an extinction ratio of about 57 dB. We also insert a transition period $W$ (Fig. 2) between the emission mode and the detection mode to further suppress the near-field atmospheric backscattering. For the phase settings ($R$, $W$, $D$), taking 200 km imaging as an example, the round-trip flight time of the photons is 1.34 ms, so we set the emission mode to the interval [0, 1.2 ms] and the detection mode to [1.3 ms, 2.5 ms] to achieve optimized efficiency.
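The sketch below illustrates how the phase durations ($R$, $W$, $D$) can be derived from the target range so that the SPAD is gated on only around the expected photon arrival time. The 2.5 ms period matches the 200 km example above; the guard margin is an assumed illustrative value rather than the exact timing used in the system.

```python
# Illustrative sketch (guard margin is an assumption): derive the
# emission/isolation/detection phases (R, W, D) of one noise-suppression
# cycle from the target range.
C = 2.998e8  # speed of light, m/s

def timing_phases(range_m, period_s=2.5e-3, guard_s=0.1e-3):
    """Return (R, W, D) so that the laser/AOM are off and the SPAD is on
    when the round-trip returns are expected to arrive."""
    round_trip = 2 * range_m / C              # photon time of flight
    emission = round_trip - guard_s           # R: stop emitting before returns arrive
    wait = guard_s                            # W: AOM already off, SPAD still off
    detection = period_s - emission - wait    # D: SPAD gated on, laser idle
    return emission, wait, detection

R_phase, W_phase, D_phase = timing_phases(200e3)
print(f"round trip ~ {2 * 200e3 / C * 1e3:.2f} ms")
print(f"R = {R_phase*1e3:.2f} ms, W = {W_phase*1e3:.2f} ms, D = {D_phase*1e3:.2f} ms")
```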

The total noise count rate is quantified to be about 0.4 kHz, which is at least 50 times lower than in previous works [16,17], as shown in Supplement 1, Fig. S3. This improvement is mainly due to three factors: the time-gating approach, the low-noise SPAD, and the smaller FoV. The ultralow noise is the key enabling feature for imaging and ranging over hundreds of kilometers.

4. RESULTS

Using our system, we perform an in-depth study of imaging a variety of natural scenes over long ranges. Most of the experiments were performed at night in a field environment in the wild near the city of Urumqi, China. Note that the system can also image in the daytime, as shown in Supplement 1, Fig. S2. In our experiment, we perform blind lidar measurements without any prior information about the absolute temporal location of the returned signals, and the SPAD is free-running during the detection mode. Our imaging captures the relative depth of the scene, whereas standard laser ranging measures the absolute distance. Depth maps of targets up to 201.5 km away were reconstructed with ${\sim}{1}$ signal PPP and an SBR as low as 0.04. In our experiment, the emitted laser power is fixed at 600 mW, and the targets are raster-scanned with different dwell times (depending on the imaging range) to form 3D images. Below, we present three representative results; further results are shown in Supplement 1, Fig. S3.
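A minimal example of the per-pixel processing involved is sketched below: the free-running detection timestamps are folded by the laser period into a TCSPC histogram, the return peak is located, and the signal counts, SBR, and relative depth are estimated. The bin width, window size, and synthetic data are illustrative assumptions and do not reproduce the actual processing pipeline.

```python
# Illustrative per-pixel sketch (not the paper's pipeline): fold timestamps,
# find the return peak, and estimate signal counts, SBR, and relative depth.
import numpy as np

def analyze_pixel(timestamps_s, laser_period_s, bin_s=1e-9, window_bins=4):
    t = np.mod(np.asarray(timestamps_s), laser_period_s)   # fold into one period
    n_bins = int(round(laser_period_s / bin_s))
    hist, edges = np.histogram(t, bins=n_bins, range=(0.0, laser_period_s))

    peak = int(np.argmax(hist))                             # coarse peak search
    lo, hi = max(0, peak - window_bins), min(n_bins, peak + window_bins + 1)
    in_window = hist[lo:hi].sum()

    # Background estimated from bins outside the signal window
    bg_per_bin = np.delete(hist, np.arange(lo, hi)).mean()
    signal = in_window - bg_per_bin * (hi - lo)             # background-subtracted
    sbr = signal / max(bg_per_bin * n_bins, 1e-12)          # signal vs. total noise
    depth = 0.5 * 2.998e8 * edges[peak]                     # relative depth (m), modulo ambiguity
    return signal, sbr, depth

# Synthetic pixel: 5 signal photons near 0.4 us plus 40 uniformly spread noise counts
rng = np.random.default_rng(0)
period = 2e-6                                               # 500 kHz laser
noise = rng.uniform(0.0, period, 40)
sig = 0.4e-6 + rng.normal(0.0, 0.3e-9, 5)
print(analyze_pixel(np.concatenate([noise, sig]), period))
```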


Fig. 3. Reconstruction results for a tower over 9.8 km. (a) Real visible-band photo taken with a standard astronomical camera. (b), (c), (e), and (f) The reconstructed depth results from different photon-efficient algorithms: Shin et al. [30], Rapp et al. [31], Li et al. [17], and Lindell et al. [32]. (d) 3D rendering of the result reconstructed by Lindell’s method. The SBR is ${\sim}{15.76}$ and the mean signal PPP is ${\sim}{3.47}$. Note that the ground truth used for the PSNR calculation is generated by Lindell’s method from a dataset with 35 signal PPP.


Fig. 4. Reconstruction results for a scene over 124.0 km. (a) Real visible-band photo. (b) The reconstructed depth result by Lindell et al. 2018 [32] for the data with SBR ${\sim}\;{0.51}$ and mean signal PPP ${\sim}\;{3.86}$. (c) A 3D profile of the reconstructed depth map.


Fig. 5. Reconstruction results of a scene over 201.5 km. (a) Real visible-band photo. (b) The reconstructed depth result by Lindell et al. in 2018 [32] for the data with SBR ${\sim}\;{0.04}$ and mean signal PPP ${\sim}\;{3.58}$. (c) A 3D profile of the reconstructed result.


First, to demonstrate the capability of high-resolution 3D imaging, we image a tower head over a range of about 9.8 km with ${160} \times {160}\;{\rm pixels}$ and an acquisition time of 1.5 ms per pixel. The results are shown in Fig. 3. Our setup permits a rather high SBR of up to 15.76 (31.80) for the whole picture (for the non-empty pixels), which is about 50 times higher than previous results [16,17]. The fine details of the target are clearly reconstructed with a signal level of about 3.47 (7.01) signal PPP. From the experimental data, we verify various photon-efficient imaging algorithms [17,30–32], as exhibited in Figs. 3(b), 3(c), 3(e), and 3(f), where the quantitative performance of each algorithm is given in terms of the peak signal-to-noise ratio (PSNR). The results show that, despite different performances, most of the algorithms are effective at resolving the 3D shape, even in low light. For the algorithms based on convex optimization [17,30,31], the parameters are the same as those reported in previous research. For a fair comparison, the regularizer (TV norm) is set to 0.1. For the learning-based approach [32], we import the pretrained CNN directly and apply it to the raw data. The processing times on a standard laptop are 39.60 s [30], 77.54 s [31], 112.05 s [17], and 9.60 s [32], respectively.
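For reference, a simple way to score a reconstructed depth map against the high-flux "ground truth" reconstruction is the PSNR over the depth values. The exact normalization behind the reported PSNRs is not specified here, so the definition below, which takes the peak as the dynamic range of the reference map, is an assumption.

```python
# Illustrative PSNR definition for depth maps (the paper's exact normalization
# may differ): peak is taken as the dynamic range of the reference map.
import numpy as np

def depth_psnr(depth_est, depth_ref):
    """PSNR (dB) between an estimated and a reference depth map."""
    est = np.asarray(depth_est, dtype=float)
    ref = np.asarray(depth_ref, dtype=float)
    mse = np.mean((est - ref) ** 2)
    peak = ref.max() - ref.min()              # dynamic range of the reference
    return 10.0 * np.log10(peak ** 2 / mse) if mse > 0 else np.inf

# Toy usage: a noisy estimate of a synthetic 160 x 160 depth ramp spanning 3 m
ref = np.tile(np.linspace(0.0, 3.0, 160), (160, 1))
est = ref + np.random.default_rng(1).normal(0.0, 0.05, ref.shape)
print(f"PSNR ~ {depth_psnr(est, ref):.1f} dB")
```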

Next, we choose a mountain scene 124.0 km away. Figure 4(a) shows the visible-band photograph taken by a standard astronomical camera (ASI294MC) equipped with a telescope. We scan ${256} \times {256}\;{\rm pixels}$ with an acquisition time of 32.0 ms per pixel, and obtain an SBR of 0.33 (0.51) and an average signal PPP of 2.91 (3.86) for the whole picture (or the non-empty pixels). A depth map reconstructed with the algorithm of [32] and a 3D profile are shown in Figs. 4(b) and 4(c).

To demonstrate long-range imaging, we choose another target 201.5 km away. The imaging was performed in clear weather with a visibility of about 100 km. Figure 5(a) shows the visible-band photograph. We raster-scan the target with ${320} \times {512}\;{\rm pixels}$ at an acquisition time of 189.7 ms per pixel. For the observed data, the SBR is 0.03 (0.04) and the average signal PPP is 2.23 (3.58) for the whole picture (or the non-empty pixels). A depth map reconstructed by [32] and a 3D profile are shown in Figs. 5(b) and 5(c). In Supplement 1, Fig. S2, we further show that our lidar system can resolve the 3D details of the long-range scene even with 0.44 signal PPP.

Note that the reconstructed depth images provide relative depth information. As a complement, our lidar system can also perform long-range laser ranging [23,40–42], which provides accurate absolute distances. To demonstrate this capability, we perform laser ranging over hundreds of kilometers through the terrestrial atmosphere. Lidar systems with a high-repetition-rate laser are favored for laser ranging [5], but the short laser period causes range ambiguity in long-range applications. Previous works have proposed solutions for ranging over tens of kilometers [43,44]. Using our low-noise single-photon lidar system, we report accurate laser ranging over hundreds of kilometers with a small number of signal returns. This is achieved by measurements with multiple repetition rates [45] and a photon-efficient approach for distance estimation (see Supplement 1). Figure 6(a) shows the results for laser ranging of a retroreflector over 113.6 km, where the system’s timing jitter is characterized to be 600 ps full width at half-maximum (FWHM), as shown in Fig. 6(b). In Fig. 6(c), we show the ranging results with three different repetition rates for a noncooperative mountain scene. With an acquisition time of 1.5 s for each measurement and a total of 30 measurements, we obtain an absolute distance of 163.337 km with a precision of 3.5 cm. This precision is mainly limited by the FoV footprint on the target, which may cover multiple nonsmooth depths in each measurement, as shown in Supplement 1. Finally, in Fig. 6(d), we demonstrate photon-efficient ranging by measuring a mountain top with a total of only ${\sim}{4.04}$ signal photon counts (within a total acquisition time of 50 ms). The absolute distance is measured to be 162.476 km. This result shows that our system is capable of laser ranging at 20 Hz over a hundred kilometers using only a small number of photons.
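The ambiguity resolution with multiple repetition rates [45] can be sketched as a one-dimensional search for the absolute range whose predicted folded delays match the measured ones at every rate. The repetition rates, search grid, and brute-force estimator below are illustrative assumptions, not the exact procedure used in the experiment.

```python
# Illustrative sketch of multi-repetition-rate range disambiguation:
# pick the candidate range whose folded (modulo-period) delays best match
# the measurements at all repetition rates.
import numpy as np

C = 2.998e8  # speed of light, m/s

def resolve_range(folded_delays_s, rep_rates_hz, max_range_m=300e3, step_m=0.5):
    candidates = np.arange(0.0, max_range_m, step_m)
    cost = np.zeros_like(candidates)
    for tau, f in zip(folded_delays_s, rep_rates_hz):
        period = 1.0 / f
        predicted = np.mod(2 * candidates / C, period)
        diff = np.abs(predicted - tau)
        cost += np.minimum(diff, period - diff) ** 2   # circular time difference
    return candidates[np.argmin(cost)]

# Toy usage: a 163.337 km target observed with three (assumed) repetition rates
true_range = 163_337.0
rates = (500.0e3, 499.5e3, 499.1e3)                    # Hz; combined ambiguity >> 300 km
folded = [np.mod(2 * true_range / C, 1.0 / f) for f in rates]
print(f"estimated range ~ {resolve_range(folded, rates) / 1e3:.3f} km")
```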


Fig. 6. Ranging over 100 kilometers. (a) Histograms of raw data from a retroreflector at absolute distance $L = 113.6\;{\rm km}$ with $T = 1.5\;{\rm ms}$ and $R = 0.7\;{\rm ms}$. (b) The shape of the echo signal peak, from which the system jitter is assessed to be ${\rm FWHM} = {600}\;{\rm ps}$. (c) Histogram of the ranging experiment over 163.3 km with three different repetition rates for a noncooperative mountain scene. (d) Histogram of the measured signals in the photon-efficient ranging experiment, where the average number of signal photon counts is about 4.04.


5. DISCUSSION AND CONCLUSION

Although space-borne laser altimetry based on single-photon lidar, such as ATLAS on ICESat-2 [24], has been reported, it operates with a large FoV and a wide separation between laser beams. In contrast, we focus on single-photon lidar for target recognition through the terrestrial atmosphere, where one important consideration is high imaging resolution. Hence, our system adopts a small FoV and fine scanning for high-resolution imaging. Moreover, for satellite-based laser altimeters, the equivalent vertical thickness of the atmosphere is about 5–10 km (i.e., most of the light path is in vacuum). In contrast, our experiment is operated over the Earth’s atmosphere, where the main challenges are the strong atmospheric attenuation and the ambient noise from near-field atmospheric backscattering. To overcome these issues, we enhance the system efficiency and adopt a new time-gating scheme to efficiently suppress the background noise, as discussed in Section 3, achieving a high SBR (see Supplement 1, Table S1). Furthermore, we demonstrate accurate 3D imaging with a small number of photons (i.e., about one photon per pixel). This sensitivity outperforms previous space-borne lidars. Nevertheless, space-borne lidars employ an array of multiple beamlets, which offers higher data collection rates and easier operation over much longer distances.

To sum up, we experimentally demonstrate single-photon 3D imaging and ranging at up to 201.5 km through the terrestrial atmosphere with a small number of photons. We have proposed an effective noise-reduction solution that yields a single-photon lidar with a high SBR. Our system adopts a compact design with a low-power laser and commercial off-the-shelf components, and our techniques are photon efficient. We believe the photon-efficient algorithms and the low-noise techniques developed here could facilitate their adoption in future multibeam single-photon lidar systems with Geiger-mode SPAD arrays for rapid remote sensing [3,26]. The developed techniques are applicable to other wavelengths (e.g., 532 nm), which offer water-penetration capability. By adopting a high-power laser and a high-efficiency SPAD array, multibeam single-photon lidar may also enable fast imaging without scanning. Overall, our results may provide enhanced methods for low-power single-photon lidar mounted on low-Earth-orbit satellites [24] or nanosatellites, as a complement to traditional imaging, for high-resolution active imaging and sensing over long ranges.

Funding

National Key Research and Development Program of China (2018YFB0504300); National Natural Science Foundation of China (62031024, 61771443); Shanghai Municipal Science and Technology Major Project (2019SHZDZX01); Key-Area Research and Development Program of Guangdong Province (2020B0303020001); Anhui Initiative in Quantum Information Technologies; Shanghai Science and Technology Development Foundation (18JC1414700).

Disclosures

The authors declare no competing financial interests.

Data availability

The processing code and data are available on GitHub [46].

Supplemental document

See Supplement 1 for supporting content.

REFERENCES

1. J. J. Degnan, “Unified approach to photon-counting microlaser rangers, transponders, and altimeters,” Surv. Geophys. 22, 431–447 (2001). [CrossRef]  

2. J. J. Degnan, “Photon-counting multikilohertz microlaser altimeters for airborne and spaceborne topographic measurements,” J. Geodyn. 34, 503–549 (2002). [CrossRef]  

3. R. M. Marino and W. R. Davis, “Jigsaw: a foliage-penetrating 3D imaging laser radar system,” Linc. Lab. J. 15, 23–36 (2005).

4. R. H. Hadfield, “Single-photon detectors for optical quantum information applications,” Nat. Photonics 3, 696–705 (2009). [CrossRef]  

5. G. Buller and A. Wallace, “Ranging and three-dimensional imaging using time-correlated single-photon counting and point-by-point acquisition,” IEEE J. Sel. Top. Quantum Electron. 13, 1006–1015 (2007). [CrossRef]  

6. F. Zappa, S. Tisa, A. Tosi, and S. Cova, “Principles and features of single-photon avalanche diode arrays,” Sens. Actuators A, Phys. 140, 103–112 (2007). [CrossRef]  

7. B. Sun, M. P. Edgar, R. Bowman, L. E. Vittert, S. Welsh, A. Bowman, and M. Padgett, “3D computational imaging with single-pixel detectors,” Science 340, 844–847 (2013). [CrossRef]  

8. F. Villa, R. Lussana, D. Bronzi, S. Tisa, A. Tosi, F. Zappa, A. Dalla Mora, D. Contini, D. Durini, S. Weyers, and W. Brockherde, “CMOS imager with 1024 SPADS and TDCS for single-photon timing and 3-D time-of-flight,” IEEE J. Sel. Top. Quantum Electron. 20, 364–373 (2014). [CrossRef]  

9. G. Gariepy, N. Krstajić, R. Henderson, C. Li, R. R. Thomson, G. S. Buller, B. Heshmat, R. Raskar, J. Leach, and D. Faccio, “Single-photon sensitive light-in-fight imaging,” Nat. Commun. 6, 6021 (2015). [CrossRef]  

10. M. Laurenzis, F. Christnacher, and D. Monnin, “Long-range three-dimensional active imaging with superresolution depth mapping,” Opt. Lett. 32, 3146–3148 (2007). [CrossRef]  

11. A. McCarthy, R. J. Collins, N. J. Krichel, V. Fernández, A. M. Wallace, and G. S. Buller, “Long-range time-of-flight scanning sensor based on high-speed time-correlated single-photon counting,” Appl. Opt. 48, 6241–6251 (2009). [CrossRef]  

12. A. McCarthy, N. J. Krichel, N. R. Gemmell, X. Ren, M. G. Tanner, S. N. Dorenbos, V. Zwiller, R. H. Hadfield, and G. S. Buller, “Kilometer-range, high resolution depth imaging via 1560 nm wavelength single-photon detection,” Opt. Express 21, 8904–8915 (2013). [CrossRef]  

13. H. Zhou, Y. He, L. You, S. Chen, W. Zhang, J. Wu, Z. Wang, and X. Xie, “Few-photon imaging at 1550 nm using a low-timing-jitter superconducting nanowire single-photon detector,” Opt. Express 23, 14603–14611 (2015). [CrossRef]  

14. M. Laurenzis, J. Klein, E. Bacher, and N. Metzger, “Multiple-return single-photon counting of light in flight and sensing of non-line-of-sight objects at shortwave infrared wavelengths,” Opt. Lett. 40, 4815–4818 (2015). [CrossRef]  

15. Z. Li, E. Wu, C. Pang, B. Du, Y. Tao, H. Peng, H. Zeng, and G. Wu, “Multi-beam single-photon-counting three-dimensional imaging lidar,” Opt. Express 25, 10189–10195 (2017). [CrossRef]  

16. A. M. Pawlikowska, A. Halimi, R. A. Lamb, and G. S. Buller, “Single-photon three-dimensional imaging at up to 10 kilometers range,” Opt. Express 25, 11919–11931 (2017). [CrossRef]  

17. Z.-P. Li, X. Huang, Y. Cao, B. Wang, Y.-H. Li, W. Jin, C. Yu, J. Zhang, Q. Zhang, C.-Z. Peng, F. Xu, and J.-W. Pan, “Single-photon computational 3d imaging at 45 km,” Photon. Res. 8, 1532–1540 (2020). [CrossRef]  

18. X. Ren, P. W. Connolly, A. Halimi, Y. Altmann, S. McLaughlin, I. Gyongy, R. K. Henderson, and G. S. Buller, “High-resolution depth profiling using a range-gated CMOS SPAD quanta image sensor,” Opt. Express 26, 5541–5557 (2018). [CrossRef]  

19. S. Chan, A. Halimi, F. Zhu, I. Gyongy, R. K. Henderson, R. Bowman, S. McLaughlin, G. S. Buller, and J. Leach, “Long-range depth imaging using a single-photon detector array and non-local data fusion,” Sci. Rep. 9, 1–10 (2019). [CrossRef]  

20. J. Tachella, Y. Altmann, N. Mellado, A. McCarthy, R. Tobin, G. S. Buller, J.-Y. Tourneret, and S. McLaughlin, “Real-time 3D reconstruction from single-photon lidar data using plug-and-play point cloud denoisers,” Nat. Commun. 10, 4984 (2019). [CrossRef]  

21. Z.-P. Li, X. Huang, P.-Y. Jiang, Y. Hong, C. Yu, Y. Cao, J. Zhang, F. Xu, and J.-W. Pan, “Super-resolution single-photon imaging at 8.2 kilometers,” Opt. Express 28, 4076–4087 (2020). [CrossRef]  

22. Y. Altmann, S. McLaughlin, M. J. Padgett, V. K. Goyal, A. O. Hero, and D. Faccio, “Quantum-inspired computational imaging,” Science 361, eaat2298 (2018). [CrossRef]  

23. J. O. Dickey, P. Bender, J. Faller, X. Newhall, R. Ricklefs, J. Ries, P. Shelus, C. Veillet, A. Whipple, J. Wiant, J. G. Williams, and C. F. Yoder, “Lunar laser ranging: a continuing legacy of the Apollo program,” Science 265, 482–490 (1994). [CrossRef]  

24. A. Neuenschwander and K. Pitts, “The ATL08 land and vegetation product for the ICESat-2 mission,” Remote Sens. Environ. 221, 247–259 (2019). [CrossRef]  

25. C. L. Glennie, W. E. Carter, R. L. Shrestha, and W. E. Dietrich, “Geodetic imaging with airborne lidar: the earth’s surface revealed,” Rep. Prog. Phys. 76, 086801 (2013). [CrossRef]  

26. J. J. Degnan, “Scanning, multibeam, single photon lidars for rapid, large scale, high resolution, topographic and bathymetric mapping,” Remote Sens. 8, 958 (2016). [CrossRef]  

27. A. Kirmani, D. Venkatraman, D. Shin, A. Colaço, F. N. Wong, J. H. Shapiro, and V. K. Goyal, “First-photon imaging,” Science 343, 58–61 (2014). [CrossRef]  

28. Y. Altmann, X. Ren, A. McCarthy, G. S. Buller, and S. McLaughlin, “Lidar waveform-based analysis of depth images constructed using sparse single-photon data,” IEEE Trans. Image Process. 25, 1935–1946 (2016). [CrossRef]  

29. D. Shin, F. Xu, D. Venkatraman, R. Lussana, F. Villa, F. Zappa, V. K. Goyal, F. N. Wong, and J. H. Shapiro, “Photon-efficient imaging with a single-photon camera,” Nat. Commun. 7, 12046 (2016). [CrossRef]  

30. D. Shin, A. Kirmani, V. K. Goyal, and J. H. Shapiro, “Photon-efficient computational 3-D and reflectivity imaging with single-photon detectors,” IEEE Trans. Comput. Imaging 1, 112–125 (2015). [CrossRef]  

31. J. Rapp and V. K. Goyal, “A few photons among many: Unmixing signal and noise for photon-efficient active imaging,” IEEE Trans. Comput. Imaging 3, 445–459 (2017). [CrossRef]  

32. D. B. Lindell, M. O’Toole, and G. Wetzstein, “Single-photon 3D imaging with deep sensor fusion,” ACM Trans. Graph. 37, 113 (2018). [CrossRef]  

33. J. Tachella, Y. Altmann, X. Ren, A. McCarthy, G. S. Buller, S. McLaughlin, and J.-Y. Tourneret, “Bayesian 3D reconstruction of complex scenes from single-photon lidar data,” SIAM J. Imaging Sci. 12, 521–550 (2019). [CrossRef]  

34. J. Peng, Z. Xiong, X. Huang, Z.-P. Li, D. Liu, and F. Xu, “Photon-efficient 3D imaging with a non-local neural network,” in European Conference on Computer Vision (ECCV), A. Vedaldi, H. Bischof, T. Brox, and J.-M. Frahm, eds. (Springer, 2020), pp. 225–241.

35. W. Wagner, A. Ullrich, V. Ducic, T. Melzer, and N. Studnicka, “Gaussian decomposition and calibration of a novel small-footprint full-waveform digitising airborne laser scanner,” ISPRS J. Photogramm. Remote Sens. 60, 100–112 (2006). [CrossRef]  

36. American National Standards Institute, “Safe use of lasers,” ANSI Z136.1-2014 (2000).

37. C. Yu, M. Shangguan, H. Xia, J. Zhang, X. Dou, and J. W. Pan, “Fully integrated free-running InGaAs/InP single-photon detector for accurate lidar applications,” Opt. Express 25, 14611–14620 (2017). [CrossRef]  

38. B. Korzh, N. Walenta, T. Lunghi, N. Gisin, and H. Zbinden, “Free-running InGaAs single photon detector with 1 dark count per second at 10% efficiency,” Appl. Phys. Lett. 104, 081108 (2014). [CrossRef]  

39. S. Hernandez-Marin, A. M. Wallace, and G. J. Gibson, “Bayesian analysis of lidar signals with multiple returns,” IEEE Trans. Pattern Anal. Mach. Intell. 29, 2170–2180 (2007). [CrossRef]  

40. M. Wilkinson, U. Schreiber, I. Procházka, C. Moore, J. Degnan, G. Kirchner, Z. Zhongping, P. Dunn, V. Shargorodskiy, M. Sadovnikov, C. Courde, and H. Kunimori, “The next generation of satellite laser ranging systems,” J. Geodes. 93, 2227–2247 (2019). [CrossRef]  

41. J. J. Degnan, “Laser transponders for high-accuracy interplanetary laser ranging and time transfer,” in Lasers, Clocks and Drag-Free Control (Springer, 2008), pp. 231–242.

42. J. J. Degnan, “Asynchronous laser transponders for precise interplanetary ranging and time transfer,” J. Geodyn. 34, 551–594 (2002). [CrossRef]  

43. B. Du, C. Pang, D. Wu, Z. Li, H. Peng, Y. Tao, E. Wu, and G. Wu, “High-speed photon-counting laser ranging for broad range of distances,” Sci. Rep. 8, 4198 (2018). [CrossRef]  

44. N. J. Krichel, A. McCarthy, and G. S. Buller, “Resolving range ambiguity in a photon counting depth imager operating at kilometer distances,” Opt. Express 18, 9192–9206 (2010). [CrossRef]  

45. Y. Liang, J. Huang, M. Ren, B. Feng, X. Chen, E. Wu, G. Wu, and H. Zeng, “1550-nm time-of-flight ranging system employing laser with multiple repetition rates for reducing the range ambiguity,” Opt. Express 22, 4662–4670 (2014). [CrossRef]  

46. Z.-P. Li, J.-T. Ye, X. Huang, P.-Y. Jiang, Y. Cao, Y. Hong, C. Yu, J. Zhang, Q. Zhang, C.-Z. Peng, F. Xu, and J.-W. Pan, “Quantum-inspired lidar/Long-range single-photon imaging over 200 km,” GitHub (2021) https://github.com/quantum-inspired-lidar/Long-range-single-photon-imaging-over-200-km.

