
Long-range remote focusing by image-plane aberration correction

Open Access

Abstract

Laser scanning plays an important role in a broad range of applications. Toward 3D aberration-free scanning, a remote focusing technique has been developed for high-speed imaging applications. However, the implementation of remote focusing often suffers from a limited axial scan range as a result of unknown aberration. Through simple analysis, we show that the sample-to-image path length conservation is crucially important to the remote focusing performance. To enhance the axial scan range, we propose and demonstrate an image-plane aberration correction method. Using a static correction, we can effectively improve the focus quality over a large defocusing range. Experimentally, we achieved ∼three times greater defocusing range than that of conventional methods. This technique can broadly benefit the implementations of high-speed large-volume 3D imaging.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Laser scanning has been employed in many key technologies. In manufacturing [1], laser scanning based cutting has been routinely used to achieve high spatial precision. In 3D printing [2], laser scanning based methods are employed for printing metal structures. In hospitals, laser scanning has been applied to vision correction [3] and novel endoscopes [4]. In biomedical sciences, a variety of 3D imaging technologies are based on laser scanning [5–10]. In remote sensing, laser scanning enables LIDAR technology [11]. Therefore, the advances of laser scanning techniques are crucially important to many disciplines.

The commonly employed laser scanners such as galvo, MEMS mirror, and polygon scanners can only implement fast 2D transverse scanning [12]. The axial scanning can be implemented by sample translation, lens translation, or mechanical lens focus adjustment, which are rather slow compared to the transverse scanning. For fast 3D scanning applications, a variety of novel solutions have been developed, including liquid lenses [13,14], ultrasonic lenses [15,16], wavefront control [17–19], and remote focusing [20,21]. Liquid lenses are simple and of low cost. Their working principle is based on lens surface deformation, which allows a scanning rate of ∼100 Hz. For high-resolution and high-quality imaging, the perfect defocusing wavefront is defined by nd cos θ, where n is the refractive index, d is the defocusing distance, and θ is the angle of incidence [20]. As objective lenses are designed and manufactured to meet the Abbe sine condition (pupil-plane distance to the optical axis proportional to sin θ), the pupil-plane spatial profile of the defocusing wavefront is uniquely defined for a given maximum half cone angle. As a result, liquid-lens-induced defocusing will contain aberration (deviation from the ideal defocusing wavefront) and lead to a reduced Strehl ratio and spatial resolution. Ultrasonic lenses are based on the ultrasound-pressure-wave-driven refractive index change, which is rather fast (e.g. 100 kHz–1 MHz). Similar to liquid lenses, ultrasonic lenses also suffer from aberration [15]. To achieve aberration-free defocusing, holographic wavefront control has been employed [18,19,22–24]. In addition to producing the perfect defocusing wavefront, wavefront control can also correct the existing system aberration and yield a near-diffraction-limited focus [17,25–28]. The defocusing range is often limited by the wavefront control device. For long-range defocusing, spatial light modulators (SLMs) are often employed for their large number of pixels [29,30]. For applications involving femtosecond pulses, the inherent temporal coherence of the laser pulses sets an additional limit on the defocusing range, as SLM-based wavefront control involves phase wrapping [31,32]. The state-of-the-art SLM offers a ∼200 Hz update rate, which allows quick stepwise defocus adjustment. For continuous axial scanning, however, the SLM needs to update the wavefront many times along the scan path and therefore the scanning rate is low.

Toward high-speed aberration-free continuous axial scanning, remote focusing has been developed [21]. The idea is to image the pupil of the proximal objective lens to the pupil of the distal objective with the same maximum half cone angle and thus the same defocusing wavefront profile. A galvo-driven miniature mirror can be positioned near the focal plane of the proximal objective to achieve rapid (up to ∼kHz) focusing control [20,21]. Ideally, remote focusing can achieve aberration-free defocusing until the defocusing wavefront starts to challenge the field-of-view (FOV) supported by the objective lenses. However, the experimentally measured aberration-free defocusing range is often much smaller. In this work, we analyze the source of the aberration and provide a simple solution to increase the axial scan range.
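As a quick numerical illustration of the above argument, the sketch below compares the ideal defocusing wavefront nd cos θ with the parabolic phase of a simple tunable lens on the pupil plane, using the sine-condition mapping. The NA, refractive index, defocusing distance, and wavelength are illustrative assumptions rather than values from this work.

```python
# A minimal sketch comparing the ideal defocusing wavefront n*d*cos(theta)
# with the parabolic phase of a simple tunable lens, mapped onto the pupil
# via the Abbe sine condition (rho ~ sin(theta)). All numerical values are
# illustrative assumptions.
import numpy as np

wavelength = 633e-9          # m
n = 1.33                     # water immersion index, assumed
NA = 0.8
d = 100e-6                   # defocusing distance, assumed
k = 2 * np.pi / wavelength

rho = np.linspace(0.0, 1.0, 512)          # normalized pupil radius
sin_theta = rho * NA / n                  # sine-condition mapping
cos_theta = np.sqrt(1.0 - sin_theta**2)

phi_ideal = k * n * d * cos_theta                   # ideal defocus wavefront
phi_parab = k * n * d * (1.0 - 0.5 * sin_theta**2)  # simple-lens (parabolic) defocus

aberration = phi_ideal - phi_parab        # residual, dominated by spherical-type terms
rms_waves = np.std(aberration) / (2 * np.pi)
print(f"RMS residual of parabolic defocus: {rms_waves:.2f} waves")
```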

2. Methods

The essence of remote focusing is the perfect relay imaging of the defocusing wavefront. A key question to ask is whether common commercially available objective lenses are actually designed for such a task. For microscopy applications, the goal is to image a point emission on the sample plane to the final image plane. Geometrically, all the paths from an emission point (e.g. point 1 in Fig. 1) to its image point (point 1’) need to have identical optical path lengths. Otherwise, the focus will contain aberration. Is this path length a constant for all points within the FOV of the objective lens? For example, are the path lengths 1-1’, 2-2’, and 3-3’ all identical? For most microscopy applications (e.g. bright-field imaging, wide-field fluorescence imaging, confocal microscopy, and two-photon laser scanning microscopy), there is no need to achieve such sample-to-image path length conservation, and it is not considered in most lens designs. However, path length conservation is precisely what we need to achieve aberration-free remote focusing.


Fig. 1. Sample-to-image path length conservation is the key to aberration-free remote focusing.


For simplicity, let us consider a telecentric imaging system (Fig. 1). A plane wave propagating along the optical axis (the central rays of points 1, 2, 3) will be focused by the objective lens into a single point on the pupil plane. Given the telecentricity of the tube lens, this focus will be collimated by the tube lens into a plane wave on the final image plane. In this way, a plane wave is relay imaged to a plane wave. Therefore, the sample-to-image path lengths are conserved. In other words, the path lengths 1-1’, 2-2’, and 3-3’ are all identical for this perfect telecentric imaging system. In a remote focusing system, the defocusing wavefront on the sample plane is first imaged to the intermediate image plane and then to the sample plane of the second objective lens. For a sample-to-image path-length-conserved system, the defocusing wavefront is relay imaged with only an additional phase constant (the sample-to-image optical path length) and is therefore aberration free. However, for non-telecentric imaging systems, the sample-to-image path length may have spatial dependence (e.g. 1-1’ ≠ 2-2’ ≠ 3-3’), which adds an additional phase profile to the relayed defocusing wavefront and causes aberration. The amount of aberration depends on the defocusing distance. For zero defocusing, the optical beam only occupies a diffraction-limited spot (e.g. only point 2 in Fig. 1) on the sample plane and therefore experiences no sample-to-image path length variation. With increased defocusing, the optical beam occupies a larger area on the sample plane and therefore encounters more of the path length variation.
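To make the last point concrete, the following sketch evaluates how the beam footprint on the sample plane, and therefore the amount of sample-to-image path length variation it samples, grows with defocusing. The quadratic path-length-error profile is a hypothetical placeholder, not a measured property of any particular system.

```python
# A minimal numerical sketch of why the aberration grows with defocus: the
# defocused beam footprint on the sample plane widens roughly as
# |d| * tan(theta_max), so it samples more of any field-dependent
# sample-to-image path-length error W(r). The quadratic W used here is a
# hypothetical placeholder.
import numpy as np

n, NA = 1.33, 0.8
theta_max = np.arcsin(NA / n)

def path_length_error(r_um):
    """Hypothetical field-dependent path-length error (in waves) vs field radius."""
    return 0.5 * (r_um / 200.0) ** 2      # 0.5 waves at a 200 um field radius (assumed)

for d_um in (0, 70, 140, 280):
    r_max = abs(d_um) * np.tan(theta_max)          # beam footprint radius on the sample plane (um)
    r = np.linspace(0.0, max(r_max, 1e-6), 256)    # radial cut across the footprint
    w = path_length_error(r)
    print(f"d = {d_um:4d} um: footprint radius ~ {r_max:6.1f} um, "
          f"RMS path error ~ {np.std(w):.3f} waves")
```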

One way to quantify the sample-to-image path length variation, and to assist the optimal alignment of a remote focusing system, is to send a plane wave into the sample plane of the objective lens and to measure the optical wavefront at the image plane of the tube lens. The tube lens position should be adjusted so that the wavefront on the image plane is collimated. The residual deviation from a plane wave becomes the source of aberration (the aforementioned sample-to-image path length variation) in remote focusing. We propose to position a wavefront corrector at the image plane to compensate for the sample-to-image path length variation and improve the range of aberration-free remote focusing.
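For a phase-only corrector, one straightforward way to apply the compensation is to display the negation of the measured residual wavefront, wrapped into one phase period. A minimal sketch, with a random placeholder standing in for the measured residual, is shown below.

```python
# A minimal sketch (assumptions noted) of turning a measured image-plane
# residual wavefront into a corrector pattern: a phase-only device at the
# image plane displays the negated residual, wrapped into [0, 2*pi).
import numpy as np

def image_plane_correction(residual_rad):
    """Return the phase pattern that cancels the measured residual wavefront."""
    return np.mod(-residual_rad, 2 * np.pi)

# Usage with a placeholder residual map; real data would come from the
# image-plane wavefront measurement described above.
residual_rad = np.random.uniform(-1.0, 1.0, size=(600, 800))
pattern = image_plane_correction(residual_rad)
```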

Experimentally, we implemented a remote focusing system with an input NA of 0.8 (Fig. 2). We used a pair of identical objective lenses of NA 1.0 to relay the focus. For detection, we used an NA 1.1 objective lens. The laser wavelength was 633 nm. We used SLM1 to measure and correct the overall system aberration (Fig. 3(a)) only at zero defocusing. The wavefront measurement was based on the parallel phase modulation technique developed by our lab [28,33–35]. Basically, we kept half of the pixels on the SLM static while simultaneously modulating the other half of the pixels, each pixel at a unique frequency, and then retrieved the phase information through the Fourier transform of the focus intensity modulation data. The second SLM was positioned on the common imaging plane (SLM2 in Fig. 2) for sample-to-image path length correction. We performed wavefront measurement only at +280 and −280 μm defocusing (+ means moving the 40x away from the 20x). At either position, the wavefront (Figs. 3(b) and 3(c)) occupied ∼420 μm FOV on the sample plane and 11.7 mm on SLM2 (active area 12 × 16 mm²).
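The following simulation sketches the parallel phase modulation idea described above: a static half of the device provides a reference field while each modulated segment is tagged with a unique frequency, and the Fourier transform of the recorded focus intensity returns each segment's phase. The segment count, frame count, and idealized co-phased reference are assumptions for the demonstration, not the experimental settings.

```python
# A minimal simulation sketch of parallel phase modulation: half of the SLM
# is held static as a reference while each remaining segment is modulated at
# a unique frequency; the FFT of the focus intensity yields each segment's
# relative phase at its tagged frequency bin.
import numpy as np

n_seg = 64                                   # modulated SLM segments (assumed)
n_frames = 4 * n_seg                         # recorded intensity frames (assumed)
rng = np.random.default_rng(0)
true_phase = rng.uniform(-np.pi, np.pi, n_seg)   # unknown aberration per segment

freqs = np.arange(1, n_seg + 1)              # unique frequency index per segment
t = np.arange(n_frames)

ref = 10.0 * n_seg                           # static half modeled as a strong, co-phased reference (idealized)
mod = np.exp(1j * (true_phase[:, None] + 2 * np.pi * freqs[:, None] * t[None, :] / n_frames))
intensity = np.abs(ref + mod.sum(axis=0)) ** 2   # recorded focus intensity trace

# Each segment's phase appears at its tagged frequency in the intensity spectrum.
spectrum = np.fft.fft(intensity)
measured = np.angle(spectrum[freqs])
err = np.angle(np.exp(1j * (measured - true_phase)))
print(f"max phase retrieval error: {np.max(np.abs(err)):.3f} rad")
```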


Fig. 2. Optical configuration for the image-plane correction based long-range remote focusing. L1-L5, telecentric lenses of focal lengths 100, 100, 250, 250, and 200 mm, respectively; SLM 1 and 2, phase-only spatial light modulators (Hamamatsu). All objectives are water-dipping lenses (40x and 25x from Nikon, 20x from Olympus).



Fig. 3. (a) System correction wavefront measured by SLM1. (b, c) Image-plane correction wavefront measured by SLM2 at −280 and 280 μm defocusing, respectively. (d) The phase difference between b and c.


3. Results

With the system correction (Fig. 3(a)) applied, the measured focus size (x = 0.434, y = 0.472, z = 2.64 μm) at zero defocusing was very close to the diffraction-limited value (x = 0.443, y = 0.475, z = 2.16 μm), which was based on the rigorous vector computation [36] for linearly polarized incident light and a pupil filling factor of 1 for NA 0.8 (matching the intensity profile measured at the pupil of the 25x objective lens). Using either correction (Figs. 3(b) and 3(c)), we could achieve a greater high-peak-intensity defocusing range (Fig. 4(a)). For example, the defocusing range for achieving above ∼70% peak intensity (with respect to the zero-defocusing value) was 153 μm without the sample-to-image path length correction. The range was extended to 423 and 338 μm with the correction measured at 280 and −280 μm defocusing applied, respectively. Similarly, the range for achieving above ∼80% peak intensity was only 123 μm without correction. The range was extended to 359 and 215 μm with the correction measured at 280 and −280 μm defocusing applied, respectively.
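The quoted ranges correspond to reading a threshold crossing off the intensity-versus-defocus curve of Fig. 4(a). A small sketch of that bookkeeping, using made-up sample points rather than the measured data, is given below.

```python
# A small sketch of extracting the "above 70% of the zero-defocus peak"
# range from an intensity-versus-defocus curve by linear interpolation.
# The arrays are placeholders, not the data behind Fig. 4(a).
import numpy as np

defocus_um = np.array([-280, -200, -120, -40, 0, 40, 120, 200, 280], float)
peak_int = np.array([0.35, 0.55, 0.75, 0.95, 1.0, 0.96, 0.78, 0.58, 0.38])

def threshold_range(z, i, thr):
    """Total width of the defocus region where the interpolated intensity stays above thr."""
    fine_z = np.linspace(z.min(), z.max(), 5001)
    fine_i = np.interp(fine_z, z, i)
    above = fine_z[fine_i >= thr]
    return above.max() - above.min()

print(f"range above 70% of peak: {threshold_range(defocus_um, peak_int, 0.7):.0f} um")
```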


Fig. 4. (a) Focus peak intensity as a function of defocusing. (b) Image sharpness (the sum of squared pixel intensities divided by the square of the summed pixel intensities) as a function of defocusing. (c-g) Transverse and axial focus profiles measured without path length correction at −280, −160, 0, 160, and 280 μm defocusing, respectively. Scale bar: 1 μm. (h-l) Focus profiles with the path length correction measured at −280 μm defocusing applied. (m-q) Focus profiles with the path length correction measured at 280 μm defocusing applied.

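For reference, the image sharpness metric used in Fig. 4(b) can be written as a one-line function; the implementation below is a straightforward transcription of the definition in the caption.

```python
# Image sharpness as defined in the Fig. 4(b) caption: sum of squared pixel
# intensities divided by the square of the summed intensities. A more
# concentrated (sharper) focus gives a larger value.
import numpy as np

def sharpness(image):
    """Sum(I^2) / (Sum(I))^2 for a 2D intensity image."""
    image = np.asarray(image, dtype=float)
    return np.sum(image**2) / np.sum(image)**2
```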

4. Discussion

The results show that the sample-to-image path length correction was indeed effective in extending the range of the remote focusing system. Specifically, the range for high-quality focusing was increased by ∼300%. We noticed that the focus intensity plot (Fig. 4(a)) was not entirely flat and that the correction measured at one side of the defocusing range was not fully effective on the other side. This was likely due to imperfect optical performance or alignment of the two 20x objectives and the relay lenses (L3 and L4 in Fig. 2). Nevertheless, the achieved axial scan range was already substantial for many imaging applications [37,38].

One concern for high-power laser applications is that the wavefront corrector is positioned on the image plane: at zero defocusing, the laser focus dwells on the device surface. One solution is to fabricate a freeform reflective metal surface using high-precision diamond milling to serve as the wavefront corrector. Without phase wrapping, it will also offer better achromatic performance than an SLM for broadband emission sources. A similar idea is to leverage the advances in freeform glass fabrication [39] to make a transparent wavefront corrector. Both the metal reflector and the glass corrector would greatly enhance the power handling capability.

The defocusing range was measured at a 633 nm wavelength in this work. For applications that utilize longer wavelengths (e.g. ∼930 nm in two-photon laser scanning microscopy), the range will be even longer because the Strehl ratio has approximately a negative exponential dependence on the square of the optical frequency [40]. The longer wavelength (together with the slightly lower refractive index) makes the system inherently more tolerant of the same path-length imperfection.
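This wavelength scaling can be sketched with a Maréchal-type approximation for the Strehl ratio, S ≈ exp[−(2πσ/λ)²], where σ is the residual RMS path-length error; the value of σ below is an arbitrary illustrative choice.

```python
# A small sketch of the Strehl-ratio scaling invoked here: for a fixed RMS
# path-length error sigma, S ~ exp(-(2*pi*sigma/lambda)^2), so the same
# residual aberration costs far less Strehl ratio at 930 nm than at 633 nm.
import numpy as np

sigma_nm = 60.0                               # RMS path-length error, assumed
for wavelength_nm in (633.0, 930.0):
    strehl = np.exp(-(2 * np.pi * sigma_nm / wavelength_nm) ** 2)
    print(f"lambda = {wavelength_nm:.0f} nm: Strehl ~ {strehl:.2f}")
```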

5. Conclusion

In this work, we analyzed the requirements for achieving perfect remote focusing. Sample-to-image path length conservation is crucial to avoid aberration when relaying the defocused wavefront, and it can be satisfied by a perfect telecentric relay system. To compensate for the sample-to-image path length variation, we propose to position a wavefront corrector at the common image plane of the remote focusing system. Experimentally, we increased the defocusing range for high-quality focusing by ∼300% using this method. As high-speed 3D scanning is becoming widely adopted in many disciplines, we expect that the proposed method will find a broad range of applications and inspire innovations.

Funding

National Institutes of Health (RF1MH120005, U01NS094341, U01NS107689).

Acknowledgement

M.C. acknowledges the scientific equipment from HHMI.

Disclosures

The authors declare no conflicts of interest.

References

1. W. M. Steen and J. Mazumder, Laser Material Processing (Springer Science & Business Media, 2010).

2. T. Duda and L. V. Raghavan, “3D metal printing technology,” IFAC-PapersOnLine 49(29), 103–110 (2016). [CrossRef]  

3. K. D. Solomon, L. E. F. De Castro, H. P. Sandoval, J. M. Biber, B. Groat, K. D. Neff, M. S. Ying, J. W. French, E. D. Donnenfeld, and R. L. Lindstrom, “LASIK world literature review: quality of life and patient satisfaction,” Ophthalmology 116(4), 691–701 (2009). [CrossRef]  

4. X. Liu, M. J. Cobb, Y. Chen, M. B. Kimmey, and X. Li, “Rapid-scanning forward-imaging miniature endoscope for real-time optical coherence tomography,” Opt. Lett. 29(15), 1763–1765 (2004). [CrossRef]  

5. A. Rodriguez, D. Ehlenberger, K. Kelliher, M. Einstein, S. C. Henderson, J. H. Morrison, P. R. Hof, and S. L. Wearne, “Automated reconstruction of three-dimensional neuronal morphology from laser scanning microscopy images,” Methods 30(1), 94–105 (2003). [CrossRef]  

6. W. Göbel, B. M. Kampa, and F. Helmchen, “Imaging cellular network dynamics in three dimensions using fast 3D laser scanning,” Nat. Methods 4(1), 73–79 (2007). [CrossRef]  

7. J. Pawley, Handbook of Biological Confocal Microscopy, 3rd ed. (Springer US, 2006).

8. W. Denk and K. Svoboda, “Photon upmanship: Why multiphoton imaging is more than a gimmick,” Neuron 18(3), 351–357 (1997). [CrossRef]  

9. D. M. Huland, K. Charan, D. G. Ouzounov, J. S. Jones, N. Nishimura, and C. Xu, “Three-photon excited fluorescence imaging of unstained tissue using a GRIN lens endoscope,” Biomed. Opt. Express 4(5), 652–658 (2013). [CrossRef]  

10. A. Zumbusch, G. R. Holtom, and X. S. Xie, “Three-dimensional vibrational imaging by coherent anti-Stokes Raman scattering,” Phys. Rev. Lett. 82(20), 4142–4145 (1999). [CrossRef]  

11. C. Weitkamp, Lidar: Range-Resolved Optical Remote Sensing of the Atmosphere (Springer Science & Business Media, 2006), Vol. 102.

12. T. A. Pologruto, B. L. Sabatini, and K. Svoboda, “ScanImage: flexible software for operating laser scanning microscopes,” Biomed. Eng. Online 2(1), 13 (2003). [CrossRef]  

13. H. Ren, D. Fox, P. A. Anderson, B. Wu, and S.-T. Wu, “Tunable-focus liquid lens controlled using a servo motor,” Opt. Express 14(18), 8031–8036 (2006). [CrossRef]  

14. B. F. Grewe, F. F. Voigt, M. van’t Hoff, and F. Helmchen, “Fast two-layer two-photon imaging of neuronal cell populations using an electrically tunable lens,” Biomed. Opt. Express 2(7), 2035–2046 (2011). [CrossRef]  

15. L. Kong, J. Tang, J. P. Little, Y. Yu, T. Laemmermann, C. P. Lin, R. N. Germain, and M. Cui, “Continuous volumetric imaging via an optical phase-locked ultrasound lens,” Nat. Methods 12(8), 759–762 (2015). [CrossRef]  

16. A. Mermillod-Blondin, E. McLeod, and C. B. Arnold, “High-speed varifocal imaging with a tunable acoustic gradient index of refraction lens,” Opt. Lett. 33(18), 2146–2148 (2008). [CrossRef]  

17. J.-H. Park, L. Kong, Y. Zhou, and M. Cui, “Large-field-of-view imaging by multi-pupil adaptive optics,” Nat. Methods 14(6), 581–583 (2017). [CrossRef]  

18. A. Vaziri and V. Emiliani, “Reshaping the optical dimension in optogenetics,” Curr. Opin. Neurobiol. 22(1), 128–137 (2012). [CrossRef]  

19. S. Shoham, “Optogenetics meets optical wavefront shaping,” Nat. Methods 7(10), 798–799 (2010). [CrossRef]  

20. E. J. Botcherby, C. W. Smith, M. M. Kohl, D. Debarre, M. J. Booth, R. Juskaitis, O. Paulsen, and T. Wilson, “Aberration-free three-dimensional multiphoton imaging of neuronal activity at kHz rates,” Proc. Natl. Acad. Sci. U. S. A. 109(8), 2919–2924 (2012). [CrossRef]  

21. E. J. Botcherby, R. Juškaitis, M. J. Booth, and T. Wilson, “An optical technique for remote focusing in microscopy,” Opt. Commun. 281(4), 880–887 (2008). [CrossRef]  

22. W. Yang, J.-e. K. Miller, L. Carrillo-Reid, E. Pnevmatikakis, L. Paninski, R. Yuste, and D. S. Peterka, “Simultaneous multi-plane imaging of neural circuits,” Neuron 89(2), 269–284 (2016). [CrossRef]  

23. S. Quirin, D. S. Peterka, and R. Yuste, “Instantaneous three-dimensional sensing using spatial light modulator illumination with extended depth of field imaging,” Opt. Express 21(13), 16007–16021 (2013). [CrossRef]  

24. M. Dal Maschio, A. M. De Stasi, F. Benfenati, and T. Fellin, “Three-dimensional in vivo scanning microscopy with inertia-free focus control,” Opt. Lett. 36(17), 3503–3505 (2011). [CrossRef]  

25. D. Debarre, M. J. Booth, and T. Wilson, “Image based adaptive optics through optimisation of low spatial frequencies,” Opt. Express 15(13), 8176–8190 (2007). [CrossRef]  

26. M. J. Booth, “Wavefront sensorless adaptive optics for large aberrations,” Opt. Lett. 32(1), 5–7 (2007). [CrossRef]  

27. M. A. A. Neil, R. Juskaitis, M. J. Booth, T. Wilson, T. Tanaka, and S. Kawata, “Adaptive aberration correction in a two-photon microscope,” J. Microsc. (Oxford, U. K.) 200(2), 105–108 (2000). [CrossRef]  

28. J. Tang, R. N. Germain, and M. Cui, “Superpenetration optical microscopy by iterative multiphoton adaptive compensation technique,” Proc. Natl. Acad. Sci. U. S. A. 109(22), 8434–8439 (2012). [CrossRef]  

29. A. R. Mardinly, I. A. Oldenburg, N. C. Pégard, S. Sridharan, E. H. Lyall, K. Chesnov, S. G. Brohawn, L. Waller, and H. Adesnik, “Precise multimodal optical control of neural ensemble activity,” Nat. Neurosci. 21(6), 881–893 (2018). [CrossRef]  

30. W. Yang, L. Carrillo-Reid, Y. Bando, D. S. Peterka, and R. Yuste, “Simultaneous two-photon imaging and two-photon optogenetics of cortical circuits in three dimensions,” eLife 7, e32671 (2018). [CrossRef]  

31. S. Sun, G. Zhang, Z. Cheng, W. Gan, and M. Cui, “Large-scale femtosecond holography for near simultaneous optogenetic neural modulation,” Opt. Express 27(22), 32228–32234 (2019). [CrossRef]  

32. S. J. Yang, W. E. Allen, I. Kauvar, A. S. Andalman, N. P. Young, C. K. Kim, J. H. Marshel, G. Wetzstein, and K. Deisseroth, “Extended field-of-view and increased-signal 3D holographic illumination with time-division multiplexing,” Opt. Express 23(25), 32573–32581 (2015). [CrossRef]  

33. R. Fiolka, K. Si, and M. Cui, “Complex wavefront corrections for deep tissue focusing using low coherence backscattered light,” Opt. Express 20(15), 16532–16543 (2012). [CrossRef]  

34. T.-W. Wu, J. Tang, B. Hajj, and M. Cui, “Phase resolved interferometric spectral modulation (PRISM) for ultrafast pulse measurement and compression,” Opt. Express 19(14), 12961–12968 (2011). [CrossRef]  

35. M. Cui, “Parallel wavefront optimization method for focusing light through random scattering media,” Opt. Lett. 36(6), 870–872 (2011). [CrossRef]  

36. B. Richards and E. Wolf, “Electromagnetic diffraction in optical systems, II. Structure of the image field in an aplanatic system,” Proc. R. Soc. Lond. A 253(1274), 358–379 (1959). [CrossRef]  

37. W. Yang and R. Yuste, “In vivo imaging of neural activity,” Nat. Methods 14(4), 349–359 (2017). [CrossRef]  

38. B. A. Wilt, L. D. Burns, E. T. W. Ho, K. K. Ghosh, E. A. Mukamel, and M. J. Schnitzer, “Advances in Light Microscopy for Neuroscience,” Annu. Rev. Neurosci. 32(1), 435–506 (2009). [CrossRef]  

39. F. Fang, X. Zhang, A. Weckenmann, G. Zhang, and C. Evans, “Manufacturing and measurement of freeform optics,” CIRP Ann. 62(2), 823–846 (2013). [CrossRef]  

40. J. W. Hardy, Adaptive Optics for Astronomical Telescopes (Oxford University Press, 1998).
