Abstract
The aberrations in an optical microscope are commonly measured and corrected at one location in the field of view, within the so-called isoplanatic patch. Full-field correction is desirable for high-resolution imaging of large specimens. Here we present, to the best of our knowledge, a novel wavefront detector, based on pupil sampling with subapertures, that measures the aberrated wavefront phase at each position of the specimen. Based on this measurement, we propose a region-wise deconvolution that provides an anisoplanatic reconstruction of the sample image. Our results indicate that the measurement and correction of the aberrations can be performed in a widefield fluorescence microscope over its entire field of view.
© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
The measurement of wavefront aberration is of crucial interest in optical microscopy, which aims to image samples at the highest possible resolution. The wavefront measurement can be carried out with several methods, for example with wavefront sensors (e.g., Shack–Hartmann wavefront sensors), by acquiring the system’s point spread function (PSF) at different depths [1], or indirectly by a pupil segmentation method using a digital micromirror device (DMD) or a spatial light modulator (SLM) [2,3]. In fluorescence microscopy, the PSF can be directly measured by imaging a point-like source, such as a fluorescent nanobead, smaller than the diffraction limit of the optical system. When good knowledge of the PSF is available, deconvolution effectively provides a reconstruction of the object at high resolution [4]. However, the isoplanatic patch, i.e., the area where the aberration can be considered constant, is often smaller than the field of view. This happens when the aberrations develop in different planes of the volume between the sample and the objective [5]. In this case, the PSF is not isoplanatic and changes across the imaged regions. Wavefront sensors measure the aberrations with high accuracy only at one point. Consequently, reconstructing the wavefront aberration corresponding to different regions of the sample would require many bright point sources in the field of view. This technique is successfully used in astronomy [6] and, in some cases, in ophthalmic imaging [7]. Similarly, the direct measurement of the PSF in different locations, when possible, is affected by noise and requires the localization of fluorescence emitters that are sparse across the sample. Therefore, the presence of an anisoplanatic PSF makes the use of deconvolution algorithms problematic [8,9], since an incorrect knowledge of the PSF creates artifacts in the reconstruction.
In this Letter we first introduce a new method for wavefront measurement able to reconstruct the wavefront aberration and the PSF corresponding to all points of the field of view. Then, we present a deconvolution method that, starting from the measured PSFs, reconstructs the sample anisoplanatically in its different regions. Using this method, we show the high-resolution, artifact-free reconstruction of a sample slide in a widefield fluorescence microscope.
Our wavefront detection method is based on a spinning subpupil aberration measurement (SPAM). The measurement starts with a motorized device that moves a subaperture across the microscope’s pupil. For each position of the subaperture [Fig. 1(a)], an image of the sample is acquired with the camera of the widefield microscope (Visualization 1). By measuring the relative shift of each acquired image, the wavefront gradient at each pupil location is obtained. Figure 1(b) shows this process for an object point ${\rm OP}_1$. When imaged through a certain subaperture, its image ${\rm IP}_1$ is shifted by a quantity $\Delta x_1$ from the nonaberrated position. Since we do not have access to the nonaberrated image, we choose as the reference the image acquired with a subaperture placed at the center of the pupil. The displacement $\Delta x_1$ is thus proportional to the wavefront gradient at the corresponding pupil position. The shift is measured using the sum of squared differences algorithm [10]. The wavefront reconstruction is equivalent to the Shack–Hartmann procedure (see Supplement 1). It is worth noting that, for the same subaperture position, different fields on the object (${\rm OP}_2$) can suffer from different wavefront aberrations, potentially leading to a different image shift $\Delta x_2$. Thanks to this principle, SPAM can measure the wavefront at each position of the field of view. Our wavefront measurement system is composed of a spinning wheel that scans a 1 mm aperture across the 10 mm diameter pupil. This configuration was chosen as a good tradeoff between wavefront sampling and device compactness, and it allows the measurement of aberrations up to the 4th order of Zernike polynomials (OSA/ANSI $j = 14$). The scan happens at four radial distances, with the angular positions of the apertures designed to avoid overlaps [Fig. 1(a)].
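The shift-estimation step can be sketched in a few lines of Python. The helper below, `ssd_shift`, is a name of our choosing and illustrates an exhaustive sum-of-squared-differences search in the spirit of Ref. [10]; it is not the authors' implementation. The returned displacement is the quantity that, scaled by the system geometry, gives the local wavefront gradient.

```python
import numpy as np

def ssd_shift(ref, img, max_shift=20):
    """Estimate the integer (dy, dx) displacement of `img` relative to
    `ref` by minimizing the sum of squared differences (SSD).

    The central region of `ref` (cropped by `max_shift` on each side)
    is compared against every shifted window of `img`; the shift with
    the lowest SSD is proportional to the local wavefront gradient.
    """
    best, best_shift = np.inf, (0, 0)
    c = max_shift  # crop margin so all shifted windows stay in bounds
    ref_c = ref[c:-c, c:-c]
    h, w = img.shape
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            win = img[c + dy:h - c + dy, c + dx:w - c + dx]
            ssd = np.sum((win - ref_c) ** 2)
            if ssd < best:
                best, best_shift = ssd, (dy, dx)
    return best_shift
```

In practice a subpixel refinement (e.g., fitting a paraboloid around the SSD minimum) would be used; the integer search above only conveys the principle.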
In total, 18 points on the wavefront are scanned and, consequently, 18 images are acquired. The camera exposure time is automatically adjusted in proportion to the ratio $R$ between the area of the whole pupil and that of a subaperture ($R = 100$). The spinning wheel is also equipped with an aperture as large as the pupil of the microscope, which is used for image acquisition. For a typical fluorescent sample slide, the exposure time with the fully open pupil is 10 ms, and it is 1 s for the subapertures. The total time required for the wavefront measurement is 18 s. It is worth noting that the subapertures could be made with a larger diameter for increased light collection during the scanning process, but at the expense of wavefront measurement accuracy. The images are acquired with a $10\times$ microscope objective (Mitutoyo Plan Apo, LWD, ${\rm NA} = 0.28$) in a custom-made widefield upright microscope, equipped with a 200 mm tube lens (Nikon) and an sCMOS camera (Hamamatsu Flash 4.0, $2048 \times 2048$ pixels).
Figure 2 shows the image of a microscope slide (FluoCells Prepared Slide #3, ThermoFisher; a 16 µm mouse kidney section labeled with Alexa Fluor 488). An aberrating phase plate, made of two 170 µm cover glasses bonded with UV glue, is placed in front of the sample, inclined by 45° to induce a large aberration. A 480 nm light-emitting diode illuminates the sample, and the light is detected through a $520 \pm 15\;{\rm nm}$ filter placed between the SPAM module and the tube lens. SPAM then reconstructs the region-dependent wavefront phases [Fig. 2(b)] in $N \times N$ regions (here $N = 9$). The corresponding PSFs are calculated as the squared modulus of the 2D Fourier transform of the pupil functions carrying the measured wavefront phases [Fig. 2(c)]. We observe that the aberrations contain mainly astigmatism and coma.
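The PSF computation from a measured phase map can be illustrated with a minimal numpy sketch, assuming a circular pupil and a phase sampled on the pupil grid; `psf_from_phase` is a hypothetical helper, not the authors' code.

```python
import numpy as np

def psf_from_phase(phase, pupil_radius_frac=1.0):
    """Incoherent PSF as the squared modulus of the 2D Fourier
    transform of the pupil function exp(i*phase), normalized to
    unit energy. `phase` is the wavefront phase in radians on an
    n x n grid spanning the pupil."""
    n = phase.shape[0]
    y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
    pupil = (x**2 + y**2) <= pupil_radius_frac**2  # circular aperture
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field))))**2
    return psf / psf.sum()

# Example: a pure-astigmatism phase map gives the elongated,
# cross-shaped PSF characteristic of that aberration.
n = 64
y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
psf_astig = psf_from_phase(3.0 * (x**2 - y**2))
```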
Since we have access to the PSF in the different locations of the sample, we developed a region-wise deconvolution, inspired by an approach proposed in astronomy [11]. We implement the deconvolution using a Richardson–Lucy (RL) iterative method [12]. A simple deconvolution for each of the $N \times N$ regions of the image would produce artifacts at the boundary of each region; these artifacts would typically have a size comparable to that of the PSF. Moreover, the PSF would change abruptly from tile to tile, possibly leading to a discontinuous reconstruction. Assuming that the PSF changes continuously over the image, we use the following approach to avoid the formation of stitching artifacts. We divide the camera image into overlapping tiles, defining a window size of $W$ pixels (here $W = 410\;{\rm px}$) and a step, or stride, of $s = 82\;{\rm px}$. The stride was set to be comparable with the size of the PSF images and to divide the image into an integer number of steps. This choice produces a map of $M \times M$ tiles ($M = 21$). Adopting a notation similar to that used in the field of convolutional neural networks, we rearrange the image data into channels (each tile is a channel in this notation), forming a matrix with dimensions $C \times W \times W$. The tiled view of the original image, $T$, has a total of $C = M^2 = 441$ tiles [Fig. 3(a)]. To match the problem size, we expand the PSF map from $N \times N$ to $M \times M$, the tiled dimension of the image, by oversampling it via bicubic interpolation. The oversampled PSFs are also rearranged, forming a kernel of dimensions $C \times W' \times W'$, where $W'$ indicates the size of each PSF; in our case study, $W' = 93\;{\rm px}$. In this formalism, each channel of the image is blurred by the corresponding channel of the PSF map, according to the model $T_c = O_c \ast K_c$, where $O_c$ is the object in channel $c$, $K_c$ is the corresponding PSF kernel, and $\ast$ denotes 2D convolution.
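The channel-wise RL iteration can be sketched as follows; `rl_deconv_tiles` is a name of our choosing, and this numpy/scipy version only shows the per-channel update $o \leftarrow o \cdot \big(K^{\rm T} \ast (t / (K \ast o))\big)$, omitting the final windowed blending of the overlapping tiles.

```python
import numpy as np
from scipy.signal import fftconvolve

def rl_deconv_tiles(tiles, psfs, n_iter=20, eps=1e-12):
    """Richardson-Lucy deconvolution applied channel-wise: each
    tile (channel) is deblurred with its own PSF, so the recovered
    object follows the local, region-dependent blur model."""
    out = []
    for t, p in zip(tiles, psfs):
        p = p / p.sum()                   # PSF normalized to unit energy
        p_flip = p[::-1, ::-1]            # adjoint (flipped) kernel
        o = np.full_like(t, t.mean())     # flat nonnegative initial guess
        for _ in range(n_iter):
            blur = np.clip(fftconvolve(o, p, mode='same'), eps, None)
            ratio = t / blur              # data / current blur estimate
            o = o * fftconvolve(ratio, p_flip, mode='same')
        out.append(o)
    return out
```

In the actual pipeline, the deblurred overlapping tiles would then be recombined with smooth weighting windows so that neighboring channels, whose PSFs differ only slightly, blend without seams.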
In conclusion, we have proposed a novel method to measure the wavefront aberrations by sampling the pupil of an optical imaging system, in particular of a widefield microscope. The method allows the reconstruction of the wavefront at the different points of the field of view. The SPAM module can be inserted in the detection path of a widefield microscope and does not require external wavefront sensors. It provides the PSF reconstruction by imaging the sample under the microscope, without the need for beads or calibration targets. In combination with the wavefront and PSF measurement, we have developed a multiregion deconvolution method that takes into account the nonisoplanatic PSF and carries out a region-dependent iterative deblurring. The aberration measurement, along with the proposed deconvolution, has proven effective in reconstructing fluorescence imaging samples at high resolution, avoiding the artifacts otherwise present in a conventional isoplanatic deconvolution.
Funding
Air Force Research Laboratory (12789919); Office of Naval Research Global (12789919); H2020 Marie Skłodowska-Curie Actions (799230); Laserlab-Europe (871124).
Disclosures
The authors declare no conflicts of interest.
Data Availability
Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.
Supplemental document
See Supplement 1 for supporting content.
REFERENCES
1. R. K. Tyson, Introduction to Adaptive Optics (SPIE, 2000).
2. D. Milkie, E. Betzig, and N. Ji, Opt. Lett. 36, 4206 (2011).
3. B. Vohnsen, A. C. Martins, S. Qaysi, and N. Sharmin, Appl. Opt. 57, E199 (2018).
4. J. Mertz, H. Paudel, and T. G. Bifano, Appl. Opt. 54, 3498 (2015).
5. M. Bertero and P. Boccacci, Introduction to Inverse Problems in Imaging (CRC Press, 2020).
6. R. Ragazzoni, E. Marchetti, and G. Valente, Nature 403, 54 (2000).
7. J. Thaung, P. Knutsson, Z. Popovic, and M. Owner-Petersen, Opt. Express 17, 4454 (2009).
8. L. Denis, E. Thiébaut, F. Soulez, J. M. Becker, and R. Mourya, Int. J. Comput. Vis. 115, 253 (2015).
9. R. Turcotte, E. Sutu, C. C. Schmidt, N. J. Emptage, and M. J. Booth, Biomed. Opt. Express 11, 4759 (2020).
10. M. B. Hisham, S. N. Yaakob, R. A. A. Raof, A. B. A. Nazren, and N. M. Wafi, in IEEE Student Conference on Research and Development (SCOReD) (IEEE, 2015), p. 100.
11. R. C. Flicker and F. J. Rigaut, J. Opt. Soc. Am. A 22, 504 (2005).
12. W. H. Richardson, J. Opt. Soc. Am. 62, 55 (1972).
13. A. Paszke, S. Gross, F. Massa, A. Lerer, J. Bradbury, G. Chanan, T. Killeen, Z. Lin, N. Gimelshein, L. Antiga, A. Desmaison, A. Köpf, E. Yang, Z. DeVito, M. Raison, A. Tejani, S. Chilamkurthy, B. Steiner, L. Fang, J. Bai, and S. Chintala, in Conference on Neural Information Processing Systems (NeurIPS) (2019).
14. P. Pozzi, C. Smith, E. Carroll, D. Wilding, O. Soloviev, M. Booth, G. Vdovin, and M. Verhaegen, Opt. Express 28, 14222 (2020).