Optica Publishing Group

High-speed multi-objective Fourier ptychographic microscopy

Open Access

Abstract

The ability of a microscope to rapidly acquire wide-field, high-resolution images is limited by both the optical performance of the microscope objective and the bandwidth of the detector. The use of multiple detectors can increase electronic-acquisition bandwidth, but the use of multiple parallel objectives is problematic since phase coherence is required across the multiple apertures. We report a new synthetic-aperture microscopy technique based on Fourier ptychography, where both the illumination and image-space numerical apertures are synthesized, using a spherical array of low-power microscope objectives that focus images onto mutually incoherent detectors. Phase coherence across apertures is achieved by capturing diffracted fields during angular illumination and using ptychographic reconstruction to synthesize wide-field, high-resolution, amplitude and phase images. Compared to conventional Fourier ptychography, the use of multiple objectives reduces image acquisition times by increasing the area for sampling the diffracted field. We demonstrate the proposed scalable architecture with a nine-objective microscope that generates an 89-megapixel, 1.1 µm resolution image nine times faster than can be achieved with a single-objective Fourier-ptychographic microscope. New calibration procedures and reconstruction algorithms enable the use of low-cost 3D-printed components for longitudinal biological sample imaging. Our technique offers a route to high-speed, gigapixel microscopy, for example, imaging the dynamics of large numbers of cells at scales ranging from sub-micron to centimetre, with an enhanced possibility to capture rare phenomena.

Published by Optica Publishing Group under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

1. Introduction

There is an unmet need to record wide-field, high-resolution microscopic images of dynamic events at high frame rates [1–4]. Examples include subcellular imaging in high-throughput digital pathology, and of rare and dynamic events, such as cell divisions within large in vitro cancer cell cultures. The ability of a microscope to record wide-field, high-resolution images is, however, fundamentally limited by diffraction and optical aberrations [5–7]. Diffraction limits the minimum resolvable feature size to $\lambda /(NA_\text {obj}+NA_\text {ill})$, where $\lambda$ is the wavelength of light, and $NA_\text {obj}+NA_\text {ill}$ is the sum of the numerical apertures of the objective and illumination. A typical high-resolution microscope with $NA_\text {obj}\sim 0.9$ offers a lateral resolution of $\sim\;{0.3}\;\mathrm{\mu}\textrm{m}$ $(\lambda = {550}\;\textrm{nm})$, but only within a commensurately small depth of field of 0.7 µm [8]. That is, a microscope that is able to resolve sub-cellular features can do so only within a thin layer that is much less than the thickness of the cell [8]. Optical aberrations of high-NA lenses further limit the field of view to typically $0.65 \times 0.65$ mm, yielding an image with a maximum space-bandwidth product (SBP) (resolution $\times$ field-of-view) of around 5-megapixels [5,6,9].
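The resolution and depth-of-field figures quoted above follow directly from the stated formulas; a minimal sketch (function names are ours, for illustration only, using the common $\lambda/NA^2$ depth-of-field approximation):

```python
def lateral_resolution(wavelength_um, na_obj, na_ill):
    """Diffraction-limited lateral resolution: lambda / (NA_obj + NA_ill)."""
    return wavelength_um / (na_obj + na_ill)

def depth_of_field(wavelength_um, na_obj):
    """Approximate diffraction-limited depth of field: lambda / NA_obj^2."""
    return wavelength_um / na_obj ** 2

# Figures quoted in the text: NA_obj = 0.9, matched illumination NA, lambda = 550 nm
print(round(lateral_resolution(0.550, 0.9, 0.9), 2))  # ~0.31 um
print(round(depth_of_field(0.550, 0.9), 2))           # ~0.68 um
```

Both values agree with the $\sim$0.3 µm resolution and 0.7 µm depth of field stated in the text.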

High-SBP images may be constructed in time sequence by stitching together a mosaic of images recorded while stepping the sample through the field of view of a high-resolution microscope. Although this can yield very high SBP, image acquisition is slow and high-cost, high-precision mechanical scanning is required, particularly for maintaining the sample in focus within the small depth of field. An alternative approach to increase the SBP is to scale the focal length of a high-NA objective, increasing the field of view. This approach has been used to demonstrate fluorescence microscopy with an instantaneous SBP of 170 megapixels for an $NA_\text {obj}=0.35$ and a resolution of 1.2 µm [10]. A particular advantage of this approach is that it enables high-SBP imaging in a single snapshot. However, considerable optical complexity is required to reduce wide-field aberrations to acceptable levels. Indeed, the rapid increase of optical aberrations with lens scale means that further SBP increases of high-NA objectives become a formidable challenge [11].

Fourier ptychographic microscopy (FPM) offers an alternative route to high-SBP, high-resolution imaging due to its unique ability to record high-resolution images using simple low-resolution microscope objectives [12–16]. The increased resolution of FPM is achieved by time-sequential synthesis of a large $NA_\text {ill}$ from multiple low-resolution images acquired using an objective with a small $NA_\text {obj}$ and angularly scanned illumination provided by an LED array. Each low-resolution image corresponds to a different band of sample spatial frequencies. Computational fusion of these bands yields a high-bandwidth image spectrum, from which a high-SBP image can be obtained by Fourier transformation. This frequency-domain fusion is achieved using ptychographic reconstruction [17–22], where information diversity is required to recover phase information lost during incoherent recording of images. Consequently, FPM enables high-SBP microscopy, but the time-sequential image acquisition results in extended acquisition times, typically minutes [12,14,15,23,24], which is too slow for common biological applications requiring time-resolved microscopy [1,2].

Several techniques have been reported that aim to demonstrate snapshot FPM [25–28]. Typically, the light diffracted by the sample is split into multiple beams and focused onto a single detector array. While these snapshot approaches offer high temporal resolution, the system SBP cannot exceed the pixel count of the sensor. Indeed, the need for redundancy in ptychography limits the SBP to about one third of the detector pixel count. The use of parallelisation to increase the acquisition space-bandwidth-time product (SBTP) has been reported using an array of $96$ FPMs [16], imaging 96 cell cultures in parallel, analogous to using multiple microscopes to image different sample areas. High-resolution objectives were used to record narrow field-of-view (FOV) images, which were stitched together into a single high-SBP image. While such imaging can also be achieved with an array of conventional microscopes, FPM was used only to correct for the aberrations introduced by the simple, low-cost lenses. Lastly, high-speed phase retrieval is possible by collecting only a few images [29,30]. However, unlike FPM, such methods offer a smaller resolution improvement (up to 2$\times$) and require the use of expensive high-NA, narrow-FOV optics. Thus, while high-SBP or high-speed imaging methods do exist, no approach has been demonstrated that combines both. In particular, all FPM approaches have a limited SBTP imposed by the use of a single imaging sensor [1].

Furthermore, while a high-NA objective of a conventional microscope may intercept almost half of the bandwidth of light diffracted by the sample, the low-NA objectives used in FPM intercept only a few percent of the diffracted bandwidth; the vast majority of the bandwidth and associated light power is wasted. To improve imaging speed and light collection efficiency, we report here a new microscopy concept called multi-objective Fourier-ptychographic microscopy (MOFPM). By using multiple objectives in parallel, we can capture an increased fraction of the light diffracted by the sample. By redirecting the diffracted light onto multiple mutually incoherent detectors, we are no longer limited by the pixel count of the imaging sensor, as is the case for most snapshot FPM techniques [25–28]. Hence, MOFPM provides the first scalable architecture for high-speed, high-resolution FPM with an arbitrarily high SBTP.

An example MOFPM prototype employing nine parallel objectives is shown in Fig. 1. For each tilted illumination angle, the nine objectives simultaneously intercept the optical fields diffracted by the sample, each forming distinct low-resolution images on associated detectors. The solid angle subtended by the nine objectives proportionately increases both the optical étendue and instantaneous SBP of image acquisition. More generally, for $n$ parallel objectives capturing images for each of $m$ illumination angles, a total of $n\cdot m$ low-resolution images are recorded. Achieving equivalent SBP with a single-objective FPM would also require $n \cdot m$ images, but at the expense of $n \cdot m$ illumination angles. Compared to MOFPM, a conventional FPM therefore has an $n$-times larger image acquisition overhead for a given SBP. Notably, there are no fundamental obstacles to the implementation of a full hemispherical array of objectives with a 100% fill factor, analogous to that demonstrated in gigapixel photography [9]. Consequently, the MOFPM architecture enables the maximum possible étendue and an arbitrarily-high SBTP.
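The image-count bookkeeping above can be made concrete; a short sketch (helper name is ours, illustrative only):

```python
def acquisition_counts(n_objectives, n_angles):
    """Images needed for a given synthetic SBP: n*m spectral bands are required
    either way, but MOFPM records n of them per exposure."""
    total_bands = n_objectives * n_angles
    single_objective_exposures = total_bands   # conventional FPM: one band per exposure
    mofpm_exposures = n_angles                 # MOFPM: n bands captured in parallel
    return total_bands, single_objective_exposures, mofpm_exposures

# The nine-objective prototype described later: n = 9, m = 49
print(acquisition_counts(9, 49))  # (441, 441, 49): nine-fold fewer exposures
```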


Fig. 1. In ptychographic imaging, the goal is to collect multiple images encoding various spatial frequencies of the sample. Which spatial frequencies pass through the pass-band of the optical system and reach the sensor depends on either the illumination angle or the position of the aperture. In multi-objective Fourier ptychography, a combination of illumination angles and aperture positions is used in the design of the optical system. The addition of multiple apertures reduces the number of illumination angles required without loss of reconstructed image quality or resolution. Order-of-magnitude improvements in image-capture speed can be achieved through parallelised image capture, enabling an experimental configuration providing near-snapshot gigapixel imaging. A picture of our nine-camera experimental prototype is shown on the left, with a CAD design on the right.


Although we report the first demonstration of FPM with multiple objectives used in parallel, the geometry has common features with previous investigations into the feasibility of MOFPM [31] and with wide-field digital holography [32]. In particular, these articles report the scanning of a single objective through the diffraction pattern of the sample to realise an increased SBP by time-sequential aggregation of data. Additionally, in conventional single-objective FPM, and in [31,32], all low-resolution images are recorded with identical imaging distortion and aberrations, which enables relatively straightforward Fourier-ptychographic aggregation of the image spectra into a single high-resolution spectrum. Hence, the use of a single objective considerably simplifies calibration and image recovery using conventional FPM algorithms. Lastly, in these proof-of-principle experiments, the longitudinal stability of the instrument was never an issue. However, our use of multiple mutually tilted objectives and multiple dissimilar sensors poses significant challenges for the computational reconstruction and calibration algorithms, which we address in this manuscript.

In MOFPM, the multiple objectives exhibit dissimilar imaging distortions and optical aberrations that vary substantially between the $n$ low-resolution images due to variations in geometry and manufacturing imperfections. To provide in-focus off-axis imaging across the field of view and to minimize off-axis aberrations, we used the Scheimpflug imaging configuration [33] for the off-axis objectives. Furthermore, the off-axis cameras in MOFPM record darkfield images only, unlike conventional single-objective FPM where both brightfield and darkfield images are present. The lack of high-signal-to-noise information in brightfield images makes aberration recovery and computational convergence extremely challenging, especially in noisy imaging conditions. With the improved reconstruction algorithms presented here, we can recover the aberrations of off-axis cameras from darkfield images without the need for additional calibration data, unlike the methods in [31]. We also developed new calibration and image-recovery algorithms that compensate for the dissimilar distortions and dissimilar field-dependent aberrations during Fourier-ptychographic image synthesis.

We report a practical demonstration of the MOFPM concept: a nine-objective instrument able to record an 89-megapixel, 1.1 µm-resolution image in 1 s of image acquisition (although latency in the camera readout electronics increased this time to 3 s in our implementation). To achieve identical SBP without multiplexing and multiple cameras, our setup would require a 15$\times$ longer image acquisition time of 45 seconds. This improvement exceeds the 10-fold acquisition-time reduction offered by the highest-SBTP FPM demonstration to date [1]. With the experimental design, reconstruction and calibration techniques outlined in this manuscript, we demonstrate the feasibility of scaling this new architecture to high-resolution microscopy with arbitrarily high SBP and SBTP.

In the next section, we introduce the theoretical principle of MOFPM, followed by experimental quantification of resolution, a demonstration of reconstructed images of a histology sample, and time-resolved imaging of Dictyostelium cell dynamics. Detailed explanations of the automatic self-calibration and reconstruction algorithms are included in the Supplementary material.

2. Principle of multi-objective FPM

Given a thin sample with a transmission function $o(\mathbf {r})$ illuminated by a plane wave, the diffraction pattern in the Fourier plane can be expressed as $O(\mathbf {k}-\mathbf {k}_i)$ [12], where $\mathbf {r}$ and $\mathbf {k}$ denote space and spatial-frequency co-ordinates respectively. The wave vector $\mathbf {k}_i$ corresponds to the angular illumination by LED $i$, which translates the diffraction pattern with respect to the optical system. The translated sample spectrum is intercepted by the objective lens of a microscope, defined by its pupil function $P_c(\mathbf {k})$, where the subscript $c$ refers to the "camera" index of a multi-camera system, and optical aberrations in $P_c(\mathbf {k})$ are unique to each lens. In MOFPM, the frequency spectrum is intercepted by multiple cameras simultaneously (see Fig. 2) and is low-pass filtered by the aperture to produce a spectrum $O(\mathbf {k}-\mathbf {k}_i-\mathbf {k}_c)P_c(\mathbf {k})$. The wave vector $\mathbf {k}_c$ indicates the position of each camera with respect to the diffracted spectrum for a given illumination angle. Multiple frequency bands are recorded in parallel, reducing the number of time-sequential illuminations required by a factor equal to the number of cameras. It should be emphasized that, unlike other multi-camera FPM implementations [16], in MOFPM each camera images the same area of the sample. The SBP and resolution are then computationally increased by synthesis of an increased image-space NA. The low-pass filtered spectrum transmitted through each objective pupil is focused onto the corresponding sensor, yielding an intensity image for each camera and illumination angle, given by $I_{i,c}(\mathbf {r})$ (see Supplementary material S2):

$$I_{i,c}(\mathbf{r}) = \mathcal{T}_c |\mathcal{F} \left\{ {O(\mathbf{k}-\mathbf{k}_i-\mathbf{k}_c)P_c(\mathbf{k})} \right\}|^{2}.$$
The additional operator $\mathcal {T}_c$, which is not required in conventional, single-camera FPM, describes coordinate transformations and image distortion due to a tilted, off-axis Scheimpflug imaging geometry, which varies from camera to camera. The Scheimpflug configuration [33,34] involves tilting the sample, lens and detector planes with respect to each other to minimise defocus and distortion effects. Residual distortions vary from camera to camera and are incorporated into the image-construction algorithm.
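Once $\mathcal {T}_c$ has been removed by calibration, Eq. (1) reduces to a crop-filter-transform pipeline that is straightforward to simulate; a minimal numpy sketch (the cropping convention and helper name are our own illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def mofpm_forward(obj_spectrum, pupil, k_i, k_c):
    """Low-resolution intensity for LED i and camera c: |F{O(k-k_i-k_c) P_c(k)}|^2.
    obj_spectrum : centred (fftshifted) high-resolution sample spectrum O(k)
    pupil        : pupil function P_c(k), sized to the low-resolution band
    k_i, k_c     : (row, col) pixel offsets of the LED and camera in k-space
    """
    ny, nx = pupil.shape
    cy, cx = obj_spectrum.shape[0] // 2, obj_spectrum.shape[1] // 2
    y0 = cy + k_i[0] + k_c[0] - ny // 2
    x0 = cx + k_i[1] + k_c[1] - nx // 2
    band = obj_spectrum[y0:y0 + ny, x0:x0 + nx] * pupil  # shifted, low-pass-filtered band
    field = np.fft.ifft2(np.fft.ifftshift(band))         # complex field at the sensor
    return np.abs(field) ** 2                            # intensity-only detection
```

Each camera samples a band centred at $\mathbf {k}_i+\mathbf {k}_c$, so one exposure yields as many bands as there are cameras.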


Fig. 2. In MOFPM, the frequency spectrum can be intercepted by multiple cameras in parallel, with each captured image encoding a unique frequency band of the sample. As a result, this acquisition method provides a scalable improvement in image-acquisition speed proportional to the number of cameras. While each spectrum represents a unique frequency band of the sample, the sensors are mutually incoherent, requiring modifications of the conventional FPM forward model and reconstruction methods.


In single-camera FPM, phase recovery of the constructed high-bandwidth diffraction pattern at the objective pupil plane involves the correct phasing of all diffracted fields recorded for each illumination angle. In MOFPM, co-phasing is additionally required across the diffracted fields captured by the multiple cameras. With ptychographic reconstruction algorithms, we can aggregate the diffracted fields coherently in the Fourier domain from intensity-only measurements. While phase retrieval is inherently ill-posed [20], a stable solution is possible provided that the multiple diffracted measurements overlap in the Fourier domain. To use existing ptychographic reconstruction algorithms [18,20,21] in MOFPM, the coordinate distortion must be accounted for to remove $\mathcal {T}_c$ from the image-formation model. While this could in principle be achieved through careful experimental calibration, we developed a robust and fully automated self-calibration strategy that removes the need for precise multi-camera alignment. We used image-registration algorithms that correct for sensor tilts (which cause perspective distortions) and field-of-view mismatches between the cameras. We also correct for LED-array and aperture/lens displacements of each camera prior to the reconstruction process with an algorithm (described in the Supplementary material S3) based on the Fourier-ptychographic position-misalignment method [35]. The outlined calibration algorithm enables high-quality image reconstruction using low-cost lenses, low-precision and low-stability 3D-printed components, and alignment by hand. Despite the seemingly complicated experimental design, computational correction of misalignment allows for a relatively simple experimental implementation without the need for high-precision alignment.

Following preprocessing, the MOFPM forward model can be simplified (by eliminating $\mathcal {T}_c$) to that derived in Supplementary Material S3:

$$I_{i,c}(\mathbf{r}) = |\mathcal{F} \left\{ {O(\mathbf{k}-\mathbf{k}_i-\mathbf{k}_c)P_c(\mathbf{k})} \right\}|^{2}.$$
Apart from variations in $P_c(\mathbf {k})$ between the cameras, the forward model is identical to that used in conventional FPM. Consequently, established FPM reconstruction algorithms can be modified and utilized for construction of a broad image spectrum from multiple low-resolution intensity measurements, as shown in Supplementary Material S4. The off-axis cameras typically capture dark-field images representing high spatial frequencies of the sample, whereas bright-field images are captured by the on-axis camera only. The lack of bright-field conditions within images, and the associated lower signal-to-noise ratio, degrades reconstruction convergence, especially without a priori knowledge of optical aberrations. As in the computational calibration, the central camera can act as a "guide star" for image construction by the off-axis cameras, ensuring robust high-SBP image reconstruction without prior knowledge of optical aberrations or distortions.
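The modified reconstruction loops over cameras as well as LEDs; its core spectrum update can be sketched as follows (a simplified, ePIE-style intensity-constraint step under our own conventions, not the authors' exact algorithm from Supplementary Material S4):

```python
import numpy as np

def spectrum_update(O_est, P_c, I_meas, ky, kx, alpha=1.0):
    """One intensity-constraint update of the synthetic spectrum O_est using the
    measurement I_meas from camera c at total band offset (ky, kx) = k_i + k_c."""
    ny, nx = P_c.shape
    cy, cx = O_est.shape[0] // 2, O_est.shape[1] // 2
    y0, x0 = cy + ky - ny // 2, cx + kx - nx // 2
    band = O_est[y0:y0 + ny, x0:x0 + nx]
    field = np.fft.ifft2(np.fft.ifftshift(band * P_c))
    # enforce the measured modulus, keep the current phase estimate
    field = np.sqrt(I_meas) * np.exp(1j * np.angle(field))
    band_new = np.fft.fftshift(np.fft.fft2(field))
    # gradient-style update within the pupil support
    step = np.conj(P_c) * (band_new - band * P_c) / (np.abs(P_c).max() ** 2 + 1e-12)
    O_est[y0:y0 + ny, x0:x0 + nx] = band + alpha * step
    return O_est
```

Iterating this update over all $(i, c)$ pairs, with overlapping bands, is what co-phases the mutually incoherent cameras in the Fourier domain; a companion update for $P_c(\mathbf {k})$ recovers the per-camera aberrations.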

Lastly, we also utilize LED-multiplexed FPM [1,2], where multiple LEDs are illuminated in parallel during capture of a single image, to provide a further improvement in speed of image acquisition. Each captured image-intensity spectrum then contains multiple overlapping frequency bands, which introduces additional challenges for convergence of computational reconstruction. Nevertheless, LED multiplexing has enabled enhanced frame rates for live-cell imaging using FPM [1]. While both MOFPM and LED-multiplexed FPM aim to improve speed of image acquisition, MOFPM achieves this through parallelization of the detection NA, while illumination multiplexing parallelises the synthesis of the illumination NA. Due to mutual orthogonality between the processes, we are able to demonstrate parallelization of both illumination (through multiplexing) and detection (through increased image-space NA) in the same image acquisition. This combined parallelization offers the fastest possible FPM data capture.
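Because the LEDs are mutually incoherent, a multiplexed frame is simply the sum of the corresponding single-LED intensity images, which is what makes the combined forward model tractable; a sketch (helper names ours, and the grouping below is illustrative rather than the exact 29-frame scheme used later):

```python
import numpy as np

def multiplexed_frame(single_led_images, on_set):
    """Incoherent sum of the single-LED intensities for the LEDs switched on."""
    return sum(single_led_images[i] for i in on_set)

def frames_required(n_angles, leds_per_frame):
    """Number of captured frames when leds_per_frame LEDs fire at once."""
    return -(-n_angles // leds_per_frame)  # ceiling division

# e.g. 49 illumination angles captured in multiplexed pairs
print(frames_required(49, 2))  # 25 frames instead of 49
```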

3. Experimental results

In this section, we describe experimental results obtained with our nine-camera MOFPM system. A single MOFPM frame, as required for ptychographic reconstruction, is regarded as the complete data set recorded by nine cameras for each of the $49$ illumination angles, yielding $441$ unique diffracted spectral bands. The recorded MOFPM dataset is equivalent to conventional acquisition using a single camera and $441$ illumination angles (instead of $49$), but is a factor of nine faster. The image quality can be as high as conventional FPM only if the image reconstruction is not degraded during the fusing of the spectra from the nine dissimilar cameras. Thus, we employ single-camera FPM images as a gold-standard reference to evaluate MOFPM. We show that we can reduce image acquisition time from 45 s to 5 s without loss of resolution or reconstruction quality. We also demonstrate a further reduction in acquisition time to 3 s by use of LED-multiplexed MOFPM.

Using the calibration and reconstruction algorithms outlined in the Supplementary Material S3 and S4, the MOFPM reconstruction converges robustly in the presence of deviations between the ideal forward model and the experimental implementation. Deviations include chromatic aberration of the microscope objectives, spatially varying illumination intensity and spatially varying aberrations. Moreover, MOFPM does not require knowledge of optical aberrations; instead, aberrations are recovered iteratively together with the complex fields. This is especially useful for calibration of the unique aberrations of all nine cameras. Full-field reconstructions of the lung carcinoma sample validate the robustness of our calibration and reconstruction algorithms, which were performed without any a priori aberration knowledge.

Lastly, for imaging live cells such as the Dictyostelium described below, scattering is weak and exhibits negligible contrast under bright-field illumination. This makes reconstruction of phase-only samples much more challenging compared to cells with good amplitude contrast [1,36]. The lower signal-to-noise ratio for weakly-scattering samples also inhibits the use of LED multiplexing, which is known to be sensitive to Poisson noise. Given the additional noise of our low-cost sensors, we therefore did not use LED multiplexing for imaging weakly-scattering samples.

3.1 Enhanced image resolution and space-bandwidth-time product using MOFPM

From reconstructed images of a resolution test target, we can quantitatively demonstrate a nine-fold greater SBTP using MOFPM without sacrificing image quality. In Fig. 3(a) we show the raw image of a USAF test target recorded with a single-camera microscope (the central camera of our nine-camera array illuminated by $7\times 7$ LEDs simultaneously), demonstrating a spatial resolution of 8 µm. Conventional FPM, using a single camera recording for 441 illumination angles, yields the reconstruction shown in Fig. 3(b), which exhibits resolution enhancement to 1.1 µm. The resolution estimate is based on the ability to resolve group $9$ element $6$, and is in agreement with theoretical calculations for an illumination wavelength of 430 nm. The total data acquisition time is 45 s. We reduced this acquisition time by a factor of nine to 5 s by using nine-fold fewer illumination angles (i.e., 49) together with a nine-camera MOFPM. The reconstructed image is shown in Fig. 3(d) and can be seen to exhibit the same resolution and image quality as the 'gold-standard' single-camera FPM image shown in Fig. 3(b). For reference, we also show in Fig. 3(c) an image recorded in 5 s using 49 illumination angles and a single camera, which exhibits the expected intermediate image resolution. LED multiplexing enables the number of captured images to be further reduced from $49$ to $29$, and a reduction in image acquisition time from 5 s to 3 s, while maintaining image quality and resolution, as can be seen in the image shown in Fig. 3(e).


Fig. 3. A USAF target was imaged for quantitative assessment of resolution and image quality using MOFPM. A conventional microscope image (a) and FPM reconstructions using $441$ LEDs (b) and $49$ LEDs (c) illustrate the trade-off between speed of image acquisition and resolution of the reconstructed image. With MOFPM (d) we achieve a resolution of 1.1 µm, equal to that obtained using $441$ LEDs with conventional FPM, but with a reduction to only 5 s for data capture due to the use of only $49$ LEDs and parallel data capture. LED multiplexing enables the image acquisition time to be reduced even further to <3 s while maintaining the same reconstructed image resolution (e).


3.2 Wide-field histology

In this section, we demonstrate the application of MOFPM for wide-field imaging in histology. Figure 4 shows a reconstructed image of a lung carcinoma sample with a field of view of 5.63 mm $\times$ 4.71 mm $\approx$ 26 mm$^2$. Given that the maximum resolving power of our microscope is 1.1 µm, this corresponds to a SBP of $88.5$-megapixels — a $40\times$ increase over the $2$-megapixel SBP of the raw image shown in Fig. 4(b2,c2,d2). Since SBP is determined only by the imaged FOV and the reconstructed pixel size (determined by the synthetic NA), SBP calculations are independent of the sample scattering strength [12]. Comparison of Fig. 4(c1) and (c3) demonstrates that LED multiplexing can be used to increase acquisition speed without discernible degradation in resolution or image quality. The reduction in acquisition time to 3 s for the $88.5$-megapixel image corresponds to $\sim 30$-megapixels per second SBTP. This is limited by latency in our cameras. Removal of this latency would enable an increase in frame rate from 10 Hz to 30 Hz and a SBTP of $\sim 90$-megapixels per second. Lastly, we also show quantitative phase imaging (QPI) in Fig. 4(c4) as a result of MOFPM reconstruction, corresponding to 630 nm illumination. This demonstrates the suitability of MOFPM for quantitative label-free digital pathology applications.
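The SBP arithmetic above follows from Nyquist sampling of the resolution limit; a sketch (function name ours, illustrative only):

```python
def space_bandwidth_product(fov_w_mm, fov_h_mm, resolution_um):
    """SBP in pixels: FOV area divided by the Nyquist pixel area (resolution / 2)."""
    pixel_um = resolution_um / 2.0  # Nyquist-sampled pixel at the resolution limit
    return (fov_w_mm * 1e3 / pixel_um) * (fov_h_mm * 1e3 / pixel_um)

# FOV of 5.63 mm x 4.71 mm at 1.1 um resolution
print(round(space_bandwidth_product(5.63, 4.71, 1.1) / 1e6, 1))  # ~87.7 megapixels
```

The small difference from the quoted 88.5-megapixel figure is consistent with rounding of the stated FOV dimensions.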


Fig. 4. High-SBP $89$-megapixel MOFPM reconstruction of a lung carcinoma sample (a) with zoomed-in sections (b1,c1,d1). The reconstruction quality is significantly improved compared to the raw data with $2$-megapixel SBP (b2,c2,d2). We also demonstrate compatibility with LED multiplexing (c3) and the possibility of quantitative phase imaging (c4).


3.3 Longitudinal imaging of cell dynamics

A high frame rate is essential for live-cell imaging, especially for ptychographic imaging techniques, which cannot tolerate sample motion during data acquisition. We show that MOFPM is sufficiently stable, even using 3D-printed components, to enable high-SBTP, wide-field imaging of cell dynamics over several hours. Stability of illumination angles is necessary for accurate positioning of spatial frequencies during image fusion, while sample and sensor stability ensure that the spectral content being sampled matches the theoretical model. Furthermore, high-quality imaging is maintained even when the focus varies during image capture, for example due to evaporation of cell-growth media (we did not use temperature- or humidity-controlled sample stages). Our self-calibration algorithms can correct for movement of all mechanical components, and the reconstruction algorithm provides digital re-focusing through recovery of the defocus aberration.

We demonstrate MOFPM for capturing video sequences of the collective motion of large numbers of Dictyostelium cells. These social amoebae are a model organism used to study coordinated cell migration and cell differentiation, which, in response to starvation, aggregate and morph into large migratory slugs several millimetres in length [37]. Investigating such cell evolution is a challenge for conventional microscopy, requiring a range of lenses to be used for small-scale cell-to-cell interactions ($\sim 60-100\times$ magnification, $\text {NA}\gtrsim 0.6$) and large-scale cell migration ($\sim 4\times$ magnification, NA$\lesssim 0.1$) [38–40]. With MOFPM, we are able to resolve individual cells 5-15 µm in diameter across a wide field of 26 mm$^2$.

Reconstruction of an 84-minute time-lapse image sequence of Dictyostelium cells, extracted from a 10-hour sequence, is shown in Fig. 5. These weakly-scattering cells exhibit very low amplitude contrast, and so we show only quantitative phase reconstructions. Raw images in Fig. 5(b1-c1), obtained using illumination from a single LED, exhibit high contrast, which would not be the case with illumination from an extended source (for example, when multiple LEDs are illuminated) [41]. With MOFPM it is possible to resolve individual cells and their formation into migratory slugs, as can be seen in Fig. 5(b2-c2) and the linked video sequence (see figure caption). The extended FOV of 26 mm$^2$ enables tracking of the movement of large slugs; however, during cell aggregation, the sample thickness can violate the thin-sample approximation assumed in FPM, which can be overcome with multi-slice reconstruction techniques [42,43]. In summary, we successfully demonstrated the robustness and stability of our calibration algorithms over extended timeframes.


Fig. 5. (a) Full-FOV Dictyostelium cell reconstruction (phase only). (b1-c1) show low-contrast raw data captured with a single LED. With incoherent illumination, no contrast would be seen. (b2-c2) show the reconstructed phase, indicating dramatically improved image contrast and resolution. A time-lapse over $84$ minutes shows individual cells undergoing streaming, during which they aggregate into large slugs up to 2 mm in size. See Visualization 1, Visualization 2, Visualization 3 and Visualization 4 for a comparison between raw and reconstructed images.


4. Discussion

In this section, we explain the aspects of MOFPM that offer unique enhancements over other high-speed ptychographic imaging techniques, while also highlighting the synergy with existing high-speed imaging methods that offers a facile route to further increases in SBTP.

The fastest FPM demonstration to date is LED-multiplexed FPM [1,2], which was used to collect a complete FPM dataset in $1$ second. Image-acquisition time is reduced by simultaneously illuminating the sample with multiple LEDs. Consequently, images corresponding to differing passbands are superimposed at the detector, and this is incorporated into the forward model for the FPM reconstruction. While an increased SBTP is achieved, there is a limit on the number of LEDs that can be illuminated in parallel before image recovery is degraded. We demonstrated in Sec. 3 that LED multiplexing can be successfully combined with MOFPM to provide an even greater enhancement in SBTP.

MOFPM and multiplexed FPM both involve redundant sampling of spatial-frequency bands, although for MOFPM there is the added advantage that redundant passbands are encoded in separate images captured by different cameras. The computational burden is thereby removed, because the spatial-frequency decomposition is performed optically by the multiple cameras before detection, rather than computationally afterwards. Moreover, in our scalable architecture, there is no fundamental limit to the number of cameras that can be used in parallel. Most importantly, LED multiplexing does not work well with weakly scattering samples, since the consequent mixing of dark-field and bright-field images leads to the weak dark-field signals being overwhelmed by shot noise from the bright-field images [1,20]. MOFPM does not suffer from this limitation, since darkfield and brightfield images are captured by different imaging sensors.

Fortunately, MOFPM and LED multiplexing are mutually complementary, as was demonstrated in Sec. 3. In fact, MOFPM is complementary with all previously reported Fourier ptychographic implementations, overcoming the limitations imposed by the use of a single sensor and/or lens. The MOFPM image-formation model that we describe should be considered as a generalisation of FPM for increasing both the illumination and detection NA, and one which can be integrated into the design of various optical systems. Further speed improvement is possible, however, by ab initio optimisation of the optical design specifically for multiplexing, as has been demonstrated by data-driven approaches [44,45]. For example, a non-uniform camera arrangement could be optimised for a given LED-multiplexing pattern to achieve the fastest, non-redundant sampling of the diffracted field under the LED-multiplexing constraints [2].

Lastly, MOFPM also shows promise for imaging samples that are optically thick. When the illumination is transmitted through a thin sample, there is a one-to-one relationship between the k-space vector $\mathbf {k}_i$ and the LED position $\mathbf {r}_i$. This is no longer valid for a thick sample [42,46,47], because refraction within the sample varies with the illumination angle. It has been demonstrated in aperture-scanning Fourier ptychography [46,47] that diffraction can be accurately modelled by keeping the illumination direction constant while scanning the aperture in the Fourier domain. More generally, MOFPM can be regarded as a combination of both illumination- and aperture-scanning FPM. For thin samples, conventional image reconstruction can be used, while for thick samples the multi-camera arrangement can be used to deduce the scattering geometry, similar to tomographic imaging. These adaptations are left as future work.
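For reference, the thin-sample one-to-one mapping between LED position and illumination wavevector can be sketched as follows. This is a minimal illustrative sketch; the wavelength and geometry values are assumptions, not the experimental parameters:

```python
import numpy as np

def illumination_k(r_i, z, wavelength=530e-9):
    """Map an LED's lateral offset r_i (metres) at distance z below the
    sample to its transverse illumination wavevector k_i (rad/m).
    This one-to-one mapping holds only for thin samples."""
    r_i = np.asarray(r_i, dtype=float)
    k0 = 2.0 * np.pi / wavelength                  # free-space wavenumber
    sin_theta = r_i / np.sqrt(r_i @ r_i + z**2)    # transverse direction cosines
    return k0 * sin_theta

# Illustrative example: an LED 10 mm off-axis, 60 mm from the sample
k_i = illumination_k([10e-3, 0.0], z=60e-3)
```

For a thick sample this mapping breaks down, which is precisely why the aperture-scanning interpretation above becomes useful.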

5. Conclusion

Fourier ptychography has demonstrated quite emphatically how computational image construction can reconfigure the problem of wide-field high-resolution microscopy from the design and manufacture of complex high-cost optics to, instead, the computational integration of multiple band-pass images acquired with simple low-cost optics. Because FPM images can be recorded with a single objective of low SBP, the overall cost and complexity can be massively reduced. The quid pro quo however is that the time-sequential construction of high-SBP images requires long image acquisition times, which reduces the attractiveness of FPM in a wide range of applications: from high-throughput digital pathology to imaging of biological dynamics. One route to increased speed is to use lenses and detectors with higher SBP, but the higher cost of these components reduces the cost-benefit of FPM – which is fundamentally its greatest asset.

Multi-objective FPM offers a new scalable architecture for increased speed of acquisition: the SBP of the detector and objective can be optimally selected to meet system requirements at reduced cost. Zheng et al. introduced the concept of synthesising an increased illumination NA from discrete illumination angles in FPM [12]. Our use of multiple objectives achieves an equivalent increase in NA for imaging, but in parallel. MOFPM can thus be considered as a generalisation of the Fourier-ptychography concept to both the illumination and imaging domains, providing both enhanced SBP and enhanced SBTP.

We have demonstrated that techniques developed previously for calibration of the LED illumination and of a single objective in conventional FPM can be extended to also calibrate the positions and aberrations of a multi-objective array. Our prototype enables construction of a wide-field ($89$-megapixel), high-resolution (1.1 µm) image, captured in $3$ seconds. This is a dramatic improvement compared to the raw images, which have a $2$-megapixel SBP and 8 µm resolution. Moreover, the concept can be scaled to almost arbitrarily high SBP and SBTP.

6. Methods

6.1 Optical design

In a regular microscope, the object, lens and image planes are all mutually parallel. The use of multiple objectives means this is no longer possible, but by use of the so-called Scheimpflug configuration [33] it is possible to tilt the lens and image planes such that sharp focus is retained across an extended field. The Scheimpflug condition states that the sample, lens and detector planes must meet at a single point, called the Scheimpflug intersection, as illustrated in Fig. 6. This imaging geometry was originally developed for aerial photography, to remove perspective distortions, and is also used in corneal imaging, because in both cases either the lens or the imaging sensor is tilted with respect to the sample. The Scheimpflug configuration has also been suggested for off-axis FPM imaging because it minimises off-axis aberrations [13,31,34]. While it is possible to correct aberrations (e.g., coma, astigmatism and defocus) computationally, this minimisation of defocus reduces the burden on the reconstruction algorithms. Additional advantages include minimised spatially varying magnification and the ability to use a curved lens array (for multi-objective systems), which increases the maximum attainable resolution compared to a planar lens array.

 figure: Fig. 6.

Fig. 6. (a) Diagram illustrating geometrical Scheimpflug principle used. The sample-lens-detector planes must intersect at a single point to satisfy the criterion; (b) Illustration of how the sample spectrum is covered by a (b1) single, (b2) by multiple cameras with a single LED (b3) by a single camera with multiple LEDs and (b4) by multiple cameras and multiple LEDs; (c-d) Illustration of the equivalence between an angular offset of illumination and an angular offset of imaging.


Our experimental Scheimpflug-based MOFPM, shown in Fig. 1, employs nine imaging sensors with a curved lens array. The lenses were located in a 3D-printed holder and the detector arrays were mounted in three-axis kinematic stages capable of tip-tilt-axial adjustment. The camera holders were manufactured from aluminium to cope with the heat generated by the cameras. Based on the diagram in Fig. 6, the following set of equations can be derived from the Scheimpflug criterion such that a constant magnification is maintained across the cameras:

$$\begin{aligned} L_{sep} &= (1+M) f\sin(\theta_c) / M\\ D_{sep} &= (1+M)^{2} f\sin(\theta_c) / M\\ \theta_D &= \tan^{{-}1}\left(\frac{(1+M)\cos(\theta_c)\sin(\theta_c)}{1 - (1+M)\sin^{2}(\theta_c)}\right)\\ \theta_c &= \tan^{{-}1}\left(\frac{|\mathbf{r}_i|}{z}\right) = \tan^{{-}1}\left(\frac{|\mathbf{r}_c|}{u}\right), \end{aligned}$$
where $f$ is the focal length of the objective lenses, $M$ is the magnification of each camera (microscope) and $\theta _D$ and $\theta _c$ are the tilts of the detector and of the lens respectively.

Lens tilt $\theta _c$ depends on the position of the lenses $\mathbf {r}_c$, which in turn defines the band sampled by each camera. With this prototype we aimed to demonstrate the speed enhancement of a nine-camera MOFPM compared to a single-camera FPM using 441 LEDs. Since the illumination angles of the LEDs define the total frequency coverage, each camera must cover frequencies equivalent to 441/9 = 49 LEDs, for which we employ an array of $7\times 7$ LEDs. Examples of the spectra covered by single and multiple cameras are illustrated in Fig. 6(b). To construct a high-speed MOFPM we select a desired frequency coverage by a given LED array and use the reciprocal relationship from Fig. 6(c-d) to compute the lens positions $\mathbf {r}_c$ (based on the desired LED positions $\mathbf {r}_i$). This defines the separations between the lenses, and in turn the angles $\theta _D$ and $\theta _c$ via Eqn. 3. The experimental parameters for our final prototype are summarised in Table 1.
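The design procedure can be evaluated numerically. The following sketch computes Eqn. 3 for one off-axis camera; the focal length matches the 36 mm lenses used, while the magnification, lens offset and sample-to-lens distance are illustrative assumptions:

```python
import numpy as np

def scheimpflug_geometry(f, M, r_c, u):
    """Evaluate Eqn. 3: given focal length f, magnification M, lateral
    lens offset r_c and sample-to-lens distance u (consistent units),
    return lens separation L_sep, detector separation D_sep and the
    lens and detector tilts (radians)."""
    theta_c = np.arctan(np.linalg.norm(r_c) / u)               # lens tilt
    L_sep = (1 + M) * f * np.sin(theta_c) / M                  # lens separation
    D_sep = (1 + M)**2 * f * np.sin(theta_c) / M               # detector separation
    theta_D = np.arctan((1 + M) * np.cos(theta_c) * np.sin(theta_c)
                        / (1 - (1 + M) * np.sin(theta_c)**2))  # detector tilt
    return L_sep, D_sep, theta_c, theta_D

# Illustrative off-axis camera: f = 36 mm, unit magnification, 15 mm offset
L_sep, D_sep, theta_c, theta_D = scheimpflug_geometry(f=36e-3, M=1.0,
                                                      r_c=15e-3, u=80e-3)
```

Note that for $M=1$ the detector separation is exactly twice the lens separation, and the detector tilt always exceeds the lens tilt, consistent with the geometry of Fig. 6(a).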


Table 1. Experimental parameters of the proposed prototype.

6.2 Experimental setup

Each camera in our MOFPM was equipped with an 8 mm diameter, 36 mm focal-length achromatic lens (Edmund Optics) and a DMM 37ux264-ML 2448$\times$2048 5-megapixel monochrome sensor (3.45 µm pixel size, capable of $38$ FPS). The trigger inputs of all cameras were connected to the LED arrays, such that image capture is initiated by LED illumination. Two LED arrays were used to capture raw images: a $32\times 32$ Adafruit LED array and a Tindie $384$-well RGB LED microplate light panel. Comparison datasets acquired using a single camera require $21\times 21$ ($441$ total) LEDs, for which the $32\times 32$ Adafruit LED array was used. However, this LED array allows illumination of only a single LED at a time. For LED-multiplexed illumination, which requires simultaneous illumination by multiple LEDs, the Tindie LED array was used. Both LED arrays provide the same light intensity, and they were positioned such that the spatial-frequency overlap ($60\%$) remains the same, providing equivalent illumination conditions in both experiments. The total cost of the microscope components was estimated to be approximately $\$$6000, which is significantly lower than that of the commercial microscope systems used for other high-speed FPM applications [1,2]. Further applications could utilise an array of lower-cost cameras, such as those used for the $\$$150 FPM-based microscope [14].
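The quoted $60\%$ spatial-frequency overlap is the fractional area overlap of adjacent circular passbands in k-space, which follows from the standard circle-circle intersection formula. The sketch below is our own illustration of that calculation, not code from the experiment:

```python
import numpy as np

def pupil_overlap_fraction(d, R):
    """Fractional area overlap of two circular pupils of radius R in
    k-space whose centres are separated by d. The LED spacing and array
    distance are chosen so that this fraction is ~60% in our system."""
    if d >= 2 * R:
        return 0.0                                   # disjoint passbands
    lens_area = (2 * R**2 * np.arccos(d / (2 * R))
                 - 0.5 * d * np.sqrt(4 * R**2 - d**2))  # circle-circle intersection
    return lens_area / (np.pi * R**2)                # normalise by one pupil area
```

The overlap falls monotonically from 100% at zero separation to 0% at one pupil diameter; a centre separation of roughly 0.63 pupil radii yields the ~60% overlap used here.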

6.3 Image acquisition

Time-sequential acquisition of a complete dataset for image reconstruction constitutes a single frame containing $51$ images per camera: $49$ images captured under illumination by each of the $49$ LEDs, a darkframe captured with all LEDs off, and one brightfield image used for image registration, enabling correction of possible microscope drift during imaging. A one-off correction of LED-position misalignment requires $\sim 9$ additional images per camera, captured once prior to longitudinal imaging. Only the $49$ brightfield/darkfield images must be captured in quick succession; the remainder can be obtained while the cameras are idle (between longitudinal frame captures). Lastly, colour images were obtained by capturing separate frames under illumination by red, green or blue LEDs. Stacking these monochrome reconstructions yields a single RGB colour image.
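The per-frame acquisition sequence described above can be sketched as a simple triggered loop. The `Camera` and `LEDArray` classes below are hypothetical stand-ins for the hardware drivers, not the actual camera API:

```python
class Camera:
    """Stand-in for a hardware-triggered camera driver (hypothetical)."""
    def __init__(self, cam_id):
        self.id = cam_id
    def capture(self):
        return f"image-from-camera-{self.id}"   # placeholder for a 2448x2048 frame

class LEDArray:
    """Stand-in for the LED driver whose trigger line starts exposure."""
    def illuminate(self, led):
        pass
    def off(self):
        pass

def acquire_frame(cameras, leds, led_sequence):
    """One longitudinal frame per camera: a darkframe (all LEDs off),
    one registration brightfield, then 49 LED-illuminated images."""
    frame = {cam.id: [] for cam in cameras}
    for cam in cameras:                    # darkframe, all LEDs off
        frame[cam.id].append(cam.capture())
    leds.illuminate("central")             # registration brightfield
    for cam in cameras:
        frame[cam.id].append(cam.capture())
    leds.off()
    for led in led_sequence:               # 49 sequential LED images
        leds.illuminate(led)               # trigger line starts exposure
        for cam in cameras:
            frame[cam.id].append(cam.capture())
        leds.off()
    return frame

cameras = [Camera(i) for i in range(9)]
frame = acquire_frame(cameras, LEDArray(), range(49))
# each camera accumulates 51 images: 1 dark + 1 brightfield + 49 LED
```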

When all nine cameras are operated in parallel, the frame rate is reduced from $38$ FPS (for an isolated camera) to $10$ FPS. For our experiments, we connected $2-3$ cameras per USB PCIe card on a standard tabletop computer and used Python scripts for image acquisition. All images were captured at the maximum available frame rate of 10 FPS, but upgrading the USB PCIe cards and using native image-acquisition code would enable an almost four-fold increase in image-acquisition speed.

6.4 LED multiplexed image acquisition

In MOFPM, each camera can be considered as a standalone conventional FPM microscope with tilted optical components. Given the equivalence between MOFPM and FPM, the LED array can be multiplexed using the same principles and constraints as conventional FPM: captured diffracted fields should not overlap in the spectrum, and LED illumination from within the objective NA (resulting in brightfield images) should not be mixed with LED illumination from outside the NA (resulting in darkfield images) [1,2]. In our system only the central camera can capture brightfield images, whereas the off-axis cameras capture only darkfield images. In total, $49$ LEDs are used for data acquisition, $9$ of which produce brightfield images in the central camera. Since all $9$ brightfield LEDs overlap in the spectral domain, they could not be multiplexed. Hence, the brightfield images were captured time-sequentially, while the remaining $40$ darkfield images were captured using either $2$-LED or $4$-LED multiplexing. The total number of captured images was thereby reduced by about half: from $49$ to $29$ ($2$ LEDs in parallel) or to $19$ ($4$ LEDs in parallel), reducing the capture time from 5 s to 3 s and 2 s respectively. However, the reconstruction-quality requirements were satisfied only for $2$-LED multiplexing, which was therefore used for data reconstruction.
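The spectral non-overlap constraint used to form the $2$-LED groups can be sketched as a greedy pairing over illumination wavevectors. This is our own illustrative sketch of the constraint, not the authors' scheduling code; the LED layout and pupil radius are assumed values:

```python
import numpy as np

def multiplex_pairs(k_leds, pupil_radius):
    """Greedily pair dark-field LEDs whose passbands do not overlap in
    the spectrum, i.e. whose illumination wavevectors are separated by
    more than one pupil diameter (2 * pupil_radius)."""
    remaining = list(range(len(k_leds)))
    groups = []
    while remaining:
        i = remaining.pop(0)
        partner = None
        for j in remaining:
            if np.linalg.norm(k_leds[i] - k_leds[j]) > 2 * pupil_radius:
                partner = j
                break
        if partner is not None:
            remaining.remove(partner)
            groups.append((i, partner))
        else:
            groups.append((i,))          # no valid partner: capture alone
    return groups

# Illustrative: 8 LEDs on a line in k-space, unit spacing, pupil radius 0.4
k_leds = np.array([[x, 0.0] for x in range(8)], dtype=float)
groups = multiplex_pairs(k_leds, pupil_radius=0.4)
```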

6.5 Image reconstruction

All images were reconstructed using the quasi-Newton engine [20], whose convergence was accelerated with adaptive momentum (ADAM) [48]. We obtained a low-resolution reconstruction of the sample and the pupil from the central camera, which was used as an initial estimate for the MOFPM reconstructions. Central-camera reconstructions required up to 250 iterations to recover the optical aberrations. Afterwards, up to 100 iterations per camera were performed to reach convergence. Reconstructions were performed by splitting the $2448 \times 2048$ pixel FOV into 80 segments ($256 \times 256$ pixels each) to mitigate the issue of spatially varying aberrations. Reconstruction of a single FOV segment took 5 minutes on an NVIDIA GeForce GTX 1080 Ti GPU, producing a $2048 \times 2048$ pixel image. Since our microscope was finite-conjugate, the non-telecentric geometry produced a phase curvature in the sample plane. To avoid artefacts in the reconstruction, we used a phase-curvature correction method [49]. Since the reconstructed image segments in MOFPM are visually identical to those reconstructed using conventional FPM, the same stitching methods can be used to produce a single wide-field image. All image segments were blended together in ImageJ [50] to produce full-FOV images without visible discontinuities.
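Tiling the full FOV so that aberrations are approximately spatially invariant within each segment can be sketched as follows. The $256\times256$ tile size matches the segments used above; the overlap step is an illustrative assumption:

```python
import numpy as np

def split_into_tiles(image, tile=256, step=224):
    """Split a full-FOV image into overlapping tile x tile segments
    (overlap = tile - step), clamping the final row/column of tiles
    to the image edge so that the whole FOV is covered."""
    H, W = image.shape
    ys = list(range(0, H - tile, step)) + [H - tile]
    xs = list(range(0, W - tile, step)) + [W - tile]
    tiles = [image[y:y + tile, x:x + tile] for y in ys for x in xs]
    coords = [(y, x) for y in ys for x in xs]
    return tiles, coords

# Illustrative: tile a small synthetic field
tiles, coords = split_into_tiles(np.zeros((512, 512)))
```

Each tile is then reconstructed independently with its own pupil estimate, and the recovered segments are blended back together at the recorded coordinates.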

Funding

Engineering and Physical Sciences Research Council (EP/L016753/1).

Acknowledgements

We would like to thank Leo Carlin, Robert Insall, Peter Thomason and Nikki Paul from the Beatson institute for providing live cell samples for our experiments.

Disclosures

The authors declare no conflicts of interest.

Data Availability

We provide MOFPM reconstruction software in [51] and data upon request.

Supplemental document

See Supplement 1 for supporting content.

References

1. L. Tian, Z. Liu, L.-H. Yeh, M. Chen, J. Zhong, and L. Waller, “Computational illumination for high-speed in vitro fourier ptychographic microscopy,” Optica 2(10), 904–911 (2015). [CrossRef]  

2. L. Tian, X. Li, K. Ramchandran, and L. Waller, “Multiplexed coded illumination for Fourier Ptychography with an LED array microscope,” Biomed. Opt. Express 5(7), 2376–2389 (2014). [CrossRef]  

3. Y. Xiao, S. Wei, S. Xue, C. Kuang, A. Yang, M. Wei, H. Lin, and R. Zhou, “High-speed fourier ptychographic microscopy for quantitative phase imaging,” Opt. Lett. 46(19), 4785–4788 (2021). [CrossRef]  

4. Y. Han, Y. Gu, A. C. Zhang, and Y.-H. Lo, “Review: imaging technologies for flow cytometry,” Lab Chip 16(24), 4639–4647 (2016). [CrossRef]  

5. D. Mendlovic and A. W. Lohmann, “Space-bandwidth product adaptation and its application to superresolution: fundamentals,” J. Opt. Soc. Am. A 14(3), 558–562 (1997). [CrossRef]  

6. O. S. Cossairt, D. Miau, and S. K. Nayar, “Scaling law for computational imaging using spherical optics,” J. Opt. Soc. Am. A 28(12), 2540–2553 (2011). [CrossRef]  

7. R. Horstmeyer, “Computational microscopy: Turning megapixels into gigapixels,” Ph.D. thesis (2016).

8. W. E. Ortyn, D. J. Perry, V. Venkatachalam, L. Liang, B. E. Hall, K. Frost, and D. A. Basiji, “Extended depth of field imaging for high speed cell analysis,” Cytometry 71A(4), 215–231 (2007). [CrossRef]  

9. D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, “Multiscale gigapixel photography,” Nature 486(7403), 386–389 (2012). [CrossRef]  

10. J. Fan, J. Suo, J. Wu, H. Xie, Y. Shen, F. Chen, G. Wang, L. Cao, G. Jin, Q. He, T. Li, G. Luan, L. Kong, Z. Zheng, and Q. Dai, “Video-rate imaging of biological dynamics at centimetre scale and micrometre resolution,” Nat. Photonics 13(11), 809–816 (2019). [CrossRef]  

11. A. Lohmann, “Scaling laws for lens systems,” Appl. Opt. 28(23), 4996–4998 (1989). [CrossRef]  

12. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution fourier ptychographic microscopy,” Nat. Photonics 7(9), 739–745 (2013). [CrossRef]  

13. P. C. Konda, L. Loetgering, K. C. Zhou, S. Xu, A. R. Harvey, and R. Horstmeyer, “Fourier ptychography: current applications and future promises,” Opt. Express 28(7), 9603–9630 (2020). [CrossRef]  

14. T. Aidukas, R. Eckert, A. R. Harvey, L. Waller, and P. C. Konda, “Low-cost, sub-micron resolution, wide-field computational microscopy using opensource hardware,” Sci. Rep. 9(1), 7457 (2019). [CrossRef]  

15. S. Dong, K. Guo, P. Nanda, R. Shiradkar, and G. Zheng, “Fpscope: a field-portable high-resolution microscope using a cellphone lens,” Biomed. Opt. Express 5(10), 3305–3310 (2014). [CrossRef]  

16. A. C. S. Chan, J. Kim, A. Pan, H. Xu, D. Nojima, C. Hale, S. Wang, and C. Yang, “Parallel fourier ptychographic microscopy for high-throughput screening with 96 cameras (96 eyes),” Sci. Rep. 9(1), 11114 (2019). [CrossRef]  

17. J. M. Rodenburg and H. M. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85(20), 4795–4797 (2004). [CrossRef]  

18. A. M. Maiden and J. M. Rodenburg, “An improved ptychographical phase retrieval algorithm for diffractive imaging,” Ultramicroscopy 109(10), 1256–1262 (2009). [CrossRef]  

19. P. Thibault and M. Guizar-Sicairos, “Maximum-likelihood refinement for coherent diffractive imaging,” New J. Phys. 14(6), 063004 (2012). [CrossRef]  

20. L.-H. Yeh, J. Dong, J. Zhong, L. Tian, M. Chen, G. Tang, M. Soltanolkotabi, and L. Waller, “Experimental robustness of fourier ptychography phase retrieval algorithms,” Opt. Express 23(26), 33214–33240 (2015). [CrossRef]  

21. A. Maiden, D. Johnson, and P. Li, “Further improvements to the ptychographical iterative engine,” Optica 4(7), 736–745 (2017). [CrossRef]  

22. M. Odstrcil, A. Menzel, and M. Guizar-Sicairos, “Iterative least-squares solver for generalized maximum-likelihood ptychography,” Opt. Express 26(3), 3108–3123 (2018). [CrossRef]  

23. H. Zhang, S. Jiang, J. Liao, J. Deng, J. Liu, Y. Zhang, and G. Zheng, “Near-field fourier ptychography: super-resolution phase retrieval via speckle illumination,” Opt. Express 27(5), 7498–7512 (2019). [CrossRef]  

24. G. Zhou, S. Zhang, Y. Hu, and Q. Hao, “Adaptive high-dynamic-range fourier ptychography microscopy data acquisition with a red-green-blue camera,” Opt. Lett. 45(17), 4956–4959 (2020). [CrossRef]  

25. X. He, C. Liu, and J. Zhu, “Single-shot fourier ptychography based on diffractive beam splitting,” Opt. Lett. 43(2), 214–217 (2018). [CrossRef]  

26. X. He, C. Liu, and J. Zhu, “Single-shot aperture-scanning fourier ptychography,” Opt. Express 26(22), 28187–28196 (2018). [CrossRef]  

27. J. Sun, Q. Chen, J. Zhang, Y. Fan, and C. Zuo, “Single-shot quantitative phase microscopy based on color-multiplexed fourier ptychography,” Opt. Lett. 43(14), 3365–3368 (2018). [CrossRef]  

28. B. Lee, J.-y. Hong, D. Yoo, J. Cho, Y. Jeong, S. Moon, and B. Lee, “Single-shot phase retrieval via fourier ptychographic microscopy,” Optica 5(8), 976–983 (2018). [CrossRef]  

29. M. Chen, Z. F. Phillips, and L. Waller, “Quantitative differential phase contrast (dpc) microscopy with computational aberration correction,” Opt. Express 26(25), 32888–32899 (2018). [CrossRef]  

30. Y. Li, C. Shen, J. Tan, X. Wen, M. Sun, G. Huang, S. Liu, and Z. Liu, “Fast quantitative phase imaging based on kramers-kronig relations in space domain,” Opt. Express 29(25), 41067–41080 (2021). [CrossRef]  

31. P. C. Konda, J. M. Taylor, and A. R. Harvey, “Multi-aperture fourier ptychographic microscopy, theory and validation,” Opt. Lasers Eng. 138, 106410 (2021). [CrossRef]  

32. J. R. Fienup, “Synthetic-aperture direct-detection coherent imaging,” in Unconventional and Indirect Imaging, Image Reconstruction, and Wavefront Sensing 2017, (2017), September 2017, p. 9.

33. A. K. Prasad and K. Jensen, “Scheimpflug stereocamera for particle image velocimetry in liquid flows,” Appl. Opt. 34(30), 7092–7099 (1995). [CrossRef]  

34. P. C. Konda, J. M. Taylor, and A. R. Harvey, “Scheimpflug multi-aperture Fourier ptychography: coherent computational microscope with gigapixels/s data acquisition rates using 3D printed components,” in High-Speed Biomedical Imaging and Spectroscopy: Toward Big Data Instrumentation and Management II, vol. 10076K. K. Tsia and K. Goda, eds. (International Society for Optics and Photonics, 2017), p. 100760R.

35. R. Eckert, Z. F. Phillips, and L. Waller, “Efficient illumination angle self-calibration in fourier ptychography,” Appl. Opt. 57(19), 5434–5442 (2018). [CrossRef]  

36. M. Dierolf, P. Thibault, A. Menzel, C. M. Kewish, K. Jefimovs, I. Schlichting, K. v. König, O. Bunk, and F. Pfeiffer, “Ptychographic coherent diffractive imaging of weakly scattering specimens,” New J. Phys. 12(3), 035017 (2010). [CrossRef]  

37. C. Scott, P. Schaap, and M. Schaechter, “Dictyostelium,” in Encyclopedia of Microbiology (Third Edition), (Academic Press, Oxford, 2009), pp. 606–616.

38. H. Hashimura, Y. V. Morimoto, M. Yasui, and M. Ueda, “Collective cell migration of dictyostelium without camp oscillations at multicellular stages,” Commun. Biol. 2(1), 34 (2019). [CrossRef]  

39. J.-P. Levraud, M. Adam, M.-F. Luciani, C. de Chastellier, R. L. Blanton, and P. Golstein, “Dictyostelium cell death : early emergence and demise of highly polarized paddle cells,” J. Cell Biol. 160(7), 1105–1114 (2003). [CrossRef]  

40. S. L. Moores, J. H. Sabry, and J. A. Spudich, “Myosin dynamics in live dictyostelium cells,” Proc. Natl. Acad. Sci. U. S. A. 93(1), 443–446 (1996). [CrossRef]  

41. Z. Liu, L. Tian, S. Liu, and L. Waller, “Real-time brightfield, darkfield, and phase contrast imaging in a light-emitting diode array microscope,” J. Biomed. Opt. 19(10), 1 (2014). [CrossRef]  

42. R. Horstmeyer, J. Chung, X. Ou, G. Zheng, and C. Yang, “Diffraction tomography with fourier ptychography,” Optica 3(8), 827–835 (2016). [CrossRef]  

43. P. Li and A. Maiden, “Multi-slice ptychographic tomography,” Sci. Rep. 8(1), 2049 (2018). [CrossRef]  

44. M. Kellman, E. Bostan, M. Chen, and L. Waller, “Data-driven design for fourier ptychographic microscopy,” in 2019 IEEE International Conference on Computational Photography (ICCP), (IEEE, 2019), pp. 1–8.

45. B. Lin, J. Zhao, G. Cui, P. Zhang, and X. Wu, “Efficient multiplexed illumination and imaging approach for fourier ptychographic microscopy,” J. Opt. Soc. Am. A 39(5), 883–896 (2022). [CrossRef]  

46. S. Dong, R. Horstmeyer, R. Shiradkar, K. Guo, X. Ou, Z. Bian, H. Xin, and G. Zheng, “Aperture-scanning fourier ptychography for 3d refocusing and super-resolution macroscopic imaging,” Opt. Express 22(11), 13586–13599 (2014). [CrossRef]  

47. X. Ou, J. Chung, R. Horstmeyer, and C. Yang, “Aperture scanning fourier ptychographic microscopy,” Biomed. Opt. Express 7(8), 3140–3150 (2016). [CrossRef]  

48. T. Aidukas, “Next generation fourier ptychographic microscopy: computational and experimental techniques,” Ph.D. thesis, University of Glasgow (2021).

49. T. Aidukas, L. Loetgering, and A. R. Harvey, “Addressing phase-curvature in fourier ptychography,” Opt. Express 30(13), 22421–22434 (2022). [CrossRef]  

50. S. Preibisch, S. Saalfeld, and P. Tomancak, “Globally optimal stitching of tiled 3d microscopic image acquisitions,” Bioinformatics 25(11), 1463–1465 (2009). [CrossRef]  

51. T. Aidukas, “Phase curvature correction in fpm,” https://figshare.com/articles/software/Phase_curvature_correction_in_FPM/19137611 (2022).

Supplementary Material (5)

Supplement 1: Theory and computational methods.
Visualization 1: Longitudinal image sequence of the recorded raw data.
Visualization 2: Longitudinal image sequence of the recorded raw data (zoomed in).
Visualization 3: Reconstructed longitudinal image sequence (zoomed in).
Visualization 4: Reconstructed longitudinal image sequence.


Figures (6)

Fig. 1.
Fig. 1. In ptychographic imaging, the goal is to collect multiple images encoding various spatial frequencies of the sample. Which spatial frequencies pass through the pass-band of the optical system and are detected depends on either the illumination angle or the position of the aperture. In multi-objective Fourier ptychography, a combination of illumination angles and aperture positions is used to design the optical system. The addition of multiple apertures enables a reduction in the number of illumination angles used without loss of reconstructed image quality or resolution. Order-of-magnitude improvements in image-capture speed can be achieved through parallelised image capture, enabling an experimental configuration providing near-snapshot gigapixel imaging. A picture of our nine-camera experimental prototype is shown on the left, with a CAD design on the right.
Fig. 2.
Fig. 2. In MOFPM, the frequency spectrum can be intercepted by multiple cameras in parallel, where each captured image encodes a unique frequency band of the sample. As a result, this image-acquisition method provides a scaleable improvement in image-acquisition speed proportional to the number of cameras. While each spectrum represents a unique frequency band of the sample, the sensors are mutually incoherent, requiring modifications of the conventional FPM forward model and reconstruction methods.
Fig. 3.
Fig. 3. A USAF target was imaged for quantitative assessment of resolution and image quality using MOFPM. A conventional microscope image (a) and FPM reconstructions using $441$ LEDs (b) and $49$ LEDs (c) illustrate the trade-off between the speed of image acquisition and the resolution of the reconstructed image. With MOFPM (d) we achieve a resolution of 1.1 µm, equal to that obtained using $441$ LEDs with conventional FPM, but with a reduction to only 5 s for data capture due to the use of only $49$ LEDs and parallel data capture. LED multiplexing enables the image-acquisition time to be reduced even further to <3 s while maintaining the same reconstructed image resolution (e).
Fig. 4.
Fig. 4. High-SBP $89$-megapixel MOFPM reconstruction of a lung carcinoma sample (a) with zoomed-in sections (b1,c1,d1). The reconstruction quality is significantly improved compared to the raw data with $2$-megapixel SBP (b2,c2,d2). We also demonstrate compatibility with LED multiplexing (c3) and the possibility of quantitative phase imaging (c4).


Equations (3)

$$I_{i,c}(\mathbf{r}) = T_c\left|\mathcal{F}\left\{O(\mathbf{k}-\mathbf{k}_i-\mathbf{k}_c)\,P_c(\mathbf{k})\right\}\right|^{2}.$$

$$I_{i,c}(\mathbf{r}) = \left|\mathcal{F}\left\{O(\mathbf{k}-\mathbf{k}_i-\mathbf{k}_c)\,P_c(\mathbf{k})\right\}\right|^{2}.$$

$$\begin{aligned} L_{sep} &= (1+M) f\sin(\theta_c) / M\\ D_{sep} &= (1+M)^{2} f\sin(\theta_c) / M\\ \theta_D &= \tan^{-1}\left(\frac{(1+M)\cos(\theta_c)\sin(\theta_c)}{1 - (1+M)\sin^{2}(\theta_c)}\right)\\ \theta_c &= \tan^{-1}\left(\frac{|\mathbf{r}_i|}{z}\right) = \tan^{-1}\left(\frac{|\mathbf{r}_c|}{u}\right) \end{aligned}$$