
Three-stage full-wave simulation architecture for in-depth analysis of microspheres in microscopy

Open Access

Abstract

Over the past decade, considerable progress has been made in microsphere microscopy; the popularity of this method is attributable to its compatibility with biomedical applications. Although the technique has been applied extensively, the lack of analyses and simulation approaches capable of explaining the experimental observations has hampered its theoretical development. In this paper, a three-stage full-wave simulation architecture is presented for the in-depth analysis of the imaging properties of microspheres. The architecture consists of forward and backward propagation mechanisms, following the concept of geometric optics while strictly complying with wave optics at each stage. Three numerical simulation methods, namely FDTD, NTFF, and ASPW, are integrated into this architecture to encompass near-field and far-field behaviors and relieve the computational burden. We validated the architecture by comparing our simulation results with experimental data reported in the literature. The results confirmed that the proposed architecture exhibits high consistency both qualitatively and quantitatively. Using this architecture, we demonstrated the near-field effect of the samples on the resolution and provided evidence to explain conflicting reports in the literature. Moreover, the flexibility and versatility of the proposed architecture in modeling allow adaptation to various scenarios in microsphere microscopy. The results of this study, as an imaging analysis and system design platform, may facilitate the development of microsphere microscopy for biomedical imaging, wafer inspection, and other potential applications.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Microsphere microscopy is an emerging research field in optical microscopy, which has gained prominence in the last decade after the discovery of photonic nanojets (PNJs) [1–5], an anomalous near-field enhancement of light focusing. In contrast to other far-field super-resolution techniques, this method provides a simple, inexpensive, and fluorescence-free super-resolution imaging configuration by introducing a microsphere or a microsolid immersion lens (mSIL) on top of or in close proximity to samples in a conventional, diffraction-limited optical microscope system. Magnified virtual images are formed in the near field first and then transmitted to the far field using an optical microscope. Several studies have demonstrated that this method can achieve a resolution beyond the diffraction limit [6–14]. Microsphere microscopy can be combined with a scanning configuration to realize large-area super-resolution imaging [15–18]. With its noninvasive, fluorescence-free, and high-acquisition-rate properties, microsphere microscopy can be used in biomedical applications, particularly real-time inspection of live organisms.

Although extensive experiments have been performed to exploit the super-resolution capability of microsphere microscopy, its working principle remains unclear and debatable. Several analyses and simulations have been performed in the last few years to establish a comprehensive model of microsphere microscopy for super-resolution imaging. However, limited direct evidence has been found to thoroughly explain the experimental observations. Most studies have been limited to the analysis of PNJs and have regarded the PNJ beam waist as the measure of resolving power based on the reciprocity principle [8,10,19,20]. However, this approach may not be reliable because the position at which the PNJ occurs usually differs from the object plane in practice. PNJ formation represents only the focusing properties, which are not equivalent to the imaging properties. The insufficient correlation between focusing and imaging properties hinders quantitative resolution evaluation in simulation as well as design optimization. In addition, several factors, such as the materials of samples and substrates [8,16], the excitation of electromagnetic modes [21,22], and the combination with other microscopy configurations [11], are involved in the mechanism of super-resolution imaging.

The major difficulty in simulation and analysis is understanding the intricate mechanism of virtual image formation at the microscopic scale, particularly with resolutions beyond the diffraction limit. Virtual images in the near field cannot be reconstructed using geometric optics; instead, the wave properties must be analyzed thoroughly. Some researchers, such as Duan et al. [21] and Maslov et al. [22–24], have used the concept of “backward propagation,” numerically propagating the wave backward to the image plane based on the reciprocity principle. This method can effectively reconstruct virtual images, allowing analysis of the imaging properties. However, the validity of such simulation results cannot be established without comparison with experimental results from other studies. Duocastella et al. [16] combined backward propagation with the finite-difference time-domain (FDTD) method to take advantage of the versatility of FDTD in modeling electromagnetic waves. The predicted position of the image plane in their simulation coincided with that in the experiment, whereas the magnification deviated slightly. However, the established models are not perfect, and a comprehensive model is essential to further investigate the imaging mechanism of microspheres.

In this paper, we present a three-stage full-wave simulation architecture that can directly visualize the virtual images generated by microspheres and evaluate their imaging properties, such as the position of the image plane, magnification, and resolution, allowing advanced optimization of structural parameters and multiscale optical systems. Three numerical simulation methods are integrated, namely FDTD, near-to-far-field transformation (NTFF), and the angular spectrum of plane waves (ASPW), which not only encompass near-field and far-field behaviors but also relieve the computational burden. The proposed simulation architecture is verified by comparison with the experimental observations in other studies. Moreover, the proposed simulation architecture is versatile and can be adapted to various scenarios. By exploiting its flexibility, we successfully demonstrated the sample-dependent resolution of microsphere microscopy, which has not been demonstrated in other simulations.

2. Simulation architecture

In geometric optics, virtual image formation can be described by extending diverging rays backward to a virtual intersection. We constructed a simulation architecture based on a similar concept but conforming to wave optics, as shown in Fig. 1. This architecture is composed of two parts, namely forward and backward propagation. Forward propagation models the near-field light–matter interaction and captures as much of the angular spectrum of the emanating wave as possible by projecting the fields onto a wide projection plane. Backward propagation propagates the collected fields backward to reconstruct virtual images while neglecting the microsphere in the near field. Each part can be further divided into subprocesses: (1) full-wave simulation, (2) near-to-far-field transformation (NTFF) for forward propagation, (3) angular spectrum of plane waves (ASPW) using the fast Fourier transform (FFT), and (4) virtual image formation for backward propagation. With the exception of step (4), which is responsible for image analysis, the simulation of microspheres is primarily composed of three stages, steps (1) to (3). We outline the design in this section, and the implementation details are provided in Appendix I.


Fig. 1. Simulation architecture. (a) Forward propagation: full-wave simulation and near-to-far-field transformation (NTFF). (b) Backward propagation: angular spectrum of plane waves (ASPW) by using the fast Fourier transform (FFT), and virtual image formation.


2.1 Forward propagation

2.1.1 Full-wave simulation

Full-wave simulation methods based on Maxwell’s equations, including FDTD, FEM, the discrete dipole approximation, and the method of moments, are favorable for simulating versatile nanophotonic problems. In contrast to analytical approaches, such as Mie theory [25], that are appropriate only for certain geometries, these full-wave simulation methods are adaptable to different scenarios with various arrangements of structures and sources. We selected FDTD as the platform for implementing the model, although other methods are also acceptable with proper modification.

The simplest simulation model consists of a dipole source and a microsphere. A more complex model can be implemented by introducing samples with various structures and materials, as well as by changing the illumination configuration. Three-dimensional (3D) FDTD simulation was performed with perfectly matched layer (PML) boundaries placed as close as possible to the structures to minimize the demand for computational resources. Typically, the 3D-FDTD simulation region should be kept within 10 µm when computational resources are limited. Both electric and magnetic fields are recorded on the six boundaries surrounding the simulation region, which is distinct from [16], where only the fields on the top boundary are recorded. Using NTFF, which is described in the next section, this model collects the angular spectrum of the emanating waves efficiently.

2.1.2 NTFF transformation

Based on the principle of equivalence, the fields recorded on the boundaries in the previous step can be converted to equivalent electric and magnetic surface currents without changing the field distribution outside the boundaries, as given by Eq. (1) [26,27]:

$$\left\{ \begin{array}{l} \vec{M}_s = -\hat{n} \times \vec{E}\\ \vec{J}_s = \hat{n} \times \vec{H} \end{array} \right.,\tag{1}$$
where $\vec{E}$ and $\vec{H}$ are the recorded electric and magnetic fields on the boundaries, respectively; $\vec{M}_s$ and $\vec{J}_s$ are the equivalent surface magnetic and surface electric currents, respectively; and $\hat{n}$ is the surface normal. The fields within the simulation region are then considered zero everywhere, and thus the structures become negligible, resulting in a homogeneous space with equivalent current sources on the fictitious boundaries. The field in any region of interest outside the boundaries can then be obtained using the dyadic Green’s function. This is the NTFF transformation.
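For illustration, a minimal NumPy sketch of Eq. (1) for a single boundary face is given below; the array layout and the example normal are assumptions made for this sketch and do not reflect the data structures of the actual implementation.

```python
import numpy as np

def equivalent_surface_currents(E, H, n_hat):
    """Eq. (1): convert the fields recorded on one boundary face into equivalent
    magnetic (M_s) and electric (J_s) surface currents.
    E, H  : complex arrays of shape (3, N1, N2) holding the frequency-domain field
            components on that face (an illustrative layout).
    n_hat : outward unit normal of the face, e.g. (0, 0, 1) for the top boundary."""
    n = np.asarray(n_hat, dtype=float).reshape(3, 1, 1)
    M_s = -np.cross(n, E, axis=0)   # M_s = -n x E
    J_s = np.cross(n, H, axis=0)    # J_s =  n x H
    return M_s, J_s

# Example with placeholder fields on the top (+z) face of the simulation region:
E = np.zeros((3, 64, 64), dtype=complex)
H = np.zeros((3, 64, 64), dtype=complex)
M_s, J_s = equivalent_surface_currents(E, H, (0, 0, 1))
```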

NTFF is crucial to the proposed simulation architecture because it balances the demand for computational resources and the accuracy of the collected angular spectrum. Because the time and memory consumption of 3D full-wave simulation increase rapidly when the simulation region is scaled up, the simulation region must be defined economically. However, a limited simulation region, which truncates the fields outside the collection plane, leads to incomplete collection of the angular spectrum because of the windowing effect. NTFF mitigates this problem by projecting the fields on all boundaries of a limited region onto a considerably wider projection plane, achieving high accuracy of the angular spectrum without exhausting computational resources (Fig. 2(a)). In other words, NTFF effectively expands the simulation region without requiring additional computational resources. In practice, the projection plane is kept only a few wavelengths beyond the simulation region to enlarge the acceptance angle for a fixed plane width. Another advantage of using NTFF is that the orientation of the projection plane is flexible, which is particularly conducive to off-axis optical system design (Fig. 2(b)).


Fig. 2. Advantages of using NTFF. (a) Enlarging the acceptance angle of the projection plane $(\theta ^{\prime} > \theta )$. (b) Arbitrary orientation of the projection plane.


2.2 Backward propagation

2.2.1 ASPW using FFT

The time-reversal property of electromagnetic waves, also known as reciprocity, allows the wave emanating from a source to be reversed so as to backtrack the propagation process. The virtual images generated by near-field optical elements can be reconstructed in a similar manner. The space is considered homogeneous after NTFF, and scalar diffraction theory is valid in this case because the six components of the electromagnetic wave (${E_x}$, ${E_y}$, ${E_z}$, ${H_x}$, ${H_y}$, ${H_z}$) satisfy six identical scalar wave equations. In the following discussion, we use the electric field for the derivation and drop the polarization subscript to represent a scalar field.

To use scalar diffraction theory for backward propagation of the wave, we first conjugate the complex field (the phasor) recorded on the projection plane, which is equivalent to reversing the direction of the wave vectors:

$$E_i^{TR}(x, y) = \bar{E}_i(x, y),\tag{2}$$
Subsequently, we expanded the reversed field by 2D Fourier transform based on ASPW [28,29]:
$$A_i(k_x, k_y) = FF\{E_i^{TR}(x, y)\} = \int\!\!\!\int_{-\infty}^{\infty} E_i^{TR}(x, y)\exp[-i(k_x x + k_y y)]\,dx\,dy,\tag{3}$$
where ${A_i}({{k_x},{k_y}} )$ is the angular spectrum. Instead of using Fresnel diffraction for backward propagating the wave, which is suitable for small divergence angles, we used the transfer function for free-space propagation $T({{k_x},{k_y};\Delta z} )$:
$$T(k_x, k_y; \Delta z) = \exp\left[i(k^2 - k_x^2 - k_y^2)^{\frac{1}{2}}\,\Delta z\right],\tag{4}$$
where $\Delta z$ is the distance of backward propagation along the z-axis and k the wave number in free space. If the samples and microspheres are immersed in a medium with refractive index ${n_s}$, k should be replaced by $k{n_s}$. The angular spectrum ${A_f}({{k_x},{k_y}} )$ on the final plane can be obtained using Eq. (5):
$$A_f(k_x, k_y) = A_i(k_x, k_y)\,T(k_x, k_y; \Delta z),\tag{5}$$
Finally, we can obtain the fields on the final plane ${E_f}({x, y} )$ using Eq. (6):
$$E_f(x, y) = FF^{-1}\{A_f(k_x, k_y)\} = \frac{1}{4\pi^2}\int\!\!\!\int_{-\infty}^{\infty} A_f(k_x, k_y)\exp[i(k_x x + k_y y)]\,dk_x\,dk_y,\tag{6}$$
The flow of the backward wave propagation is summarized in Fig.  3. In practice, the aforementioned computation can be performed efficiently using the FFT. Because computers can only process discrete data, the sampling theorem must be followed to avoid aliasing. In theory, the sampling rate (${f_s}$) on the projection plane in NTFF should not be lower than twice the Nyquist frequency (${f_N}$):
$$f_s \ge 2f_N = \frac{2}{\lambda},\tag{7}$$
where $\lambda $ is the wavelength in the background material at which the projection plane locates.
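A minimal NumPy sketch of this backward-propagation flow (time reversal and Eqs. (2)–(6)) is given below; the grid spacing, wavelength, and input field are illustrative placeholders, and numpy.fft stands in for the pyFFTW routines used in practice.

```python
import numpy as np

def aspw_backpropagate(E_proj, dx, wavelength, delta_z, n_s=1.0):
    """Back-propagate one scalar field component from the projection plane by delta_z.
    E_proj     : complex 2D array, the phasor of one component on the projection plane.
    dx         : grid spacing on the plane (same units as wavelength and delta_z).
    wavelength : free-space wavelength; n_s is the background refractive index."""
    k = 2 * np.pi * n_s / wavelength

    # Eq. (2): time reversal by complex conjugation (reverses the wave vectors).
    E_tr = np.conj(E_proj)

    # Eq. (3): angular spectrum via the 2D FFT.
    A_i = np.fft.fft2(E_tr)
    kx = 2 * np.pi * np.fft.fftfreq(E_proj.shape[1], d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(E_proj.shape[0], d=dx)
    KX, KY = np.meshgrid(kx, ky)

    # Eq. (4): transfer function for free-space propagation; components with
    # kx^2 + ky^2 > k^2 decay through the complex square root.
    kz = np.sqrt((k ** 2 - KX ** 2 - KY ** 2).astype(complex))
    T = np.exp(1j * kz * delta_z)

    # Eqs. (5) and (6): propagate the spectrum and return to the spatial domain.
    return np.fft.ifft2(A_i * T)

# Illustrative usage: a 40-um-wide plane sampled every 200 nm and located 7 um above
# the object plane, propagated back to an image plane at z_f = -4.5 um (delta_z = 11.5 um).
E_proj = np.zeros((200, 200), dtype=complex)   # placeholder for the NTFF output
E_img = aspw_backpropagate(E_proj, dx=0.2e-6, wavelength=405e-9, delta_z=11.5e-6)
```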


Fig. 3. Flow of backward wave propagation using time reversal and Fourier transform based on ASPW. (TR: time reversal; FT: Fourier transform; IFT: inverse FT)


2.2.2 Virtual image formation and resolution evaluation metrics

The imaging process can be represented as the superposition of a collection of incoherent dipoles excited on the sample at various times within a long time span. The image obtained in experiments is the statistical average of those dipoles. This stochastic process is not trivial to reproduce in a full-wave simulation because it requires numerous repeated trials with random dipole locations, orientations, and phases based on the Monte Carlo method. We used an alternative method in our model to simplify the imaging process: three incoherent dipoles with orthogonal polarizations were evenly weighted to represent a point source on the object plane, and different point sources were mutually incoherent.
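As a sketch of this simplification (an illustration of the incoherent superposition, not the exact routine used here), the intensities of the three orthogonally polarized dipoles can be summed as follows; the array shapes are placeholders.

```python
import numpy as np

def incoherent_point_source_image(dipole_fields):
    """Form the image intensity of one point source by adding the intensities
    (not the complex fields) of its three orthogonally polarized dipoles.
    dipole_fields: list of three lists, each holding the back-propagated
    electric-field components (complex 2D arrays) of one dipole."""
    intensity = sum(sum(np.abs(c) ** 2 for c in comps) for comps in dipole_fields)
    return intensity / len(dipole_fields)   # even weighting of the three dipoles

# Illustrative placeholders: three dipoles, each with Ex, Ey, Ez on the image plane.
dipole_fields = [[np.zeros((64, 64), dtype=complex) for _ in range(3)] for _ in range(3)]
image = incoherent_point_source_image(dipole_fields)
# Distinct point sources are likewise mutually incoherent: their intensities add.
```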

The Rayleigh criterion is a commonly used metric for resolution evaluation; it states that two spots in a spot diagram are resolvable only if the intensity at the valley is less than 73.5% of the peak. However, the criterion is limited by irregular spot sizes and noise in the simulation, which affects its robustness. Instead, we define the effective resolution of a microsphere as follows: the smallest resolvable feature ($\Delta l$) is approximated by the ratio of the spot size of the point spread function ($d$) to the magnification ($M$):

$$\Delta l \cong \frac{d}{M},\tag{8}$$
Note that Eq. (8) differs crucially from the resolution definition in far-field microscopy, where the magnification does not affect the resolution. We selected the full width at half maximum (FWHM) to represent the spot size.
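A minimal sketch of this metric is given below, assuming a single-peaked 1D cross-section of the spot and linear interpolation of the half-maximum crossings; the Gaussian test spot and its 700-nm FWHM are illustrative numbers chosen only to reproduce the 2.8× / ~250-nm case discussed later.

```python
import numpy as np

def fwhm(x, profile):
    """Full width at half maximum of a single-peaked 1D intensity profile,
    with linear interpolation of the two half-maximum crossings."""
    profile = np.asarray(profile, dtype=float)
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    i0, i1 = above[0], above[-1]
    left = np.interp(half, [profile[i0 - 1], profile[i0]], [x[i0 - 1], x[i0]])
    right = np.interp(half, [profile[i1 + 1], profile[i1]], [x[i1 + 1], x[i1]])
    return right - left

def effective_resolution(spot_fwhm, magnification):
    """Eq. (8): the smallest resolvable feature is the spot size divided by M."""
    return spot_fwhm / magnification

# Worked example: a Gaussian spot with a 700-nm FWHM at 2.8x magnification
# corresponds to an effective resolution of about 250 nm.
x = np.linspace(-3e-6, 3e-6, 601)
spot = np.exp(-4 * np.log(2) * (x / 700e-9) ** 2)
print(effective_resolution(fwhm(x, spot), 2.8))   # ~2.5e-07 (meters)
```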

3. Validation and discussion

To validate the robustness of our simulation architecture, we reproduced the 3D simulation of a fused silica microsphere in [16]. The schematic is displayed in Fig.  4. The FDTD simulation region is about 5 µm in each direction. Instead of using the top boundary of the simulation region to collect the emanating wave, we applied NTFF to project the emanating wave onto a 40-µm-wide projection plane, equivalently expanding the effective simulation region by almost one order of magnitude in both the x- and y-directions. The projection plane is 7 µm away from the object plane, corresponding to an acceptance angle of approximately 140° and a numerical aperture of 0.94. The position of the image plane (${z_f}$) is defined with respect to the object plane. The wavelength is 405 nm.
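For reference, the quoted acceptance angle and numerical aperture follow from the stated geometry, i.e., a 20-µm half-width of the projection plane at a distance of 7 µm from the object plane:

$$\theta_{\max} = \tan^{-1}\!\left(\frac{20\ \mathrm{\mu m}}{7\ \mathrm{\mu m}}\right) \approx 70.7^\circ, \qquad 2\theta_{\max} \approx 141^\circ, \qquad \mathrm{NA} = \sin\theta_{\max} \approx 0.94.$$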


Fig. 4. Schematic of simulation of a fused silica microsphere (${X_{src}}$: lateral shift of the dipole).


The images and cross-section plots of two incoherent point sources are depicted in Fig.  5. These results concur with those in [16]. At the plane 4.5 µm below the object plane, two point sources separated by 300 nm were resolvable (Fig.  5(a-b)), whereas those separated by 250 nm were barely resolved (Fig.  5(c-d)) according to the Rayleigh criterion. When the separation distance was 200 nm, the two spots on the image plane completely merged (Fig.  5(e-f)). The magnification and effective resolution at various positions are illustrated in Fig.  6. First, the magnification at ${z_f} = $ −4.5 µm is 2.8 in our simulation, which is identical to the experimental result in [16], whereas their simulation indicated a magnification of 3.1. The predicted effective resolution was approximately 250 nm, which is consistent with their experimental results and the Rayleigh diffraction limit. Moreover, our results suggest that, as the distance between the image plane and the object plane increases up to 9 µm, the effective resolution converges to approximately 240 nm even though the magnification increases. The x–z cross-section plots, illustrated in Fig.  7, present elongated beams extending from −3 to −9 µm. This is in accordance with the experimental observation in [16], referred to as “empty magnification.” Such high consistency with the experiment justifies the reliability of our simulation architecture for microspheres.


Fig. 5. Virtual images and cross-section plots of two incoherent point sources. (a-b) ${z_f} = $ −4.5 µm, 2${X_{src}} = $ 300 nm. (c-d) ${z_f} = $ −4.5 µm, 2${X_{src}} = $ 250 nm. (e-f) ${z_f} = $ −4.5 µm, 2${X_{src}} = $ 200 nm.


Fig. 6. Magnification and effective resolution with respect to the position of the image plane when the point sources are excited in the free space.


Fig. 7. Cross-section plots of the backward propagated wave for the fused silica microsphere. (a) x-polarized dipole. (b) y-polarized dipole. Each row from left to right has ${X_{src}} = $ 0, 100, 125, 150, 200 nm, respectively.


To demonstrate the effectiveness of NTFF in our simulation architecture, we calculated the spot diagrams at various positions, as illustrated in Fig.  8. For comparison, the FDTD simulation region was increased by 2.5 times in both the x- and y-directions, resulting in a simulation region approximately five times larger than that in Fig.  8(a), where the standard setup of our simulation architecture was used. In this enlarged simulation, the fields were collected directly at the top boundary for backward propagation, without NTFF. Although the simulation region used in Fig.  8(b) is five times larger than that in Fig.  8(a), the noise is more noticeable, which can be attributed to the windowing effect caused by directly truncating the fields at the collection plane. In Fig.  8(a), it is evident that NTFF effectively removes the undesirable noise, providing high accuracy with fewer computational resources and less time consumption.


Fig. 8. Spot diagrams at different positions. (a) 5.0-µm simulation width with NTFF. (b) 12.6-µm simulation width without NTFF. The value in each inset is the position of the plane with respect to the object plane.


Inconsistent resolution in microsphere microscopy has been observed in several studies [8,16]. A widely acknowledged rationale is that the near-field effect caused by the variety of samples plays an essential role in microsphere microscopy. To validate this property, we introduced periodic gold nanorods on a glass substrate as the sample beneath the microsphere in the FDTD simulation. Two situations are compared: one with the gold nanorods and the other without. The results are illustrated in Fig.  9. The outline of each point source with 200-nm separation was distinguishable when the gold nanorods were present in linear (Fig.  9(a)) and hexagonal arrangements (Fig.  9(c)), whereas the point sources merged and were not distinguishable when the gold nanorods were absent (Fig.  9(b) and (d)). The presence of the gold nanorods results in a shrinkage of the spot size by approximately 100–200 nm. Thus, the effective resolution predicted by our model was 200 nm, whereas the magnification remained unchanged (Fig.  10). To the best of our knowledge, this is the first time that the sample-dependent imaging property of microsphere microscopy has been demonstrated in simulation. This success is attributed to the flexibility of our simulation architecture, which allows versatile configurations to be implemented and analyzed.


Fig. 9. Virtual images of incoherent point sources with linear and hexagonal pattern (a)(c) with and (b)(d) without gold nanorods.


Fig. 10. Magnification and effective resolution with respect to the position of image plane when the point sources are excited with the gold nanorods.


This sample-dependent imaging property is a distinctive feature of near-field microscopy, in which the coupling between near-field components and samples can drastically alter the propagating characteristics in microsphere microscopy. In other words, the near-field coupling generates evanescent waves, which are subsequently converted into propagating waves by the microsphere and sustained in the far field. The simulation results indicate that no significant near-field coupling was present in the experiment in [16] and that those results were determined mainly by the “far-field” imaging properties of the microsphere. Therefore, the results in [16] do not disprove the super-resolution capability of microsphere microscopy demonstrated in other studies [7–10]. We can further infer that the imaging mechanism of microspheres can be divided into at least two parts: near-field light–matter interaction and far-field image formation. Far-field image formation can be analyzed precisely by exciting point sources without the presence of samples. However, the complexity of the near-field light–matter interaction largely impedes the precise analysis of microsphere microscopy. In addition, the illumination configuration is critical to the resolution.

These factors can be included in our simulation architecture with appropriate implementation. For instance, samples with different materials and structures can be used to analyze how the near-field light–matter interaction influences the resolution. Likewise, the imaging characteristics of various geometries of near-field microlenses can be compared quantitatively. Furthermore, our simulation architecture allows both coherent and incoherent illumination configurations, and other types of illumination, such as oblique illumination and probe illumination, are feasible as well. Immersion conditions, which affect the spot size of the Airy disk as well as the refractive index contrast of the microsphere, are also easy to implement in our architecture. Last but not least, the flexibility of the projection plane greatly facilitates the design and analysis of off-axis optical systems, which is useful in multi-channel system designs that pick up the images generated by microspheres in different directions, one potential approach to broadening the field of view of microsphere microscopy. The three-stage design of our simulation architecture captures the interplay between light and samples, the focusing effect of microspheres, and the wave propagation characteristics in the far field, allowing in-depth analysis of the imaging mechanism of microsphere microscopy.

4. Conclusion and outlook

A three-stage full-wave simulation architecture capable of analyzing the imaging properties of microspheres in microscopy was presented in this study. This simulation architecture consists of forward and backward propagation, which allows virtual images with subwavelength features to be visualized directly in the near field and the near-field and far-field characteristics to be considered simultaneously. The architecture precisely predicts the phenomena observed in the literature [16], which demonstrates its accuracy and robustness. Moreover, it provides reliable evidence of the sample-dependent imaging property of microspheres, which has not been revealed in any previous simulation. These observations help us to study the imaging mechanism of microsphere microscopy, which, as indicated by our simulation results, is composed of near-field light–matter interaction and far-field image formation.

Three numerical simulation methods, namely FDTD, NTFF, and ASPW, have been integrated into our simulation architecture. The key component is NTFF, which bridges the forward and backward propagation while mitigating the demand for computational resources and maintaining accuracy. It effectively expands our simulation region by about one order of magnitude, allowing accurate recovery of the angular spectrum with modest time consumption. In addition, this three-stage architecture provides a certain degree of flexibility in modeling, allowing adaptation to various scenarios such as wave propagation at oblique angles or various structures of the near-field lenses and samples. Although this simulation architecture was established primarily for addressing the simulation problem in microsphere microscopy, it can be adapted to other nanophotonic problems involving near-field light–matter interaction as well as subwavelength focusing (e.g., metalenses) [30].

Thus far, we have demonstrated the effect of microspheres in the proximity of samples. In practice, microspheres must be combined with other optical microscopes to accomplish super-resolution imaging. Our simulation architecture can serve as the interface between the microscopic and macroscopic domains, realizing multiscale optical system design, which is essential for optimizing system performance. Because the magnified virtual images generated by microspheres can be reconstructed in our simulation architecture, they can be directly imported into ray-tracing analysis software for system-level image analysis. We believe this advanced integration of simulation methods can help ameliorate certain problems, such as the limited FOV [8,9] and distortion [6,9] in microsphere microscopy, and foster its development in biomedical imaging, wafer inspection, and other potential applications.

Appendix I: Implementation of three-stage full-wave simulation architecture

FDTD and NTFF are commonly available in commercial and open-source simulation toolkits, which mitigates the difficulty of implementing these functions. However, ASPW using the FFT is unavailable in such toolkits. To integrate these three functions correctly and efficiently, some rules should be followed. We have cross-validated our simulation architecture on both Lumerical FDTD and Meep [31] with a self-implemented ASPW function. The sample Python code of the three-stage full-wave simulation architecture is provided in Code 1 (Ref. [32]). Meep, pyFFTW, and other prerequisite packages should be installed before running the simulation. The implementation details of each part are presented below.

The implementation of the FDTD simulation of the microsphere is depicted in Fig.  4. The microsphere and the boundaries are separated by 500 nm to avoid instability of the PML caused by the near-field evanescent wave. The dipole source is placed 50 nm below the microsphere to ensure that it is excited in air. The mesh grid has 20 points per wavelength. One drawback of using FDTD to simulate microspheres is that the excitation of whispering gallery modes [21] in the microsphere prolongs the simulation time considerably. It is difficult to satisfy the convergence criteria because the microsphere forms a good resonator with a high Q factor. Because we are only interested in the wave characteristics in the far field, the simulation can be terminated once the power emanating outward decreases to several orders of magnitude smaller than the total power, regardless of the convergence of the fields in the FDTD simulation region. Thus, the simulation time in this step can be greatly reduced.
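For orientation, a minimal Meep sketch of this forward-propagation stage is given below. The microsphere diameter (3 µm), its refractive index (1.47), the Gaussian source bandwidth, and the field-decay stop condition (which only approximates the power-based termination criterion described above) are illustrative assumptions rather than the parameters of Code 1 (Ref. [32]); Meep's built-in near-to-far-field feature is used to register the six boundary faces, and the exact monitor signatures may vary slightly between Meep versions.

```python
import meep as mp

# Lengths in microns (Meep's unit length chosen as 1 um).
wavelength = 0.405                  # 405-nm source
fcen = 1 / wavelength
d_sphere = 3.0                      # assumed microsphere diameter
margin = 0.5                        # ~500-nm gap between the sphere and the PML
dpml = 0.5
half = d_sphere / 2 + margin        # half-width of the physical region
cell = mp.Vector3(*(3 * [2 * (half + dpml)]))   # ~5 um in each direction

geometry = [mp.Sphere(radius=d_sphere / 2,
                      center=mp.Vector3(),
                      material=mp.Medium(index=1.47))]   # fused silica (assumed)

# x-polarized dipole placed 50 nm below the microsphere, excited in air.
sources = [mp.Source(mp.GaussianSource(fcen, fwidth=0.2 * fcen),
                     component=mp.Ex,
                     center=mp.Vector3(0, 0, -d_sphere / 2 - 0.05))]

sim = mp.Simulation(cell_size=cell,
                    boundary_layers=[mp.PML(dpml)],
                    geometry=geometry,
                    sources=sources,
                    resolution=50)   # ~20 grid points per wavelength

# Register the six boundary faces as near-field monitors; Meep's near-to-far-field
# feature implements the equivalence principle of Eq. (1).
faces = []
for axis in range(3):
    for sign in (+1, -1):
        center = [0.0, 0.0, 0.0]
        size = [2 * half] * 3
        center[axis] = sign * half
        size[axis] = 0
        faces.append(mp.Near2FarRegion(center=mp.Vector3(*center),
                                       size=mp.Vector3(*size),
                                       weight=sign))
n2f = sim.add_near2far(fcen, 0, 1, *faces)

# Approximate the power-based stop criterion with a field-decay condition at a probe
# point above the sphere (an assumption, not the exact criterion described above).
sim.run(until_after_sources=mp.stop_when_fields_decayed(
    10, mp.Ex, mp.Vector3(0, 0, d_sphere / 2 + 0.25), 1e-4))

# Project the recorded fields onto a 40-um-wide plane 7 um above the object (dipole)
# plane, sampled every 200 nm (resolution = 5 points per um).
far = sim.get_farfields(n2f, resolution=5,
                        center=mp.Vector3(0, 0, 7.0 - d_sphere / 2 - 0.05),
                        size=mp.Vector3(40, 40, 0))
```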

The wave propagating outward was recorded on the six boundaries of the simulation region. Before being converted to equivalent surface currents, the time sequence of the wave was first transformed to the frequency domain using the discrete Fourier transform, because NTFF operates in the frequency domain. The width of the projection plane in NTFF should be chosen so that all waves propagating upward are collected, thereby avoiding the windowing effect. The discrete nature of numerical methods should also be addressed carefully. Because the field on the projection plane is recorded on a square grid, its angular spectrum is replicated in the spatial frequency domain. The angular spectrum can be correctly recovered only if it is band-limited and the sampling rate is sufficiently high. Only the propagating wave is present in the far field, and therefore the angular spectrum must lie within a circle of radius $k = 2\pi /\lambda $ in the spatial frequency domain. To allow appropriate recovery of the angular spectrum, the sampling rate must comply with Eq. (7). The aliasing effect caused by an insufficient sampling rate is depicted in Fig.  11. In our simulation, the sampling interval was 200 nm per point, which corresponds to a sampling rate slightly higher than twice the Nyquist frequency. Moreover, the projection plane should be at least several wavelengths away from the boundaries to avoid artificial high-spatial-frequency components caused by the discretization of the surface currents on the boundaries, as shown in Fig.  12. In practice, we placed the projection plane five wavelengths away from the top boundary of the FDTD simulation region.
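As a concrete check of Eq. (7) with the values quoted above (405-nm wavelength in air, 200-nm sampling interval), a short sketch follows; the assertion is only an illustrative guard.

```python
wavelength = 405e-9   # wavelength in the background medium (air) at the projection plane
dx = 200e-9           # sampling interval on the projection plane (200 nm per point)

f_N = 1 / wavelength  # highest spatial frequency of propagating waves (Nyquist frequency)
f_s = 1 / dx          # spatial sampling rate on the projection plane

# Eq. (7): f_s >= 2 f_N; here 5.0e6 >= ~4.94e6 cycles/m, so the criterion is satisfied.
assert f_s >= 2 * f_N, "projection-plane sampling too coarse: aliasing (cf. Fig. 11)"
```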


Fig. 11. (a) Normal field distribution. (b) Aliasing effect due to insufficient sampling rate.


Fig. 12. Angular spectra on different projection planes. (a) Five wavelengths away from the top boundary. (b) One wavelength away from the top boundary.


Before performing ASPW to back-propagate the wave after NTFF, we first up-sample the field on the projection plane to obtain images with finer resolution, which is beneficial for evaluating the magnification and the FWHM. We used bivariate spline approximation to interpolate the data and subsequently applied a low-pass filter to eliminate the high-spatial-frequency components introduced by the interpolation. The effect of the low-pass filter is illustrated in Fig.  13. The up-sampled images have a resolution of 10 nm per pixel, which ensures that the error of the magnification predicted by our model is within ±0.05. After the up-sampling and low-pass filtering, we followed the ASPW process flow in Fig.  3 for each of the six components of the electromagnetic wave individually. The intensity at the image plane contributed by each dipole is superposed incoherently. The plane of focus is determined by the FWHM of the spots. In practice, we use pyFFTW, an open-source Python interface to the FFTW library, to perform the ASPW calculations for backward propagation.
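A minimal SciPy/NumPy sketch of this up-sampling step is given below; the choice of low-pass cutoff (the propagating-wave circle) and the array layout are illustrative assumptions, and numpy.fft stands in for the pyFFTW routines of Code 1 (Ref. [32]).

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

def upsample_component(F, dx_coarse, dx_fine=10e-9, wavelength=405e-9):
    """Up-sample one complex field component from the NTFF grid (dx_coarse per point)
    to a fine grid (dx_fine per pixel), then low-pass filter to remove the artificial
    high-spatial-frequency content introduced by the interpolation (cf. Fig. 13)."""
    ny, nx = F.shape
    y = np.arange(ny) * dx_coarse
    x = np.arange(nx) * dx_coarse
    yf = np.arange(0, y[-1], dx_fine)
    xf = np.arange(0, x[-1], dx_fine)

    # Bivariate spline interpolation, applied to the real and imaginary parts separately.
    interp = lambda G: RectBivariateSpline(y, x, G)(yf, xf)
    F_fine = interp(F.real) + 1j * interp(F.imag)

    # Circular low-pass filter: keep only the spatial frequencies of propagating waves.
    k = 2 * np.pi / wavelength
    kx = 2 * np.pi * np.fft.fftfreq(F_fine.shape[1], d=dx_fine)
    ky = 2 * np.pi * np.fft.fftfreq(F_fine.shape[0], d=dx_fine)
    KX, KY = np.meshgrid(kx, ky)
    A = np.fft.fft2(F_fine) * (KX ** 2 + KY ** 2 <= k ** 2)
    return np.fft.ifft2(A)
```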


Fig. 13. Field intensity on the projection plane. (a) Before low-pass filter. (b) After low-pass filter.


Appendix II: Super-resolution capability of microsphere microscopy

Instead of superposing incoherent dipoles to generate images under incoherent illumination, injecting a plane-wave source in FDTD is an efficient approach to simulating microsphere microscopy under highly coherent illumination. In this way, the image of an object can be visualized with a one-shot simulation. In the following, we present the super-resolution capability of microsphere microscopy by setting up a simulation with coherent illumination. The microsphere used in the simulation is identical to that in Fig.  4. A gold line grating is placed 50 nm below the microsphere; the period is 150 nm and the spacing is 75 nm. A plane-wave source (405 nm) is injected from the top of the FDTD simulation region. The light reflected by the grating and imaged by the microsphere is projected onto a 30-µm-wide projection plane located 7 µm away from the grating. The schematic is displayed in Fig.  14(a), and the resulting image is shown in Fig.  14(b). The line pattern in Fig.  14(b) is distinguishable, which indicates that the microsphere can resolve objects with features as fine as 75 nm. The width of a line pair in Fig.  14(b) is around 440 nm, corresponding to a magnification of 2.9 provided by the microsphere. Furthermore, pincushion distortion can be observed, which is consistent with the experimental results in [6,9]. This result confirms the effectiveness of our simulation architecture for simulating the super-resolution capability of microsphere microscopy.


Fig. 14. (a) Schematic of simulation of microsphere microscopy under coherent illumination. (b) The image of gold line grating with 75-nm spacing.


Funding

Ministry of Science and Technology, Taiwan (108-2221-E-002-159-).

Disclosures

The authors declare no conflicts of interest.

References

1. B. S. Luk’yanchuk, “Laser cleaning of solid surface: optical resonance and near-field effects,” in Proceedings of SPIE (SPIE, 2004), Vol. 4065, pp. 576–587.

2. Z. Chen, A. Taflove, and V. Backman, “Photonic nanojet enhancement of backscattering of light by nanoparticles: a potential novel visible-light ultramicroscopy technique,” Opt. Express 12(7), 1214 (2004). [CrossRef]  

3. X. Li, Z. Chen, A. Taflove, and V. Backman, “Optical analysis of nanoparticles via enhanced backscattering facilitated by 3-D photonic nanojets,” Opt. Express 13(2), 526 (2005). [CrossRef]  

4. A. Heifetz, K. Huang, A. V. Sahakian, X. Li, A. Taflove, and V. Backman, “Experimental confirmation of backscattering enhancement induced by a photonic jet,” Appl. Phys. Lett. 89(22), 221118 (2006). [CrossRef]  

5. B. S. Luk’yanchuk, R. Paniagua-Domínguez, I. Minin, O. Minin, and Z. Wang, “Refractive index less than two: photonic nanojets yesterday, today and tomorrow [Invited],” Opt. Mater. Express 7(6), 1820 (2017). [CrossRef]  

6. J. Y. Lee, B. H. Hong, W. Y. Kim, S. K. Min, Y. Kim, M. V. Jouravlev, R. Bose, K. S. Kim, I. C. Hwang, L. J. Kaufman, C. W. Wong, P. Kim, and K. S. Kim, “Near-field focusing and magnification through self-assembled nanoscale spherical lenses,” Nature 460(7254), 498–501 (2009). [CrossRef]  

7. X. Hao, C. Kuang, X. Liu, H. Zhang, and Y. Li, “Microsphere based microscope with optical super-resolution capability,” Appl. Phys. Lett. 99(20), 203102 (2011). [CrossRef]  

8. Z. Wang, W. Guo, L. Li, B. Luk’yanchuk, A. Khan, Z. Liu, Z. Chen, and M. Hong, “Optical virtual imaging at 50 nm lateral resolution with a white-light nanoscope,” Nat. Commun. 2(1), 218 (2011). [CrossRef]  

9. A. Darafsheh, G. F. Walsh, L. Dal Negro, and V. N. Astratov, “Optical super-resolution by high-index liquid-immersed microspheres,” Appl. Phys. Lett. 101(14), 141128 (2012). [CrossRef]  

10. H. Guo, Y. Han, X. Weng, Y. Zhao, G. Sui, Y. Wang, and S. Zhuang, “Near-field focusing of the dielectric microsphere with wavelength scale radius,” Opt. Express 21(2), 2434 (2013). [CrossRef]  

11. Y. Yan, L. Li, C. Feng, W. Guo, S. Lee, and M. Hong, “Microsphere-Coupled Scanning Laser Confocal Nanoscope for Sub-Diffraction-Limited Imaging at 25 nm Lateral Resolution in the Visible Spectrum,” ACS Nano 8(2), 1809–1816 (2014). [CrossRef]  

12. H. S. S. Lai, F. Wang, Y. Li, B. Jia, L. Liu, and W. J. Li, “Super-resolution real imaging in microsphere-assisted microscopy,” PLoS One 11(10), e0165194 (2016). [CrossRef]  

13. H. Zhu, W. Fan, S. Zhou, M. Chen, and L. Wu, “Polymer Colloidal Sphere-Based Hybrid Solid Immersion Lens for Optical Super-resolution Imaging,” ACS Nano 10(10), 9755–9761 (2016). [CrossRef]  

14. W. Fan, B. Yan, Z. Wang, and L. Wu, “Three-dimensional all-dielectric metamaterial solid immersion lens for subwavelength imaging at visible frequencies,” Sci. Adv. 2(8), e1600901 (2016). [CrossRef]  

15. F. Wang, L. Liu, H. Yu, Y. Wen, P. Yu, Z. Liu, Y. Wang, and W. J. Li, “Scanning superlens microscopy for non-invasive large field-of-view visible light nanoscale imaging,” Nat. Commun. 7(1), 13748 (2016). [CrossRef]  

16. M. Duocastella, F. Tantussi, A. Haddadpour, R. P. Zaccaria, A. Jacassi, G. Veronis, A. Diaspro, and F. DeAngelis, “Combination of scanning probe technology with photonic nanojets,” Sci. Rep. 7(1), 3474 (2017). [CrossRef]  

17. G. Huszka, H. Yang, and M. A. M. Gijs, “Microsphere-based super-resolution scanning optical microscope,” Opt. Express 25(13), 15079 (2017). [CrossRef]  

18. G. Huszka and M. A. M. Gijs, “Turning a normal microscope into a super-resolution instrument using a scanning microlens array,” Sci. Rep. 8(1), 601 (2018). [CrossRef]  

19. V. M. Sundaram and S.-B. Wen, “Analysis of deep sub-micron resolution in microsphere based imaging,” Appl. Phys. Lett. 105(20), 204102 (2014). [CrossRef]  

20. A. Darafsheh, “Influence of the background medium on imaging performance of microsphere-assisted super-resolution microscopy,” Opt. Lett. 42(4), 735 (2017). [CrossRef]  

21. Y. Duan, G. Barbastathis, and B. Zhang, “Classical imaging theory of a microlens with super-resolution,” Opt. Lett. 38(16), 2988 (2013). [CrossRef]  

22. A. V. Maslov and V. N. Astratov, “Imaging of sub-wavelength structures radiating coherently near microspheres,” Appl. Phys. Lett. 108(5), 051104 (2016). [CrossRef]  

23. A. V. Maslov and V. N. Astratov, “Optical nanoscopy with contact Mie-particles: Resolution analysis,” Appl. Phys. Lett. 110(26), 261107 (2017). [CrossRef]  

24. A. V. Maslov and V. N. Astratov, “Resolution and Reciprocity in Microspherical Nanoscopy: Point-Spread Function Versus Photonic Nanojets,” Phys. Rev. Appl. 11(6), 064004 (2019). [CrossRef]  

25. T. X. Hoang, Y. Duan, X. Chen, and G. Barbastathis, “Focusing and imaging in microsphere-based microscopy,” Opt. Express 23(9), 12337 (2015). [CrossRef]  

26. A. Taflove, A. Oskooi, and S. G. Johnson, Advances in FDTD Computational Electrodynamics: Photonics and Nanotechnology (Artech house, 2013).

27. J. B. Schneider, “Understanding the finite-difference time-domain method,” School of Electrical Engineering and Computer Science, Washington State University, http://www.Eecs.Wsu.Edu/~schneidj/ufdtd/ (2010).

28. J. W. Goodman, Introduction to Fourier Optics (Roberts and Company Publishers, 2005).

29. A. Carbajal-Domínguez, J. B. Arroyo, J. E. Gomez Correa, and G. M. Niconoff, “Numerical calculation of near field scalar diffraction using angular spectrum of plane waves theory and FFT,” Rev. Mex. física E 56, 159–164 (2010).

30. W. T. Chen, A. Y. Zhu, M. Khorasaninejad, Z. Shi, V. Sanjeev, and F. Capasso, “Immersion Meta-Lenses at Visible Wavelengths for Nanoscale Imaging,” Nano Lett. 17(5), 3188–3194 (2017). [CrossRef]  

31. A. F. Oskooi, D. Roundy, M. Ibanescu, P. Bermel, J. D. Joannopoulos, and S. G. Johnson, “Meep: A flexible free-software package for electromagnetic simulations by the FDTD method,” Comput. Phys. Commun. 181(3), 687–702 (2010). [CrossRef]  

32. L. Y. Yu, Z. R. Cyue, and G. D. Su, “Python script for simulation of microsphere microscopy,” figshare (2020) [retrieved 25 Jan 2020], https://doi.org/10.6084/m9.figshare.11827107

Supplementary Material (1)

Code 1: Sample files for three-stage full-wave simulation of microsphere microscopy.
