
Dynamic three-dimensional tracking of single fluorescent nanoparticles deep inside living tissue

Open Access

Abstract

Three-dimensional (3D) spatial information can be encoded in two-dimensional images of fluorescent nanoparticles by astigmatic imaging. We combined this method with light sheet microscopy for high contrast single particle imaging up to 200 µm deep within living tissue and real-time image analysis to determine 3D particle localizations with nanometer precision and millisecond temporal resolution. Axial information was instantly directed to the sample stage to keep a moving particle within the focal plane in an active feedback loop. We demonstrated 3D tracking of nanoparticles at an unprecedented depth throughout large cell nuclei over several thousand frames and a range of more than 10 µm in each spatial dimension, while simultaneously acquiring optically sectioned wide field images. We conclude that this 3D particle tracking technique employing light sheet microscopy presents a valuable extension to the nanoscopy toolbox.

©2012 Optical Society of America

1. Introduction

Single molecule observation has become a powerful tool for the analysis of biological processes on nanometer length and millisecond time scales. Tracking of single molecules or particles can reveal details of molecular dynamics that remain hidden in the ensemble average. In vivo studies of single particle dynamics have unveiled fundamental characteristics of, e.g., mRNA trafficking [1, 2] or stages of virus-host cell interaction during HIV-1 infection [3]. Early publications on single particle tracking (SPT) focused on studying the motion of proteins confined to a two-dimensional (2D) membrane (see review by Saxton and Jacobson [4]), avoiding artifacts that result from observing three-dimensional (3D) trajectories projected onto the detection plane. Nevertheless, most biomolecules or biological particles move in an inhomogeneous, anisotropic 3D environment. In recent years a number of different approaches towards 3D SPT have been developed [1, 5–17], extending the observable spatial parameters from 2D projections to full volumes. The axial range remained mostly limited to 1–2 µm by the depth of field of the high numerical aperture (NA) objectives required for single particle detection. However, samples of interest such as small organisms, multi-cellular clusters, patches of living tissue and even single adherent cells – and thus the volume potentially explored by the particles under observation – extend far beyond this range. One way to overcome the depth restriction in 3D tracking is to focus the tracking procedure on a few or even a single particle of interest at a time and to adjust the imaging modalities to the observation of this specific particle in real time, in other words, to use a feedback loop to maintain overlap between particle position and focal volume. This has been achieved by controlling the position of either the detection objective [7, 18] or the sample itself [10], successfully increasing the axial extension of individual trajectories by one order of magnitude to 10–15 µm. Temporal resolution typically depends on the camera (CCD) frame transfer time for widefield methods or the piezo mirror speed in scanning systems; in the mentioned studies it ranged from 0.3 ms for a 750 nm x 750 nm field corresponding to a 5 x 5 pixel area in biplane imaging [18], to 5 ms with four point detectors in a tetrahedral scheme [10], and 32 ms with orbital point-scanning and simultaneous widefield imaging [7]. High-resolution images of the specimen are needed to relate motion patterns to structural elements in the vicinity. All of these techniques demonstrated 3D tracking in adherent cells at a depth of at best a few microns from the coverslip surface and were limited in penetration depth by their use of epi-illumination for fluorescence excitation, which reduced contrast and precluded efficient use of the photon budget.

2. A light sheet microscope for 3D tracking

Here, we combined astigmatic detection for 3D localization with a feedback-controlled sample stage and the superior single molecule detection capabilities of optically sectioned light sheet fluorescence microscopy (LSFM) [19, 20] to track individual nanoparticles up to 200 µm deep inside living tissue. Typically, a thin sheet of 2 µm thickness around the focal plane was illuminated, yielding a significantly improved contrast in comparison to epi-illumination [21]. An optical sectioning effect similar to that of a scanning confocal microscope was achieved by exciting almost exclusively those fluorophores that contributed to the in-focus image information, while maintaining the advantage of fast parallel data acquisition with a sensitive EMCCD camera.

Astigmatism was introduced by a weak cylindrical lens placed in front of the camera. This resulted in an elliptical shape of the point spread function (PSF) for out-of-focus particles, with the major axis of the ellipse switching from x to y when the particle moved from above to below the focal plane. Thus the axial symmetry of the PSF was broken [6]. As soon as an image frame had been shifted from the camera chip to PC memory, our algorithm searched for an intensity peak within a predefined region of interest and calculated its spread along x and y, wx,y. Comparing the results to calibration data acquired beforehand yielded an estimate of the axial position relative to the focal plane. A single calibration was valid throughout a large axial range since a water immersion objective was used to reduce aberrations [22]. The axial particle coordinate was converted to a voltage and fed back to a fast z-piezo stage before the next image frame was acquired. Thus, an active feedback loop was implemented, which displaced the sample such that the particle of interest remained in the vicinity of the focal plane.
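To make the control flow concrete, the following is a minimal C++ sketch of such a feedback iteration. It is a hypothetical illustration, not the instrument's actual code: the camera readout, the astigmatism-based localizer and the piezo output are passed in as placeholder callbacks, and voltsPerMicron is an assumed calibration constant.

```cpp
// Minimal sketch of the active feedback loop described above (hypothetical
// structure, not the original instrument code). The camera readout, the
// astigmatism-based localizer (Section 3) and the piezo output are supplied
// as callbacks; voltsPerMicron is an assumed calibration constant.
#include <cstdint>
#include <functional>
#include <optional>
#include <vector>

using Frame = std::vector<uint16_t>;          // one raw camera frame
struct Axial { double zRelMicron; };          // particle position relative to the focal plane

void runFeedbackLoop(
    const std::function<std::optional<Frame>()>& acquireFrame,          // camera readout
    const std::function<std::optional<Axial>(const Frame&)>& localize,  // fast astigmatic estimator
    const std::function<void(double)>& setPiezoVoltage,                 // DAQ analog output
    double voltsPerMicron,
    double startVoltage)
{
    double piezoVolts = startVoltage;
    while (auto frame = acquireFrame()) {                  // stop when acquisition ends
        if (auto z = localize(*frame)) {                   // skip frames without a valid signal
            piezoVolts += z->zRelMicron * voltsPerMicron;  // displace sample to re-center the particle
            setPiezoVoltage(piezoVolts);                   // issued before the next frame starts
        }
    }
}
```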

The instrument was built around the body of a commercial inverted widefield fluorescence microscope (Eclipse Ti-U, Nikon Instruments). For fluorescence excitation, three laser lines (488 nm, Sapphire-100 and 640 nm, Cube640-40C, Coherent; 532 nm, LasNova GLK 3250 T01, Lasos) were combined through a set of dichroic mirrors and coupled into a broadband single-mode optical fiber (kineFlex, Point Source) via an acousto-optical tunable filter (AOTF, TF525-250-6-3-GH18A, Gooch & Housego). The AOTF was externally controlled through custom-made electronics and triggered by the camera such that excitation light was only transmitted during the camera exposure time. The laser light further passed a previously described cylindrical zoom unit [23], which allowed us to match the light sheet dimensions to the lateral (x-y) size of the specimen, the depth of focus (z) and the image field of the detection optics. Optionally, a galvo mirror (GVS011, Thorlabs) could be employed to rotate the light sheet by up to ±7.5° around the center of the image plane at a frequency of 200 Hz – well above typical image acquisition rates – to overcome shadowing artifacts [24]. The excitation light sheet was formed in a custom-made cuvette [19] (Hellma) by a plan apochromat illumination objective (10x, NA 0.28, Plan Apo, Mitutoyo) as depicted in Fig. 1.


Fig. 1 (a) Schematic representation of the instrument. Illumination light was focused into the sample chamber by a 10x apochromat objective to form a thin light sheet in the focal plane (z0). The sample could be positioned coarsely by a 3-axes translation stage (X, Y, Z) and more accurately by an additional z piezo stage (Zp). Fluorescence was collected by a high NA 40x water-immersion objective, cleared up by notch and bandpass filters (NF, BF) and detected with a camera (EMCCD). A cylindrical lens (C) was used to shape the PSF for 3D localization. Fast image analysis was used to determine this information in real-time and feed it back to the z piezo stage in an active feedback loop to keep a particle near the focal plane. (b) Detailed view of the specimen within the sample chamber. Fluorescent particles located below (1) or above (3) the focal plane resulted in an elliptical PSF as observed on the camera chip (z0).


Focusing the excitation light with an air objective through the 2 mm glass wall (BK7) of the sample cuvette into aqueous solution introduced spherical aberration, which resulted in focal shifts when moving the sample chamber in the x-direction. These shifts could be compensated automatically by moving the objective with a motorized translation stage coupled to the x-axis of the sample stage. Chromatic aberrations in the illumination path caused by the refractive index mismatch between air, glass wall and aqueous medium were well below the Rayleigh limit for all light sheet configurations. The sample cuvette was placed in a custom-made holder, making it accessible, e.g., for a micro-injection device. It was magnetically attached to a motorized three-axis stage (3x M-112.12S, Physik Instrumente) equipped with an additional closed-loop piezo z-stage for fast and accurate positioning perpendicular to the image plane (P-611.ZS, Physik Instrumente).

Fluorescence was collected through the coverslip bottom of the sample cuvette (0.17 mm) by a long working distance water immersion apochromat objective (CFI Apo LWD Lambda S 40x, NA 1.15, wd 0.60 mm, Nikon). Scattered excitation light entering the detection pupil was rejected by placing appropriate notch filters (NF01-[488, 532 and 633]U-23.7-D, Semrock) and, if necessary, additional band pass emission filters into one of the filter turrets. For shaping the astigmatic detection point spread function (aPSF), a weak cylindrical lens (f = 10 m, SCX-50.8-5000.0-C, CVI Melles Griot) was placed between tube lens and camera. An additional 2.5x magnifier was used to achieve an object field pixel size of 160 nm. Fluorescence was captured by an EMCCD camera (iXon DU-897, Andor Technology).

Galvo mirror, motorized components and camera were controlled by the ImSpector 5.5 microscope control software (LaVision BioTec). The image analysis code for dynamic 3D tracking was implemented as a DLL, which could be executed by ImSpector after an image frame had been acquired. Briefly, ImSpector passed a number of parameters along with the PC memory address of the image data to the DLL. Initial coordinates as well as tracking parameters (e.g. calibration data) could be specified in an ASCII file. The DLL located the brightest pixel (x0, y0) in a region of interest around the previous particle localization, decided whether a valid signal was found based on its amplitude and spread within an 11x11 pixel area around (x0, y0) and – if so – determined a 3D localization for the particle. The axial component was converted to a voltage and issued to the piezo controller by ImSpector via the analog output of a DAQ card (ME-4660, Meilhaus) before the next image frame was acquired. The total execution time of the DLL was less than 50 µs per frame. Image data as well as tracking results and piezo voltages for each image frame were saved to physical memory. During post-processing, all particles in an image frame could be localized in 3D with nanometer precision relative to the piezo position during acquisition of the specific frame. Taking these positions into account at each time point allowed for the calculation of absolute axial coordinates. Localization precision was not significantly decreased by use of the piezo stage. The DLL was written in C++ and compiled using Microsoft Visual C++ 2010 Express (Microsoft). Post-processing particle detection and tracking, as well as data analysis and visualization, were performed using MATLAB (Mathworks) and Origin (OriginLab).
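As an illustration of the first step performed per frame, the following hypothetical C++ helper searches for the brightest pixel within a region of interest around the previous localization and applies a simple amplitude check; the names, the ROI handling and the threshold parameter are assumptions, not the original DLL code.

```cpp
// Hypothetical sketch of the per-frame search step: find the brightest pixel
// in a region of interest around the previous localization and apply a simple
// amplitude check (the spread check of Section 3 would follow).
#include <algorithm>
#include <cstdint>

struct Peak { int x = -1, y = -1; uint16_t amplitude = 0; bool valid = false; };

// image: row-major buffer of width*height pixels (as passed via a raw memory
// address); roiHalf: half-width of the search region; minAmplitude: empirical
// threshold separating particle signals from background.
Peak findBrightestPixel(const uint16_t* image, int width, int height,
                        int prevX, int prevY, int roiHalf, uint16_t minAmplitude)
{
    Peak p;
    const int x0 = std::max(0, prevX - roiHalf), x1 = std::min(width  - 1, prevX + roiHalf);
    const int y0 = std::max(0, prevY - roiHalf), y1 = std::min(height - 1, prevY + roiHalf);
    for (int j = y0; j <= y1; ++j)
        for (int i = x0; i <= x1; ++i) {
            const uint16_t v = image[j * width + i];
            if (v > p.amplitude) { p.amplitude = v; p.x = i; p.y = j; }
        }
    p.valid = (p.amplitude >= minAmplitude);
    return p;
}
```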

3. 3D particle localization

It should be pointed out that the dynamic tracking approach was optimized not for precision but for speed, since the only goal at this stage of the experiment was to keep a particle close to the focal plane. During post-processing of the data, more accurate localization algorithms were applied (see below). Within the DLL, a fast center-of-mass method was used to estimate the aPSF parameters. This approach required rigorous background rejection to obtain best results [25]. A locally homogeneous background intensity Ibg was assumed within the 11x11 pixel subimage and subtracted from the image data. To determine Ibg, the algorithm identified the pixel with the lowest signal count Imin within the subimage. A parameter kbg was chosen empirically such that as few pixels as possible of the subimage retained background information, while at the same time the signal shape was preserved. The background threshold was calculated as Ibg = Imin + kbg × Imin^(1/2). Pixels containing negative values after background subtraction were not considered in the following calculations. The centroid (xc, yc) and spread (wx, wy) around the brightest pixel (x0, y0) were determined as

(xc, yc) = Σ|i−x0|≤5; |j−y0|≤5 (I(i,j) − Ibg) × (i, j) / Σ|i−x0|≤5; |j−y0|≤5 (I(i,j) − Ibg),
(wx, wy)² = Σ|i−xc|≤5; |j−yc|≤5 (I(i,j) − Ibg) × (i, j)² / Σ|i−xc|≤5; |j−yc|≤5 (I(i,j) − Ibg).
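A hypothetical C++ implementation of this background-corrected center-of-mass step is sketched below. It assumes the 11x11 subimage lies fully inside the frame and evaluates the spread as the second central moment about the centroid (i.e. the raw second moment with the centroid contribution removed); it is an illustration, not the original DLL code.

```cpp
// Hypothetical implementation of the background-corrected center-of-mass
// step defined by the equations above; error handling is omitted for brevity.
#include <algorithm>
#include <cmath>
#include <cstdint>

struct Moments { double xc, yc, wx, wy; };

// image: row-major frame of width pixels per row; (x0, y0): brightest pixel;
// kBg: empirical background factor, Ibg = Imin + kBg * sqrt(Imin).
Moments centroidAndSpread(const uint16_t* image, int width, int x0, int y0, double kBg)
{
    // Background estimate from the darkest pixel of the 11x11 subimage.
    double iMin = 65535.0;
    for (int j = y0 - 5; j <= y0 + 5; ++j)
        for (int i = x0 - 5; i <= x0 + 5; ++i)
            iMin = std::min(iMin, static_cast<double>(image[j * width + i]));
    const double iBg = iMin + kBg * std::sqrt(iMin);

    // First moments of the background-corrected counts; pixels that become
    // negative after subtraction are ignored.
    double sum = 0.0, sx = 0.0, sy = 0.0;
    for (int j = y0 - 5; j <= y0 + 5; ++j)
        for (int i = x0 - 5; i <= x0 + 5; ++i) {
            const double v = image[j * width + i] - iBg;
            if (v <= 0.0) continue;
            sum += v; sx += v * i; sy += v * j;
        }
    const double xc = sx / sum, yc = sy / sum;

    // Second central moments around the centroid give the spread wx, wy.
    double sxx = 0.0, syy = 0.0;
    for (int j = y0 - 5; j <= y0 + 5; ++j)
        for (int i = x0 - 5; i <= x0 + 5; ++i) {
            const double v = image[j * width + i] - iBg;
            if (v <= 0.0) continue;
            sxx += v * (i - xc) * (i - xc);
            syy += v * (j - yc) * (j - yc);
        }
    return { xc, yc, std::sqrt(sxx / sum), std::sqrt(syy / sum) };
}
```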

The ellipticity ε(zrel) = wx(zrel)/wy(zrel) of the signal was used as a fast estimator for the axial position relative to the focal plane [16], since it could be approximated by a linear fit over > 0.6 µm around the focal plane, leaving the slope b as the single parameter characterizing the axial position, zrel = (ε − ε0)/b, if ε0 = 1 is assumed (Fig. 2(c)). One should note that the linear regime of this estimator is not centered around the focal plane, which led to systematic localization errors for zrel < −200 nm during live tracking. Alternatively, the axial position could be obtained from the difference wx − wy [26], which can be approximated linearly over a larger axial range. During post-processing, we fitted each aPSF contained in the image data by an elliptical Gaussian, I(x,y) = I0 × exp(−(x − xc)²/(2wx²) − (y − yc)²/(2wy²)) + Ibg, and determined zrel by minimizing the cost function d(wx, wy) = {[wx − wx(zrel)]² + [wy − wy(zrel)]²}^(1/2) with respect to zrel. This approach was more accurate, but could take 1 ms or longer per localization and was therefore not suited for fast real-time applications.
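Both estimators can be summarized in the following hypothetical C++ sketch: the linear ellipticity estimator used during live tracking, and a simple grid-search minimization of the cost function d over the calibration polynomials wx(z) and wy(z) for post-processing. The published post-processing was done in MATLAB; coefficients, range and step size here are assumptions.

```cpp
// Hypothetical sketch of the two axial estimators described in the text.
#include <array>
#include <cmath>
#include <limits>

// Evaluate a 3rd-order calibration polynomial w(z) = c0 + c1*z + c2*z^2 + c3*z^3.
double evalCalibration(const std::array<double, 4>& c, double z)
{
    return c[0] + z * (c[1] + z * (c[2] + z * c[3]));
}

// Fast estimator: zRel = (wx/wy - 1) / b, valid over roughly 0.6 µm around the focus.
double zFromEllipticity(double wx, double wy, double slopeB)
{
    return (wx / wy - 1.0) / slopeB;
}

// Post-processing estimator: minimize d(z) = sqrt([wx - wx(z)]^2 + [wy - wy(z)]^2)
// over the calibrated axial range.
double zFromCalibration(double wx, double wy,
                        const std::array<double, 4>& cx, const std::array<double, 4>& cy,
                        double zMin, double zMax, double step = 0.001 /* µm */)
{
    double bestZ = zMin, bestD = std::numeric_limits<double>::max();
    for (double z = zMin; z <= zMax; z += step) {
        const double d = std::hypot(wx - evalCalibration(cx, z), wy - evalCalibration(cy, z));
        if (d < bestD) { bestD = d; bestZ = z; }
    }
    return bestZ;
}
```

A brute-force grid search suffices here because the calibrated range spans only about a micrometer and the polynomials are smooth; a bracketing or Newton step could replace it in time-critical code.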


Fig. 2 (a) x-y cross sections of the astigmatic PSF at different axial positions. Scale bar 1 µm. (b) The spread of the astigmatic PSF, wx (squares) and wy (circles), was determined by least squares fitting of an elliptical Gaussian and the results were fit by 3rd order polynomials (solid lines), yielding unambiguous information on the axial localization over an axial range > 1 µm. (c) For a smaller range (> 0.6 µm), a linear approximation of the ellipticity wx / wy represented a fast estimator for axial localization, which required only the slope as a parameter, since by definition the intercept ε(z = 0) = 1.


During live tracking the algorithm was not able to distinguish between overlapping trajectories, since it simply followed the brightest spot in a region of interest. Therefore, low particle densities were chosen; otherwise, particle misassignments would have to be resolved during post-processing, as in any other tracking experiment. Overlapping PSFs were rejected by setting an upper limit for wx,y.

4. Characterization of localization precision and 3D tracking capability

To acquire a model of the aPSF and analyze the 3D localization capability of our instrument, we embedded 200 nm diameter fluorescent beads (TetraSpeck microspheres, Invitrogen) in 2% agarose within the sample cuvette. A calibration data set was obtained by moving the sample cuvette through the focal plane in steps of 50 nm and acquiring an image at each position. Beads (n = 17) were localized in an image stack covering 82 µm x 82 µm x 100 µm and aligned in z by defining the axial position of equal signal spread, wx(z0) = wy(z0), as the focal plane z0. A calibration curve was determined from the z-dependence of the signal spread wx,y(zrel) (Fig. 2).
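For illustration, a hypothetical C++ helper for the calibration fit is given below: it determines the coefficients of a 3rd-order polynomial w(z) by least squares via the 4x4 normal equations, assuming equal-length vectors of aligned z positions and measured widths. The actual calibration analysis was performed in MATLAB; this is only a sketch of the fitting step.

```cpp
// Hypothetical helper: least-squares coefficients of a 3rd-order polynomial
// w(z) through aligned (z, w) calibration data, solved from the 4x4 normal
// equations by Gaussian elimination with partial pivoting.
#include <array>
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

std::array<double, 4> fitCalibrationPoly(const std::vector<double>& z,
                                         const std::vector<double>& w)
{
    // Accumulate sums of z^0..z^6 and of w*z^0..w*z^3 for the normal equations.
    double zPow[7] = {0.0};
    double rhs[4] = {0.0};
    for (std::size_t n = 0; n < z.size(); ++n) {
        double p = 1.0;
        for (int k = 0; k < 7; ++k) {
            zPow[k] += p;
            if (k < 4) rhs[k] += w[n] * p;
            p *= z[n];
        }
    }
    // Augmented matrix A|b with A(k,l) = sum z^(k+l), b(k) = sum w*z^k.
    double A[4][5];
    for (int k = 0; k < 4; ++k) {
        for (int l = 0; l < 4; ++l) A[k][l] = zPow[k + l];
        A[k][4] = rhs[k];
    }
    // Gaussian elimination with partial pivoting.
    for (int col = 0; col < 4; ++col) {
        int pivot = col;
        for (int r = col + 1; r < 4; ++r)
            if (std::fabs(A[r][col]) > std::fabs(A[pivot][col])) pivot = r;
        std::swap(A[col], A[pivot]);
        for (int r = col + 1; r < 4; ++r) {
            const double f = A[r][col] / A[col][col];
            for (int c = col; c < 5; ++c) A[r][c] -= f * A[col][c];
        }
    }
    // Back substitution yields the coefficients c0..c3.
    std::array<double, 4> coeff{};
    for (int k = 3; k >= 0; --k) {
        double s = A[k][4];
        for (int l = k + 1; l < 4; ++l) s -= A[k][l] * coeff[l];
        coeff[k] = s / A[k][k];
    }
    return coeff;
}
```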

The experimental localization precision of the method was determined by moving the same sample through the focal plane, this time acquiring 100 frames per step and localizing a single bead in each of these frames. The standard deviation of the localizations for each step corresponded to the localization precision, including any instrument or sample motion as well as inaccuracies of the Gaussian PSF model. Results are shown in Figs. 3(a) and 3(b) and yielded an average localization precision of σx = 4.5 nm, σy = 4.4 nm and σz = 19 nm at a distance of 80 nm above the focal plane and approximately 2500 detected photons per frame. The lateral localization precision agreed with theoretical expectations [27]. At a larger distance from the focal plane the increased ellipticity of the aPSF yields a non-isotropic theoretical localization precision in x and y [28, 29]. Since the goal of our work was to keep a particle of interest in focus at all times, this drawback of astigmatic PSF shaping was not of major importance for our experiments. Typically, particles were kept within ±100 nm around the focal plane. In fact, in contrast to theoretical predictions we did not observe an asymmetry in the localization precision within ±200 nm around the focal plane. This probably indicates that the localization precision at high signal levels was limited not by the signal-to-noise ratio but rather by the stability of the setup.


Fig. 3 (a) An immobilized fluorescent bead was moved stepwise through the focal plane to determine the experimental localization precision. The bead position was determined in 100 frames per step (full black line); the mean position is marked in light gray. (b) Histograms of all localizations relative to the mean value in one plane were fit by a Gaussian function (gray), with the standard deviation σ quantifying the localization precision: σx = 4.5 nm, σy = 4.4 nm and σz = 19 nm. (c) A 100 nm nanoparticle diluted in PBS was dynamically tracked for almost 13 s (Supplementary Media 1). The x (light gray), y (dark gray) and z (black) coordinates of the bead were used to reconstruct the 3D trajectory (d, black line) with about 800 positions spanning 10 µm x 10 µm x 60 µm. 2D projections are shown to illustrate the 3D coordinates of the trajectory.


In order to prove the ability to automatically track the motion of mobile particles, we examined 100 nm diameter fluorescent beads (TetraSpeck microspheres, Invitrogen) in phosphate buffered saline (PBS). These beads typically sank to the bottom of the sample chamber. Calibration parameters obtained from the previous experiment were used for dynamic tracking at 16 ms temporal resolution. An exemplary bead trajectory is presented in Figs. 3(c) and 3(d). We were able to follow the particle for over 800 frames, covering an axial range of 60 µm within 12.8 s (Supplementary Media 1). The trajectory was terminated when the piezo stage reached its displacement limit.

5. Intranuclear 3D tracking

To further analyze the capabilities of the instrument, we dynamically tracked individual nanoparticles more than 100 µm and up to 200 µm deep inside living tissue. Salivary glands were dissected from larvae of the dipteran Chironomus tentans. They represent a living multicellular complex with an extension of about 2000 µm x 700 µm x 250 µm. Individual salivary gland cells were 200–300 µm in size and had nuclei with a diameter of 50–80 µm that contained four polytene chromosomes. Aside from the polytene chromosomes, the nucleoplasm of these nuclei was devoid of chromatin. Previously, SPT experiments on protein particles containing messenger RNA molecules (mRNPs) as well as further test molecules within these nuclei revealed mobility patterns that were more complex than pure Brownian motion [2, 30]. Due to their size, the salivary gland cell nuclei presented an ideal system to demonstrate dynamic 3D tracking in an extended biological specimen. We microinjected fluorescent beads with a diameter of 20 nm (Crimson FluoSpheres, Invitrogen) into the nuclei at femtomolar final concentration. Alexa546-stained nuclear transport factor 2 (NTF2) was co-injected and served as a marker for the nuclear envelope [2]. An optically sectioned z-stack of the nucleus covering 100 µm was acquired with 532 nm excitation at 30 frames per second by increasing the piezo voltage stepwise in intervals of 0.1 V, corresponding to a 100 nm sample displacement per step. Subsequently, fluorescent beads were dynamically tracked throughout the same nucleus using 640 nm excitation at a frame rate of 20 Hz. Light sheet illumination suppressed most of the strong background fluorescence typically observed at large sample depths. EM gain was deactivated during these experiments.

Figure 4 displays representative results from these experiments. Trajectories reconstructed from real-time tracking data are displayed in distinct colors in a 3D rendered scene (Supplementary Media 2) and were aligned with the nuclear envelope signal acquired beforehand according to the piezo voltage of each frame. A significantly higher number of particle positions could be extracted from such data sets during the post-processing phase. Individual trajectories observed during live tracking comprised up to 4500 steps (Fig. 4(b) and Supplementary Media 3) and covered more than (10 µm)³. The high number of positions within each trajectory made it possible to obtain single particle tracking data such as the jump distance (JD) distribution or mean square displacement (MSD) plots on the basis of individual particles rather than ensembles of single particles, which might represent inhomogeneous populations with respect to either particle or local environmental properties. Analysis of the MSD data for the trajectory shown in Fig. 4(b) yielded data consistent with Brownian motion with a diffusion coefficient D = 0.18 ± 0.02 µm²/s, which was confirmed by JD analysis (D = 0.15 ± 0.01 µm²/s). This analysis was conducted for a set of five real-time trajectories from the same nucleus, each of them containing at least 1000 positions, with results of the MSD analysis ranging from Dmin = 0.18 µm²/s to Dmax = 0.25 µm²/s and a mean diffusion coefficient <D> = 0.21 ± 0.03 µm²/s (Fig. 4(c)). The variability of the diffusion coefficients determined for individual nanoparticles could be due to either different hydrodynamic radii of the particles or variations in the local viscosity of the intranuclear medium.
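The single-trajectory MSD analysis can be sketched as follows (hypothetical C++ helpers; the published analysis used MATLAB and Origin): the 3D MSD is averaged over all position pairs separated by a given lag, and D follows from a linear fit of MSD = 6 D t through the origin over the first few lag times.

```cpp
// Hypothetical helpers for a single-trajectory 3D MSD analysis.
#include <cstddef>
#include <vector>

struct Point3D { double x, y, z; };   // positions in µm

std::vector<double> computeMSD3D(const std::vector<Point3D>& r, std::size_t maxLag)
{
    std::vector<double> msd;
    for (std::size_t lag = 1; lag <= maxLag && lag < r.size(); ++lag) {
        double acc = 0.0;
        const std::size_t pairs = r.size() - lag;
        for (std::size_t i = 0; i < pairs; ++i) {
            const double dx = r[i + lag].x - r[i].x;
            const double dy = r[i + lag].y - r[i].y;
            const double dz = r[i + lag].z - r[i].z;
            acc += dx * dx + dy * dy + dz * dz;   // squared 3D displacement
        }
        msd.push_back(acc / pairs);               // time-averaged MSD at this lag
    }
    return msd;
}

// Least-squares slope through the origin over the first nFit lags; for free
// 3D diffusion MSD = 6*D*t, hence D = slope / 6. dt is the frame time in s.
double estimateDiffusionCoefficient(const std::vector<double>& msd, double dt, std::size_t nFit)
{
    double num = 0.0, den = 0.0;
    for (std::size_t n = 1; n <= nFit && n <= msd.size(); ++n) {
        const double t = n * dt;
        num += t * msd[n - 1];
        den += t * t;
    }
    return num / (6.0 * den);
}
```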


Fig. 4 (a) 3D rendering of the nuclear envelope of a C. tentans salivary gland cell nucleus (green) and real-time trajectories of 20 nm fluorescent beads. Each trajectory was shown in a different color (Supplementary Media 2). (b) Individual trajectories span more than 4500 frames and (10 µm)3 (Supplementary Media 3). View point rotated and x-y-axes inverted with respect to (a) for better visibility. (c) 3D mean square displacement analysis of 5 trajectories with more than 1000 positions. (d) Jump distance and mean square displacement (inset) analysis for the trajectory displayed in (b), yielding an unimodal JD histogram and similar MSD plots in x, y and z.


Whereas the signal level was similar to that in the experiments presented in Fig. 3, the background noise increased by a factor of three to 9–10 photons per pixel (standard deviation). The theoretical lateral localization precision for particles in the focal plane was σxy = 7.0–9.3 nm at 2500–3800 detected photons, including the additional uncertainty introduced by particle motion [27]. Scaling the axial localization precision accordingly yields an estimate of σz ≈ 40 nm. An upper limit is set by the distribution of axial localizations relative to the focal plane with a standard deviation σz,rel = 95 nm. The difference between both values might be explained by the effect of particle motion on the PSF width measurements, which additionally hampers axial localization.

6. Conclusions

A major problem in standard 2D particle tracking experiments is that the observed particles rapidly leave the focal plane due to their stochastic movement. The average length of such trajectories can be estimated from the diffusion coefficient and the depth of focus. In our experiments, nanoparticles with diffusion coefficient <D> would have traversed the depth of focus of the detection objective, DOF ≈ 1 µm, with a probability of 99.7% (3σ) within Δt = DOF²/(3² × 2 × D) = 0.8 s, corresponding to 16 frames at a frame rate of 20 Hz. Hence, only short trajectories could be acquired using conventional imaging. In this case, dynamic tracking with a feedback loop increased the axial observation range and the trajectory duration for individual nanoparticles in a cellular environment by more than two orders of magnitude.

Dynamic 3D tracking techniques present a valuable extension to the nanoscopy toolbox, enabling researchers to observe trafficking of fluorescently labeled biomolecules [1, 13], viruses [7] or vesicles [14] in 3D space and in a meaningful environmental context, no longer limited to small volumes close to the coverslip surface or to 2D projections of a particle’s complex spatiotemporal dynamics. Furthermore, the high number of particle positions in a single trajectory achieved by using a feedback loop allows for trafficking analysis based on complex motion models [31]. Using astigmatic detection and real-time image analysis makes feedback-based dynamic particle tracking a very convenient and inexpensive extension to existing widefield instruments capable of single particle imaging. Apart from the PSF-shaping element, no further technical changes need to be made. The detection efficiency as well as the time resolution of the system remained essentially unaffected, and neither image transformation nor alignment was needed in our setup. Images of continuous structures appeared slightly blurred due to the increased in-focus PSF diameter (σast ≈ 1.2 σ0), but were still clearly visible and preserved in position and shape (see e.g. the nuclear envelope in Supplementary Media 3). Combining this approach with light sheet microscopy greatly increased the accessible sample depth. With standard widefield illumination, overall background fluctuations would be significantly enhanced, the single particle signal-to-noise ratio reduced, and the intracellular reference structures blurred by background fluorescence. We anticipate the technique presented here to be especially fruitful for observing and characterizing trafficking phenomena and directed motion in extended 3D specimens such as cell spheroids, small organisms or even full-sized embryos of model organisms. Naturally, the setup could also be used for 3D super resolution imaging in thick specimens [32].

We have successfully tested the technique with larger image fields and higher frame rates using a scientific CMOS camera. For frame rates beyond 100 Hz, the piezo step response and system latency might affect the performance of the technique [18], but this could be circumvented using a real-time operating system or FPGA-based signal analysis and feedback control [33]. While we focused on tracking individual particles in this manuscript, the technique could readily be extended to tracking more than one target by switching between multiple sample planes, as long as particle velocity and mutual distance do not exceed the switching capabilities of the mechanical parts of the system.

Supplementary Information

Supplementary Media 1

Time series of a 100 nm fluorescent bead dynamically tracked in PBS at 16 ms per frame over 800 frames. The cyan square highlights the image area in which the algorithm searched for the particle; it is magnified in the inset. The white square in the inset shows the image area from which centroid and spread of the intensity distribution were calculated. The frame number within the experiment is given in the top right corner; the color bar at the right hand side indicates the actual z displacement of the piezo stage relative to its maximum travel range (−50 to +50 µm). Image field 20 µm x 20 µm, displayed at 33 Hz.

Supplementary Media 2

Animated 3D rendering of Fig. 4(a), containing the trajectory, which was followed in Supplementary Media 3 (yellow), further live localizations from the same experiment (distinct colors) and the nuclear envelope (green).

Supplementary Media 3

Time series of a 20 nm fluorescent bead dynamically tracked within the nucleus of a C. tentans salivary gland cell. Raw image data of the tracking experiment (red) were overlaid with the corresponding image frame from a z-stack showing the nuclear envelope (green), which had been acquired beforehand. The nuclear envelope image was chosen according to the piezo voltage corresponding to each frame of the fluorescent particle images. The trajectory displayed in this image sequence was followed for 4540 frames or 227 seconds at 20 Hz and over an axial range of more than 10 µm. Other beads were visible in the same image sequence and were evaluated during post-processing. In the center of the nucleus, a negative image of a polytene chromosome is clearly visible around frame #17550. Image field: 82 µm x 82 µm, displayed at 80 Hz.

Acknowledgments

We thank LaVision BioTec for providing the interface connecting the tracking DLL to ImSpector and Dr. J. Ritter for technical support in setting up the instrument. Funding was provided by the German Research Foundation (grant Ku 2474/7-1), the SFB624 “Templates – Functional Chemical Matrices” and the German Ministry for Economics and Technology (ZIM KF2297601AK9). JHS and TK were supported by a fellowship from the German National Academic Foundation.

References and links

1. M. A. Thompson, J. M. Casolari, M. Badieirostami, P. O. Brown, and W. E. Moerner, “Three-dimensional tracking of single mRNA particles in Saccharomyces cerevisiae using a double-helix point spread function,” Proc. Natl. Acad. Sci. U.S.A. 107(42), 17864–17871 (2010). [CrossRef]   [PubMed]  

2. J. P. Siebrasse, R. Veith, A. Dobay, H. Leonhardt, B. Daneholt, and U. Kubitscheck, “Discontinuous movement of mRNP particles in nucleoplasmic regions devoid of chromatin,” Proc. Natl. Acad. Sci. U.S.A. 105(51), 20291–20296 (2008). [CrossRef]   [PubMed]  

3. N. Arhel, A. Genovesio, K.-A. Kim, S. Miko, E. Perret, J.-C. Olivo-Marin, S. Shorte, and P. Charneau, “Quantitative four-dimensional tracking of cytoplasmic and nuclear HIV-1 complexes,” Nat. Methods 3(10), 817–824 (2006). [CrossRef]   [PubMed]  

4. M. J. Saxton and K. Jacobson, “Single-particle tracking: applications to membrane dynamics,” Annu. Rev. Biophys. Biomol. Struct. 26(1), 373–399 (1997). [CrossRef]   [PubMed]  

5. M. A. Thompson, M. D. Lew, M. Badieirostami, and W. E. Moerner, “Localizing and tracking single nanoscale emitters in three dimensions with high spatiotemporal resolution using a double-helix point spread function,” Nano Lett. 10(1), 211–218 (2010). [CrossRef]   [PubMed]  

6. H. P. Kao and A. S. Verkman, “Tracking of single fluorescent particles in three dimensions: use of cylindrical optics to encode particle position,” Biophys. J. 67(3), 1291–1300 (1994). [CrossRef]   [PubMed]  

7. Y. Katayama, O. Burkacky, M. Meyer, C. Bräuchle, E. Gratton, and D. C. Lamb, “Real-time nanomicroscopy via three-dimensional single-particle tracking,” ChemPhysChem 10(14), 2458–2464 (2009). [CrossRef]   [PubMed]  

8. G. A. Lessard, P. M. Goodwin, and J. H. Werner, “Three-dimensional tracking of individual quantum dots,” Appl. Phys. Lett. 91(22), 224106 (2007). [CrossRef]  

9. K. McHale, A. J. Berglund, and H. Mabuchi, “Quantum dot photon statistics measured by three-dimensional particle tracking,” Nano Lett. 7(11), 3535–3539 (2007). [CrossRef]   [PubMed]  

10. N. P. Wells, G. A. Lessard, P. M. Goodwin, M. E. Phipps, P. J. Cutler, D. S. Lidke, B. S. Wilson, and J. H. Werner, “Time-resolved three-dimensional molecular tracking in live cells,” Nano Lett. 10(11), 4732–4737 (2010). [CrossRef]   [PubMed]  

11. M. D. McMahon, A. J. Berglund, P. Carmichael, J. J. McClelland, and J. A. Liddle, “3D particle trajectories observed by orthogonal tracking microscopy,” ACS Nano 3(3), 609–614 (2009). [CrossRef]   [PubMed]  

12. E. Toprak, H. Balci, B. H. Blehm, and P. R. Selvin, “Three-dimensional particle tracking via bifocal imaging,” Nano Lett. 7(7), 2043–2045 (2007). [CrossRef]   [PubMed]  

13. S. Ram, P. Prabhat, J. Chao, E. S. Ward, and R. J. Ober, “High accuracy 3D quantum dot tracking with multifocal plane microscopy for the study of fast intracellular dynamics in live cells,” Biophys. J. 95(12), 6025–6043 (2008). [CrossRef]   [PubMed]  

14. Y. Sun, J. D. McKenna, J. M. Murray, E. M. Ostap, and Y. E. Goldman, “Parallax: high accuracy three-dimensional single molecule tracking using split images,” Nano Lett. 9(7), 2676–2682 (2009). [CrossRef]   [PubMed]  

15. G. Bolognesi, S. Bianchi, and R. Di Leonardo, “Digital holographic tracking of microprobes for multipoint viscosity measurements,” Opt. Express 19(20), 19245–19254 (2011). [CrossRef]   [PubMed]  

16. L. Holtzer, T. Meckel, and T. Schmidt, “Nanometric three-dimensional tracking of individual quantum dots in cells,” Appl. Phys. Lett. 90(5), 053902 (2007). [CrossRef]  

17. D. Ernst, S. Hain, and J. Köhler, “Setup for single-particle orbit tracking: artifacts and corrections,” J. Opt. Soc. Am. A 29(7), 1277–1287 (2012). [CrossRef]   [PubMed]  

18. M. F. Juette and J. Bewersdorf, “Three-dimensional tracking of single fluorescent particles with submillisecond temporal resolution,” Nano Lett. 10(11), 4657–4663 (2010). [CrossRef]   [PubMed]  

19. J. G. Ritter, R. Veith, A. Veenendaal, J. P. Siebrasse, and U. Kubitscheck, “Light sheet microscopy for single molecule tracking in living tissue,” PLoS ONE 5(7), e11639 (2010). [CrossRef]   [PubMed]  

20. J. P. Siebrasse, T. Kaminski, and U. Kubitscheck, “Nuclear export of single native mRNA molecules observed by light sheet fluorescence microscopy,” Proc. Natl. Acad. Sci. U.S.A. 109(24), 9426–9431 (2012). [CrossRef]   [PubMed]  

21. J. G. Ritter, R. Veith, J.-P. Siebrasse, and U. Kubitscheck, “High-contrast single-particle tracking by selective focal plane illumination microscopy,” Opt. Express 16(10), 7142–7152 (2008). [CrossRef]   [PubMed]  

22. Y. Deng and J. W. Shaevitz, “Effect of aberration on height calibration in three-dimensional localization-based microscopy and particle tracking,” Appl. Opt. 48(10), 1886–1890 (2009). [CrossRef]   [PubMed]  

23. J. G. Ritter, J.-H. Spille, T. Kaminski, and U. Kubitscheck, “A cylindrical zoom lens unit for adjustable optical sectioning in light sheet microscopy,” Biomed. Opt. Express 2(1), 185–193 (2011). [CrossRef]   [PubMed]  

24. J. Huisken and D. Y. Stainier, “Even fluorescence excitation by multidirectional selective plane illumination microscopy (mSPIM),” Opt. Lett. 32(17), 2608–2610 (2007). [CrossRef]   [PubMed]  

25. M. K. Cheezum, W. F. Walker, and W. H. Guilford, “Quantitative comparison of algorithms for tracking single fluorescent particles,” Biophys. J. 81(4), 2378–2388 (2001). [CrossRef]   [PubMed]  

26. I. Izeddin, M. El Beheiry, J. Andilla, D. Ciepielewski, X. Darzacq, and M. Dahan, “PSF shaping using adaptive optics for three-dimensional single-molecule super-resolution imaging and tracking,” Opt. Express 20(5), 4957–4967 (2012). [CrossRef]   [PubMed]  

27. H. Deschout, K. Neyts, and K. Braeckmans, “The influence of movement on the localization precision of sub-resolution particles in fluorescence microscopy,” J Biophotonics 5(1), 97–109 (2012). [CrossRef]   [PubMed]  

28. M. J. Mlodzianoski, M. F. Juette, G. L. Beane, and J. Bewersdorf, “Experimental characterization of 3D localization techniques for particle-tracking and super-resolution microscopy,” Opt. Express 17(10), 8264–8277 (2009). [CrossRef]   [PubMed]  

29. C. von Middendorff, A. Egner, C. Geisler, S. W. Hell, and A. Schönle, “Isotropic 3D nanoscopy based on single emitter switching,” Opt. Express 16(25), 20774–20788 (2008). [CrossRef]   [PubMed]  

30. R. Veith, T. Sorkalla, E. Baumgart, J. Anzt, H. Häberlein, S. Tyagi, J. P. Siebrasse, and U. Kubitscheck, “Balbiani ring mRNPs diffuse through and bind to clusters of large intranuclear molecular structures,” Biophys. J. 99(8), 2676–2685 (2010). [CrossRef]   [PubMed]  

31. D. Ernst, M. Hellmann, J. Köhler, and M. Weiss, “Fractional Brownian motion in crowded fluids,” Soft Matter 8(18), 4886–4889 (2012). [CrossRef]  

32. F. Cella Zanacchi, Z. Lavagnino, M. Perrone Donnorso, A. Del Bue, L. Furia, M. Faretta, and A. Diaspro, “Live-cell 3D super-resolution imaging in thick biological samples,” Nat. Methods 8(12), 1047–1049 (2011). [CrossRef]   [PubMed]  

33. A. P. Fields and A. E. Cohen, “Electrokinetic trapping at the one nanometer limit,” Proc. Natl. Acad. Sci. U.S.A. 108(22), 8937–8942 (2011). [CrossRef]   [PubMed]  

Supplementary Material (3)

Media 1: MPG (3538 KB)     
Media 2: MPG (5842 KB)     
Media 3: MPG (4618 KB)     



