
4D holographic microscopy of zebrafish larvae microcirculation

Open Access

Abstract

An original technique that combines digital holography, dual illumination of the sample, and 3D reconstruction by a cleaning algorithm is proposed. It uses a standard transmission microscopy setup coupled with digital holographic detection. The technique is 4D, since it determines, at each time step, the 3D locations (x,y,z) of the many moving objects that scatter the dual illumination beams. The technique has been validated by imaging the blood microcirculation in a fish larva sample (the moving objects are thus red blood cells, RBCs). Videos showing in 4D the moving RBCs superimposed with the perfused blood vessels are obtained.

© 2016 Optical Society of America

1. Introduction

Blood flow imaging techniques are widely used in bio-medical studies, since they can assess physiological processes or can be used for early detection of disease [1, 2, 3]. However, many blood flow studies require, for imaging purposes, the use of a contrast agent, making blood flow characterization invasive [4, 5]. Scanning Doppler imaging techniques can be considered to alleviate this issue, but due to the scanning step, acquisition of an image is a time-consuming process [6]. By analyzing the spatial statistics of the dynamic speckle with the Laser Speckle Contrast Analysis/Imaging (LASCA/LSCI) technique [7], the blood flow can be imaged [8, 9]. Improvement of the acquired contrast image has been achieved through exposure-time optimization [10] or intensity-fluctuation analysis [11], resulting in high-quality perfusion images. However, these techniques are limited to perfusion monitoring in 2D and are unable to reconstruct the structure of the blood flow in 3D.

Proposed by Gabor [12] and improved by Leith and Upatnieks [13], holography records the interference of the field scattered by an object with a known reference field. It is then possible to extract, from this interference pattern, the scattered field that reaches the holographic detector. This is particularly easy in digital holography, which uses a camera detector. Holography is intrinsically a 3D technique, since the Maxwell equations can be used to back-propagate the field from the camera to any point of the 3D space [14]. This makes it possible to localize and track a point-like object that scatters light, since its position coincides with the maximum of intensity of the reconstructed field [15]. The localization accuracy along the z axis depends strongly on the experimental conditions, in particular on the detection numerical aperture. Nanometric accuracy can be reached in holographic microscopy with a high numerical aperture (NA ≈ 1) objective [16, 17].

Digital holography has been used to image and qualitatively analyze blood flows in 2D [18, 19]. 3D holographic imaging of blood flow is challenging for a number of reasons. The biologically relevant size of the microcirculation makes it impossible to use a high-NA objective, since the field of view would be too small. Blood cells are much larger than the wavelength, and their refractive index is close to that of the plasma. The light is therefore scattered within a small angle in the forward direction. In transmission geometry, this corresponds to a small NA and to a low z accuracy. In reflection geometry, this leads to a low signal, which competes with the light scattered by the surrounding living tissues, whose refractive index is not homogeneous in time and space.

In this paper, we have imaged in 3D the moving red blood cells (RBCs) of a living fish larva. We used an original technique that combines digital holography, illumination of the sample and reconstruction along several axes [20], and calculations that involve both standard holographic propagation and 3D reconstruction by a cleaning algorithm [21, 22, 23]. More specifically, the experiment was performed with a holographic microscopy setup working in transmission. To increase the angular diversity of the light scattered by the sample, the illumination was made by two beams that were angularly tilted with respect to the optical axis. At each time point, the difference of successive images was calculated, yielding 2-frame holograms that contain information on the light scattered by the moving objects (i.e. the RBCs) at that time. Two holograms, which describe the field components of each illumination beam, were then extracted from the data. These two holograms were used to calculate the two 3D maps of the scattered fields corresponding to each illumination. Since these two maps describe the scattering by the same RBCs, they must exhibit correlation maxima at locations (x,y,z) that correspond to RBCs. We thus used these correlation maxima to calculate the 3D map of the RBC locations. We then obtained 3D videos of the moving RBCs and, by averaging, 3D images of the perfused blood vessels.

2. Experimental setup

Our holographic setup is similar to that of reference [19] (see Fig. 1). The setup uses an upright microscope (Olympus® CX41) that has been modified to perform heterodyne holography [24] in transmission. The main laser (HL6545MG: 60 mW @ λ = 660 nm, optical frequency ωI) is split into two arms (illumination and reference) by a beam splitter BS. Two acousto-optic modulators (AOM) at ω1,2/2π ≃ 80 MHz control the frequency ωLO of the reference (i.e. local oscillator) field ELO:

ωLO = ωI + ω1 − ω2      (1)


Fig. 1 Heterodyne digital holographic microscopy experimental arrangement. AOM1, AOM2: acousto-optic modulators (Bragg cells) that shift the frequency of the local oscillator beam; MO: microscope objective; M: mirror; BS: angularly tilted cube beam splitter. E and ELO: signal and reference complex fields.


The illumination arm (field E) is split into two branches so as to illuminate the sample in two directions, whose relative angle is 28.3°. The sample is a zebrafish larva (150 hours old) mounted between a slide and a cover slip. It is imaged with a microscope objective MO (NA = 0.30, ×10). To achieve off-axis holography, the beam splitter that recombines the signal (E) and reference (ELO) fields is angularly tilted (θ ≠ 0). The camera (Mikrotron Eosens CL: 1280 × 1024 pixels, ωCAM/2π = 220 Hz, 10 bits, 14 μm square pitch) records the signal versus reference interference pattern Im = |E + ELO|2, where m = 0…M (with M = 128 to 1024) is the frame or time index. The recorded data are cropped into a 1024 × 1024 calculation grid to perform the holographic calculations, which involve discrete Fourier transforms (FFTs).

Zebrafish (Danio rerio, wild type AB line) were maintained according to standard protocols [25]. Larvae were mounted on a standard glass slide under a 22 mm round coverslip (thickness no. 1.5). A 0.5 mm thick rubber ring served as a spacer and sealant. The larvae were embedded in a drop of cooling 1.5% low-melting-point agarose at 37°C and oriented before gelation. The chamber was filled with 100 mg/l tricaine in filtered tank water, and imaging was performed at room temperature. Blood vessel nomenclature follows [26].

3. Data filtering and reconstruction

3.1. 4-phase reconstruction of the MO pupil image and selection of the signal corresponding to each illumination direction

We first performed a test experiment, in which the field scattered by the sample is detected by 4-phase heterodyne holography. The detection is then made at the illumination frequency, and the holographic signal corresponds to the field scattered by immobile or almost immobile scatterers of the sample (ground glass or zebrafish larva). The resulting signal is strong and allows for precise optical adjustment of our holographic device. This test experiment is also useful to illustrate the reconstruction procedure.

This procedure is similar to the one used in previous works [27, 28] (for details see [29]), but with small differences due to the double illumination and to the need to select the signals corresponding to each illumination direction. In the test experiment, we tuned the LO frequency ωLO to perform 4-phase detection, and we considered 4-phase holograms H4φ. We thus have:

ωLO = ωI + ωCAM/4
H4φ = [I0 − I2] + j [I1 − I3]      (2)
where I0, I1, I2, I3 are consecutive camera frames, and j2 = −1. To select the +1 holographic grating order and to compensate for the MO pupil phase curvature, we performed the holographic reconstruction of the MO pupil by the method of Schnars et al. [14], yielding H̃4φ(kx,ky) with:
H̃4φ(kx,ky) = FFT[ H4φ(x,y) e^{jk(x²+y²)/2d} ]      (3)
where FFT is the discrete Fourier transform operator, and k is the modulus of the wave vector: k = 2π/λ. Here, e^{jk(x²+y²)/2d} is the quadratic phase factor that determines the reconstruction plane. It depends on the curvature of the reference wavefront and on the distance from the camera to the reconstruction plane. Note that a phase factor, quadratic in kx and ky, that affects H̃4φ is missing in Eq. (3) (with respect to the Schnars et al. work [14]). This missing phase does not affect the intensity images |H̃4φ(kx,ky)|².
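
For clarity, the demodulation and pupil-plane reconstruction of Eqs. (2) and (3) can be written as a few lines of numerical code. The sketch below (Python/NumPy) is a minimal illustration under stated assumptions, not the code used for the paper; the frame arrays, the distance d, the pixel pitch and the wavelength are assumed to be provided by the user.

import numpy as np

def four_phase_hologram(I0, I1, I2, I3):
    # Eq. (2): complex 4-phase hologram from four consecutive camera frames
    return (I0.astype(float) - I2) + 1j * (I1.astype(float) - I3)

def reconstruct_pupil(H, d, pitch, wavelength):
    # Eq. (3): Schnars-type reconstruction of the MO pupil plane.
    # H is an N x N complex hologram sampled with the given pixel pitch;
    # d is chosen so that the MO pupil is in focus in the reconstructed plane.
    N = H.shape[0]
    k = 2.0 * np.pi / wavelength
    x = (np.arange(N) - N // 2) * pitch
    X, Y = np.meshgrid(x, x)
    quad_phase = np.exp(1j * k * (X**2 + Y**2) / (2.0 * d))
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(H * quad_phase)))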

Figures 2(a) and 2(b) show the reconstructed images of the pupil. The display is made in arbitrary logarithmic scale for the intensity |H̃4φ|². Images are obtained with a ground glass (Fig. 2(a)) and with a living zebrafish sample (Fig. 2(b)). Figure 2(a) exhibits a circular zone whose brightness is high and homogeneous. This zone corresponds to the image of the MO pupil that is back-illuminated through the ground glass. The brightness is homogeneous because the ground glass scatters light over angles wider than the MO collection angle. Because of the off-axis configuration (angular tilt θ of BS), the pupil image is located in the upper right side of the calculation grid, but it is displayed in the center of Fig. 2(a). Note that the diffusers do not move in the ground-glass experiment. As a result, the scattered field E does not vary with time, and the 4-phase demodulation of Eq. (2) selects the +1 grating order term, since we have H4φ = ELO* E. The −1 and 0 grating order images, which are zero, are not visible in Fig. 2(a).


Fig. 2 (a, b) Zoom of the upper right-hand corner of the intensity images |H̃4φ(kx,ky)|² obtained with 4-phase holograms by reconstructing the MO pupil plane. (c) Fourier-space hologram obtained after translation of the selected zone to the center of the calculation grid. The purple and green zones correspond to the holograms related to each illumination direction. The sample is a ground glass (a) or a zebrafish sample (b,c).


To obtain the images of Fig. 2, the reconstruction parameter d (used to calculate the phase factor e^{jk(x²+y²)/2d} of Eq. (3)) was adjusted to perform the reconstruction in the MO pupil plane. As can be seen, the pupil image exhibits sharp edges (white dashed circle). The useful holographic information was then selected by cropping the pupil zone and by filling the remainder of the calculation grid with zeros. This corresponds to the Cuche et al. spatial filtering of the +1 grating order term ELO* E [30]. To compensate for the off-axis angular tilt, the cropped zone was translated to the center of the calculation grid.

Figure 2(b) was obtained by imaging a zebrafish sample. Because the sample is quite transparent, the two illumination beams yield two bright spots, which are well separated in the Fourier space (see arrows 1 and 2 in Fig. 2(b)). Around the spots, two blurred brighter regions correspond to the light scattered by the sample. Since these brighter regions are well separated, we extracted, from the holographic data H̃4φ, two holograms H̃purple and H̃green that describe the field scattered from each illumination direction. This was done by cropping, within H̃4φ, two circular zones of radius 240 pixels centered on each spot (dotted white circle in Fig. 2(c)), yielding the two holograms H̃purple and H̃green (see the purple and green zones in Fig. 2(c)). Note that all the scattered light is captured by either H̃purple or H̃green: the selection of the illumination direction was thus done without loss of information.
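
The Fourier-space selection described above (crop of the +1 order, recentering, and separation of the two illumination contributions) can be sketched as follows. This is an illustrative NumPy fragment: the 240-pixel radius is taken from the text, while the spot-center pixel coordinates and the placeholder data are hypothetical assumptions of the sketch.

import numpy as np

def select_zone(H_fourier, center, radius):
    # Keep a circular zone of a Fourier-space hologram, zero the rest, and
    # translate the zone to the grid center (compensates the off-axis tilt,
    # in the spirit of the Cuche et al. spatial filtering [30]).
    N = H_fourier.shape[0]
    cy, cx = center
    yy, xx = np.mgrid[0:N, 0:N]
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
    out = np.where(mask, H_fourier, 0)
    return np.roll(out, (N // 2 - cy, N // 2 - cx), axis=(0, 1))

# Hypothetical example with placeholder data and spot positions (pixels):
H_tilde = np.zeros((1024, 1024), dtype=complex)
H_purple_f = select_zone(H_tilde, center=(310, 700), radius=240)
H_green_f = select_zone(H_tilde, center=(310, 460), radius=240)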

3.2. Holographic reconstruction of the field scattered by the circulating RBCs

In order to select the signal from the moving RBCs, we considered 2-phase holograms H2φ recorded without frequency shift of the LO beam:

ωLO = ωI
H2φ = I0 − I1      (4)

The signal resulting from an object that is not moving is not taken into account in H2φ. The images that are reconstructed from H2φ then correspond to the moving components of the sample, i.e. to the RBCs. H̃2φ is calculated similarly to H̃4φ by:

H̃2φ(kx,ky) = FFT[ e^{jk(x²+y²)/2d} H2φ(x,y) ]      (5)

Because H2φ(x,y) is real, H̃2φ exhibits both the +1 and −1 order images. Nevertheless, because of the off-axis configuration, these two images do not overlap. The +1 image is thus selected by cropping the pupil zone (whose exact location was determined previously by the ground-glass experiment) and by translating this zone to the center of the calculation grid. In this way, the useful holographic information is selected and the off-axis angular tilt is compensated. To separate the signals of the two illuminations, we performed, as in section 3.1, two crops of radius 240 pixels centered on the two illumination spots of Fig. 2(c). We obtained the two Fourier-space holograms H̃purple(kx,ky) and H̃green(kx,ky), from which we calculated the two holograms Hpurple(x,y) and Hgreen(x,y):

Hpurple(x,y) = FFT⁻¹[ H̃purple(kx,ky) ]
Hgreen(x,y) = FFT⁻¹[ H̃green(kx,ky) ]      (6)
where FFT⁻¹ is the inverse Fourier transform operator. The holograms Hpurple(x,y,z) and Hgreen(x,y,z), reconstructed in planes z, are then calculated by the angular spectrum method [31]. This method requires two Fourier transforms when starting from the real-space holograms Hpurple(x,y) and Hgreen(x,y), but only one when starting from the Fourier-space holograms H̃purple(kx,ky) and H̃green(kx,ky). We have:
Hpurple(x,y,z) = FFT⁻¹[ e^{j(kx²+ky²)z/2k} H̃purple(kx,ky) ]
Hgreen(x,y,z) = FFT⁻¹[ e^{j(kx²+ky²)z/2k} H̃green(kx,ky) ]      (7)
Here, x, y and kx, ky are discrete coordinates whose pitches Δx and Δk obey N Δx Δk = 2π, where N = 1024 is the size of the calculation grid.
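
As an illustration, the angular spectrum propagation of Eq. (7) reduces, for each plane z, to one multiplication and one inverse FFT. The following NumPy sketch assumes a Fourier-space hologram already centered on the calculation grid (as after the selection step above); it is a minimal paraxial implementation, not the GPU code used for the paper.

import numpy as np

def propagate(H_fourier, z, wavelength, dx):
    # Eq. (7): multiply the (centered) Fourier-space hologram by the quadratic
    # kernel exp[j (kx^2 + ky^2) z / (2k)] and take one inverse FFT to obtain
    # the reconstructed field in plane z.
    N = H_fourier.shape[0]
    k = 2.0 * np.pi / wavelength
    dk = 2.0 * np.pi / (N * dx)
    kxy = (np.arange(N) - N // 2) * dk
    KX, KY = np.meshgrid(kxy, kxy)
    kernel = np.exp(1j * (KX**2 + KY**2) * z / (2.0 * k))
    return np.fft.ifft2(np.fft.ifftshift(kernel * H_fourier))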

Up to now, we have considered that both acquisition and reconstruction were made in the image half-space (near the camera). The pitches are then Δx = Dpix and Δk = 2π/(NDpix) (where Dpix = 14 μm is the size of the camera pixels). x, y and z are the coordinates of the image of the sample (that is conjugated with the sample by MO). This point of view, which is most often adopted in the literature, is not very convenient, since we want to reconstruct the scattered fields with coordinates related to the sample.

A better point of view [29] is to consider that both acquisition and reconstruction are made in the object half-space (i.e. near the sample). H2φ(x,y) is then the hologram recorded in the plane of the object half-space that is conjugated with the camera by MO. All the mathematical transformations made to calculate Hpurple(x,y) and Hgreen(x,y) yield holograms whose phases are properly corrected (i.e. the phase of the hologram is the same as the phase of the field). Equation (7) then performs the reconstruction with orthogonal coordinates x, y and z that correspond to the real coordinates near the sample. The pitches are Δx = Dpix/G and Δk = 2πG/(N Dpix), where G is the imaging gain from the plane z = 0 to the camera. Note that G is generally not equal to the objective nominal magnification (×10 here), since it depends on the exact location of the camera with respect to MO. It is therefore best to measure G by a calibration procedure.

In analyzing our data, we adopted this second point of view. We measured G by imaging a USAF target. This calibration yields G = 24.6, Δx = 14 μm/24.6 = 0.569 μm and Δk = 2π/(1024 Δx).
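
For reference, the object-space sampling used in the rest of the paper follows directly from the measured gain; a short numerical restatement of the values quoted above (illustrative only):

import numpy as np

D_pix = 14e-6          # camera pixel pitch (m)
G = 24.6               # magnification measured with the USAF target
N = 1024               # size of the calculation grid
dx = D_pix / G         # object-space pitch, ~0.569 um
dk = 2.0 * np.pi / (N * dx)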

4. Dual illumination reconstructed images and videos

We imaged a zebrafish sample by recording, without frequency shift of the LO beam (ωLO = ωI), a sequence of frames Im (with m = 1…129). From each pair of consecutive frames Im and Im+1, we calculated the 2-phase holograms H2φ,m = Im − Im+1. From each hologram H2φ,m, we calculated Hpurple,m(x,y,z) and Hgreen,m(x,y,z) by Eqs. (5)–(7). The video files Visualization 1, Visualization 2 and Visualization 3 show the reconstructed intensity images, whose components are |Hpurple,m(x,y,z)|² and |Hgreen,m(x,y,z)|², obtained for m = 1 to 128 and for z = 0, z = +26.7 and z = +53.5 μm, respectively.

The images with time index m = 77 of these video files are displayed in Figs. 3(a)–3(c). In Fig. 3(a), the reconstructed plane z = 0 corresponds to the location of the caudal vein and caudal artery. Indeed, the purple and green components of the caudal vein and caudal artery images are superimposed, yielding white patterns. In Fig. 3(b), with z = +26.7 μm, the large parallel longitudinal vessels (caudal vein and caudal artery) are out of the reconstruction plane, and the purple and green components are well separated. On the other hand, the dorsal capillary seen in the bottom left side of the image is within the reconstruction plane, and the purple and green images of the RBC pointed to by the arrow coincide (see zoom in Fig. 3(b)). In Fig. 3(c), with z = +53.5 μm, the purple and green components of the caudal vein, the caudal artery and the dorsal capillaries are all separated.


Fig. 3 Reconstructed intensity images of RBCs made with the 2-phase hologram H2φ,m (see Eq. (4)) with time index m = 77. The purple and green components Hpurple,m(x,y,z) and Hgreen,m(x,y,z) of the images were obtained by selecting the purple and the green zones of Fig. 2(c). Reconstruction was made with z = 0 (a), z = +26.7 (b) and z = +53.5 μm (c). The displayed images (a)–(c) correspond to image 77 of files Visualization 1, Visualization 2 and Visualization 3, respectively. Dorsal side left, anterior to the top. Bar is 100 μm.


To image the inner structure of the perfused blood vessels, which corresponds to the entire trajectories of the RBCs over the observation time, we calculated the averaged intensity maps 〈|Hpurple|2〉 and 〈|Hgreen|2〉:

〈|Hpurple(x,y,z)|²〉 = (1/128) Σm=0…127 |Hpurple,m(x,y,z)|²      (8)
〈|Hgreen(x,y,z)|²〉 = (1/128) Σm=0…127 |Hgreen,m(x,y,z)|²      (9)

The video file Visualization 4 shows the averaged intensity reconstructed images, whose components are 〈|Hpurple(x,y,z)|2〉 and 〈|Hgreen(x,y,z)|2〉. These images are obtained from z = −53.5 to z = +53.5 μm in 101 steps of 1.07 μm. Figure 4 shows the averaged intensity reconstructed images obtained for z = 0 (a), z = +26.7 (b) and z = +53.5 μm (c).
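
The sequence processing of Eqs. (4)–(9) can be summarized by a short loop. The sketch below is illustrative only and delegates the reconstruction of one 2-phase hologram to a user-supplied callable (for instance the select_zone and propagate helpers sketched above, which are hypothetical names of this illustration).

import numpy as np

def averaged_intensity(frames, reconstruct):
    # frames: list of camera frames I_m; reconstruct: callable that maps one
    # 2-phase hologram H_2phi,m = I_m - I_(m+1) to the complex field in a plane z.
    # Returns the time-averaged intensity of Eqs. (8)-(9).
    acc = 0.0
    M = len(frames) - 1
    for m in range(M):
        H2 = frames[m].astype(float) - frames[m + 1]
        acc = acc + np.abs(reconstruct(H2)) ** 2
    return acc / M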


Fig. 4 Averaged intensity reconstructed images made with 2-phase holograms H2φ,m = Im − Im+1 and with ω1 = ω2. The purple and green components 〈|Hpurple(x,y,z)|2〉 and 〈|Hgreen(x,y,z)|2〉 are calculated with Eqs. (8) and (9). Reconstruction was made with z = 0 (a), z = +26.7 (b) and z = +53.5 μm (c). The displayed images correspond to images 51 (a), 76 (b) and 101 (c) of file Visualization 4. Bar is 100 μm.


5. 3D reconstruction by a cleaning algorithm

The intensity maps |Hpurple,m(x,y,z)|2 and |Hgreen,m(x,y,z)|2 are not quantitative 3D representations of the scatterers (i.e. the RBCs), but sequences of 2D images of the field scattered by the RBCs in different z planes, as shown in Fig. 6(a). |Hpurple,m(x,y,z)|2 corresponds to the scattered purple beam, while |Hgreen,m(x,y,z)|2 corresponds to the scattered green beam. It is only by comparing the signals scattered from the purple and green beams that we determined the 3D location of each RBC, and thus obtained a 3D image of the sample (see Fig. 6(b)) at the time tm, where tm is the recording time of frame m.

To calculate the RBC locations at time step m, we considered that both holograms Hpurple and Hgreen result from a sum of K fields, scattered by K sources Spurple,k and Sgreen,k (with k = 1…K) that describe the scattering of the two beams by the RBC scatterer of index k. We thus consider here single scattering by K independent sources. The holograms Hpurple and Hgreen therefore result from two 3D maps of the sources Spurple and Sgreen that scatter the illumination field:

Spurple(x,y,z) = Σk=1…K Spurple,k(x,y,z)
Sgreen(x,y,z) = Σk=1…K Sgreen,k(x,y,z)      (10)

Note that we have considered here that sources Spurple and Sgreen are highly correlated, since they represent the same scatterers.

In the calculations, we considered a 3D calculation grid of 512×512×128 points in x, y and z, and K = 25000 RBC scatterers. This figure (K = 25000) corresponds to about 10% of the area of the 512×512 calculation grid. The pitch is 2×0.569 = 1.138 μm in x and y and 0.68 μm in z. Moreover, we considered RBC scatterers whose cross section is one pixel. Spurple,k and Sgreen,k are thus zero everywhere except at the location (xk,yk,zk) of the kth RBC. Let us now describe the calculation algorithm in more detail. This algorithm is illustrated by Fig. 5, and a schematic implementation is sketched after the numbered steps below.


Fig. 5 Scheme of the 3D reconstruction process by the cleaning algorithm. In (a) the product of the two 3D grids is calculated. This allows the selection, in (b), of the point of highest correlation (x1, y1, z1). In (c), the point is stored in Spurple(x,y,z) and Sgreen(x,y,z) and erased from Hpurple(x,y,z1) and Hgreen(x,y,z1). From this modified plane, the overall grid is recalculated in (d). In (e) a new maximum-correlation point (x2, y2, z2) is found. Again, in (f), new sources are stored in the associated source space and set to zero in the plane z = z2 of the 3D grid. A new cycle then starts, and the operation is repeated K times.


  1. At each time step m, we first calculate the coordinates (x1,y1,z1) of the first RBC. This is done by calculating Hpurple(x,y,z) and Hgreen(x,y,z) at the 512×512×128 points of the 3D grid: see Fig. 5(a). We then assume that the location (x1,y1,z1) of the first RBC corresponds to a coincidence of the holographic signals Hpurple and Hgreen. Since the phase of the scattered fields strongly depends on the illumination direction, we consider intensities. We thus calculate the intensity product |Hpurple|2 × |Hgreen|2 at all points of the 3D grid, and take the point (x1,y1,z1) where this product is maximum. We then consider the 2D holograms Hpurple(x,y,z1) and Hgreen(x,y,z1) in plane z1 (Fig. 5(b)), and we split them into two components (Fig. 5(c)):
    Hpurple(x,y,z1) = Hpurple,1(x,y,z1) + Spurple,1(x,y,z1)
    Hgreen(x,y,z1) = Hgreen,1(x,y,z1) + Sgreen,1(x,y,z1)      (11)

    The first 2D components Spurple,1 and Sgreen,1 describe the field scattered by the first RBC. In plane z1, Spurple,1 and Sgreen,1 are zero except at the point (x1,y1), where they are equal to Hpurple and Hgreen. On the other hand, Hpurple,1 and Hgreen,1 describe the fields scattered by the other RBCs. In plane z1, Hpurple,1 and Hgreen,1 are zero at the point (x1,y1), and equal to Hpurple and Hgreen at the other points.

  2. We then calculate the location (x2,y2,z2) of the second RBC by using Hpurple,1 and Hgreen,1, which are known in plane z1. This is done by calculating Hpurple,1 and Hgreen,1 at all points of the 3D calculation grid, and by assuming that the maximum of |Hpurple,1|2 × |Hgreen,1|2 is reached at the point (x2,y2,z2): see Fig. 5(d). The holograms Hpurple,1 and Hgreen,1 in plane z2 (Fig. 5(e)) are again split into two components (Fig. 5(f)):
    Hpurple,1(x,y,z2) = Hpurple,2(x,y,z2) + Spurple,2(x,y,z2)
    Hgreen,1(x,y,z2) = Hgreen,2(x,y,z2) + Sgreen,2(x,y,z2)      (12)

    Spurple,2 and Sgreen,2 correspond to the second RBC, while Hpurple,2 and Hgreen,2 correspond to the RBCs of index k = 3…K. The algorithm used here is a cleaning algorithm, because the field sources Spurple,1 and Sgreen,1 of the first RBC have been removed from the holographic data Hpurple,1 and Hgreen,1 that are used to calculate the sources Spurple,k≥2 and Sgreen,k≥2 of the other RBCs.

  3. The following steps are similar to steps one and two. We calculate the location (xk,yk,zk) of the kth RBC by using Hpurple,k−1 and Hgreen,k−1, which are known in plane zk−1. Hpurple,k−1 and Hgreen,k−1 are then calculated at all points. The maximum of |Hpurple,k−1|2 × |Hgreen,k−1|2 yields (xk,yk,zk). The holograms Hpurple,k−1 and Hgreen,k−1 in plane zk are split:
    Hpurple,k−1(x,y,zk) = Hpurple,k(x,y,zk) + Spurple,k(x,y,zk)
    Hgreen,k−1(x,y,zk) = Hgreen,k(x,y,zk) + Sgreen,k(x,y,zk)      (13)

    We thus obtain the sources Spurple,k and Sgreen,k and the holograms Hpurple,k and Hgreen,k needed to continue the calculation.

  4. We obtain Spurple,k and Sgreen,k for k = 1…K, i.e. for all RBCs. The sources Spurple,k and Sgreen,k are then summed by Eq. (10), yielding the 3D maps Spurple(x,y,z) and Sgreen(x,y,z).
  5. The calculation of Spurple and Sgreen described above at time tm is made for all 2-phase holograms H2φ,m of the sequence. The Spurple and Sgreen maps are thus calculated in 4D, with coordinates x, y, z, t = tm.
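
A compact, naive transcription of steps 1 to 4 for one time step is sketched below, only to make the bookkeeping explicit: the kernels implement Eq. (7), the source maps Sp and Sg implement Eq. (10), and the per-plane erasure implements Eqs. (11)-(13). The actual calculation was run on a GPU and is considerably more optimized; the names and the fftshift/propagation convention are assumptions of this sketch.

import numpy as np

def clean(Hp_f, Hg_f, z_list, wavelength, dx, K):
    # Hp_f, Hg_f: centered Fourier-space holograms of the purple and green beams.
    # At each iteration the voxel maximizing |Hp|^2 x |Hg|^2 over the 3D grid is
    # stored as a source and erased from both holograms in its z plane.
    N = Hp_f.shape[0]
    k0 = 2.0 * np.pi / wavelength
    dk = 2.0 * np.pi / (N * dx)
    kxy = (np.arange(N) - N // 2) * dk
    KX, KY = np.meshgrid(kxy, kxy)
    kernels = [np.exp(1j * (KX**2 + KY**2) * z / (2.0 * k0)) for z in z_list]

    Sp = np.zeros((len(z_list), N, N), dtype=complex)   # purple sources, Eq. (10)
    Sg = np.zeros_like(Sp)                              # green sources
    for _ in range(K):
        # propagate both holograms to every z plane (Eq. (7))
        Hp = np.array([np.fft.ifft2(np.fft.ifftshift(ker * Hp_f)) for ker in kernels])
        Hg = np.array([np.fft.ifft2(np.fft.ifftshift(ker * Hg_f)) for ker in kernels])
        prod = np.abs(Hp) ** 2 * np.abs(Hg) ** 2
        iz, iy, ix = np.unravel_index(np.argmax(prod), prod.shape)
        # store the sources of this scatterer (Eqs. (11)-(13)) ...
        Sp[iz, iy, ix] += Hp[iz, iy, ix]
        Sg[iz, iy, ix] += Hg[iz, iy, ix]
        # ... and erase them from the data in the selected plane
        Hp_plane = Hp[iz].copy(); Hp_plane[iy, ix] = 0.0
        Hg_plane = Hg[iz].copy(); Hg_plane[iy, ix] = 0.0
        # back-propagate the cleaned plane to the Fourier-space holograms
        Hp_f = np.fft.fftshift(np.fft.fft2(Hp_plane)) / kernels[iz]
        Hg_f = np.fft.fftshift(np.fft.fft2(Hg_plane)) / kernels[iz]
    return Sp, Sg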

The calculation described above is computationally heavy. To get the location of the kth RBC scatterer, we must find the location of the maximum of |Hpurple,k−1|2 × |Hgreen,k−1|2. This requires 2×128 = 256 discrete Fourier transforms (FFTs) of 512×512 points to get Hpurple,k−1 and Hgreen,k−1 in all the z planes of the 512×512×128 calculation grid, plus some work to calculate |Hpurple,k−1|2 × |Hgreen,k−1|2 at all points and to determine the location of the maximum. To get Spurple and Sgreen at each time step tm, one needs to locate K = 25000 one-pixel scatterers. This requires 256 × 2.5×10⁴ = 6.4×10⁶ FFTs, plus some other work. This calculation took about 80 min with an NVIDIA GTX TITAN Graphics Processing Unit (GPU). In the video files presented further on, we have considered sequences of m = 1…122 holograms. The GPU calculation thus took about 122 × 80 min, i.e. about 7 days.

6. The 3D map of the scatterers obtained by the cleaning algorithm

In order to illustrate how the dual illumination scheme and the cleaning algorithm quantitatively enhance the z-axis confinement of the 3D blood vessel structures, we display in Fig. 6 the XY, YZ and XZ cuts of the averaged reconstructed 3D maps obtained without the cleaning algorithm, i.e. 〈|Hpurple(x,y,z)|2〉 and 〈|Hgreen(x,y,z)|2〉, and with it, i.e. 〈|Spurple(x,y,z)|2〉 and 〈|Sgreen(x,y,z)|2〉.


Fig. 6 Results of the 3D reconstruction made without (a) and with (b) the cleaning algorithm. (a) Cuts of 〈|Hpurple(x,y,z)|2〉 and 〈|Hgreen(x,y,z)|2〉 along the planes XY (upper-right image, highlighted in blue), YZ (upper-left, red) and XZ (lower, yellow). (b) Cuts of 〈|Spurple(x,y,z)|2〉 and 〈|Sgreen(x,y,z)|2〉 along the planes XY, YZ and XZ. In (a) and (b) the cuts correspond to image 64 of the video files Visualization 5 and Visualization 6, respectively. The blue, red and yellow lines represent the relative positions of the planes XY, YZ and XZ.


Without cleaning (see Fig. 6(a)), the z confinement is poor. In the XZ and YZ cuts, the green and purple images of the blood vessels are angularly tilted, and the location of the vessels corresponds to the z location where the green and purple images coincide. With cleaning (see Fig. 6(b)), the z confinement is much better. Since the green and purple images coincide, they are displayed in white. The 3D maps obtained without and with cleaning are also displayed in the video files Visualization 5 and Visualization 6 (x, y and z are swept in 128 steps of pitch 8Δx and Δz). To image a rather large zone of the zebrafish blood vessel structure, the experiment was made with an NA = 0.3 microscope objective. The angle between the purple and green axes is thus lower than in [20] (see the XZ cut of Fig. 6(a)), and the z confinement is therefore lower too.

Note that the RBCs are much larger than one voxel but, to keep all the useful information, the cleaning algorithm was run with one-voxel scatterers. K = 25000 thus corresponds to the number of voxels considered in the cleaning algorithm. This figure is large enough to handle most of the scattered energy, and thus to image all the RBCs, whose number is much lower than K.

7. Visualization of the data calculated by the cleaning algorithm

The visualization of the 4D maps Spurple and Sgreen calculated by the cleaning algorithm is challenging because of the 4 coordinates. To perform the visualization, we projected the instantaneous intensities |Spurple|2 and |Sgreen|2 (which are 3D maps of real numbers) along the direction uθ = (cosθ, 1, sinθ), yielding, for each time step m and each direction uθ, 2D images that can be displayed. The projections are defined by:

|Spurple,θ(x,y,tm)|² = Σz |Spurple(x cosθ + z sinθ, y, z, tm)|²
|Sgreen,θ(x,y,tm)|² = Σz |Sgreen(x cosθ + z sinθ, y, z, tm)|²      (14)
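
The projection of Eq. (14) amounts to a sheared sum over z. A minimal NumPy sketch is given below, assuming the intensity map is indexed (z, y, x) and using nearest-neighbour sampling of the shifted x coordinate; it is an illustration, not the code used to produce the visualizations.

import numpy as np

def project(intensity, theta, dx, dz):
    # Eq. (14): project a 3D intensity map onto a 2D image along a direction
    # tilted by theta in the xz plane; dx and dz are the x and z pitches.
    nz, ny, nx = intensity.shape
    out = np.zeros((ny, nx))
    x = np.arange(nx)
    for iz in range(nz):
        xs = np.rint(x * np.cos(theta) + iz * (dz / dx) * np.sin(theta)).astype(int)
        valid = (xs >= 0) & (xs < nx)
        out[:, x[valid]] += intensity[iz][:, xs[valid]]
    return out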

We display the projection |Spurple,θ|2 in purple and |Sgreen,θ|2 in green in a video file of 122 images (Visualization 7, Fig. 7), where the time index varies from m = 1 to 122 in 122 steps, while the angle varies from θ = −30° to +30° in 61 steps and back from θ = +30° to −30° in 61 steps. Figure 7 shows images 1, 32 and 46 of the file Visualization 7, which correspond to time m = 1 and projection angle θ = −30° (a), m = 32 and θ = 0° (b), and m = 46 and θ = +15° (c). The individual RBCs and their 3D motion can be seen in the video file.


Fig. 7 Instantaneous projections |Spurple,θ|2 and |Sgreen,θ|2. The projections correspond to images 1, 32 and 46 of the video file Visualization 7, i.e. to time m = 1 and angle θ = −30° (a), m = 32 and θ = 0° (b), and m = 46 and θ = +15° (c). The display is made in arbitrary logarithmic scale: |Spurple,θ|2 is displayed in purple, |Sgreen,θ|2 in green. Bar is 100 μm.


Fig. 8 Averaged projections 〈|Spurple,θ|2〉 and 〈|Sgreen,θ|2〉. The projections correspond to images 1, 41 and 61 of the video file Visualization 8, i.e. to angle θ = −40° (a), θ = 0° (b), and θ = +20° (c). The display is made in arbitrary logarithmic scale: 〈|Spurple,θ|2〉 is displayed in purple, 〈|Sgreen,θ|2〉 in green. Bar is 100 μm.


It should be stressed that the cleaning algorithm is nonlinear, in the sense that it emphasizes the moving scatterers whose signal is highest, which here happen to be the RBCs. Lower contributions to the signal, such as vasomotion, are not detected.

To visualize the entire trajectories of the RBCs, which correspond to the inside of the perfused blood vessels, we calculated the time-averaged projections 〈|Spurple,θ|2〉 and 〈|Sgreen,θ|2〉:

〈|Spurple,θ(x,y)|²〉 = (1/122) Σm=1…122 |Spurple,θ(x,y,tm)|²
〈|Sgreen,θ(x,y)|²〉 = (1/122) Σm=1…122 |Sgreen,θ(x,y,tm)|²      (15)

We display 〈|Spurple,θ|2〉 and 〈|Sgreen,θ|2〉 in a video file of 162 images (Visualization 8), where the angle varies from θ = −40° to +40° in 81 steps and back from θ = +40° to −40° in 81 steps. Figure 8 shows images 1 (a), 41 (b) and 61 (c) of file Visualization 8, which correspond to θ = −40° (a), θ = 0° (b), and θ = +20° (c). The 3D character of the blood vessels is clearly seen in the video file, since the images of the blood vessels depend on the projection angle θ.

To better visualize the 3D character of the RBC motion, we superimposed, in a video file (Visualization 9), the averaged projections 〈|Spurple,θ|2〉 and 〈|Sgreen,θ|2〉, which show the perfused blood vessels, with the instantaneous projections |Spurple,θ|2 and |Sgreen,θ|2, which show the motion of individual RBCs. In the video file Visualization 9, the time index varies from m = 1 to 122 in 122 steps, while the angle varies from θ = −30° to +30° in 61 steps and back from θ = +30° to −30° in 61 steps. The projections are displayed in arbitrary logarithmic scale: the averaged projections 〈|Spurple,θ|2〉 and 〈|Sgreen,θ|2〉 are displayed in blue and red, while the instantaneous projections |Spurple,θ|2 and |Sgreen,θ|2 are both displayed in green. With this choice of colors, the blood vessels appear purple, while the individual RBCs appear white. Figure 9 shows images 1 (a), 32 (b) and 46 (c) of file Visualization 9, which correspond to time index m = 1 and projection angle θ = −30° (a), m = 32 and θ = 0° (b), and m = 46 and θ = +15° (c). The 3D character of the blood vessels is clearly seen in the video file, and the 3D motion of the RBCs is also visible. As expected, the RBCs follow the blood vessel paths.
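
The color overlay of Visualization 9 can be reproduced with a simple RGB composition; a minimal sketch is given below, where the log scaling and normalization choices are assumptions of this illustration rather than the exact rendering used for the video files.

import numpy as np

def composite(avg, inst):
    # avg: time-averaged projection (drives red and blue, so vessels appear purple);
    # inst: instantaneous projection (drives green, so RBCs over a vessel appear white).
    def norm_log(img):
        img = np.log1p(np.asarray(img, dtype=float))
        m = img.max()
        return img / m if m > 0 else img
    a, g = norm_log(avg), norm_log(inst)
    return np.dstack([a, g, a])   # RGB image with values in [0, 1]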


Fig. 9 Averaged projections 〈|Spurple,θ|2〉 and 〈|Sgreen,θ|2〉 superimposed with the instantaneous projections |Spurple,θ|2 and |Sgreen,θ|2. The projections correspond to images 1, 32 and 46 of file Visualization 9, i.e. to time index m = 1 and projection angle θ = −30° (a), m = 32 and θ = 0° (b), and m = 46 and θ = +15° (c). The display is made in arbitrary logarithmic scale: the averaged projections are displayed in blue and red (blood vessels appear purple), the instantaneous projections in green (RBCs appear white). Bar is 100 μm.


8. Conclusion

In this paper, we proposed an original imaging technique that combines a digital holographic microscopy setup with dual illumination of the sample, and calculations involving both standard holographic reconstruction and 4D reconstruction by a cleaning algorithm. The setup uses an upright microscope that has been transformed into a heterodyne holographic device [24] working in transmission. The holograms Hpurple and Hgreen that correspond to the scattering of each illumination beam were extracted from the camera-recorded data Im. By making the hypothesis that the two holograms Hpurple and Hgreen were generated by the same scatterers (i.e. the moving red blood cells), the locations of the scatterers (xk, yk and zk) and their complex scattering amplitudes Spurple and Sgreen were calculated by a cleaning algorithm. The amplitudes Spurple and Sgreen are 4D data (x, y, z and t) that were used to generate video files showing in 4D the RBCs (Visualization 7), the blood vessels (Visualization 8) and the RBCs superimposed with the blood vessels (Visualization 9).

In comparison with other techniques dedicated to monitoring blood flow in vivo, including laser Doppler and laser speckle imaging, the proposed approach has the major advantage, thanks to the recording of holograms, of being able to measure the 3D aspects of the RBC movements. The proposed method can still be improved by using a more efficient cleaning algorithm for the 3D reconstruction and a setup allowing a larger separation angle between the two illumination beams. These improvements should yield faster calculations and higher z resolution. Future work should allow the calculation of 3D maps of velocity vectors by analyzing Hpurple and Hgreen at successive instants of time tm and tm+1.

Funding

Labex Numev (convention ANR-10-LABX-20)(2014-1-042).

Acknowledgments

We acknowledge O. Strauss for fruitful discussions and NVIDIA Corporation for the donation of the GTX Titan GPU.

References and links

1. E. Friedman, S. Krupsky, A. Lane, S. Oak, E. Friedman, K. Egan, and E. Gragoudas, “Ocular blood flow velocity in age-related macular degeneration,” Ophthalmology 102, 640–646 (1995). [CrossRef]   [PubMed]  

2. M. P. Pase, N. A. Grima, C. K. Stough, A. Scholey, and A. Pipingas, “Cardiovascular disease risk and cerebral blood flow velocity,” Stroke 43, 2803–2805 (2012). [CrossRef]   [PubMed]  

3. J. Kur, E. A. Newman, and T. Chan-Ling, “Cellular and physiological mechanisms underlying blood flow regulation in the retina and choroid in health and disease,” Prog. Retin. Eye Res. 31, 377–406 (2012). [CrossRef]   [PubMed]  

4. O. Sakurada, C. Kennedy, J. Jehle, J. Brown, G. L. Carbin, and L. Sokoloff, “Measurement of local cerebral blood flow with iodo[14C]antipyrine,” Am. J. Physiology-Heart and Circulatory Physiology 234, H59–H66 (1978).

5. I. Kanno, H. Iida, S. Miura, M. Murakami, K. Takahashi, H. Sasaki, A. Inugami, F. Shishido, and K. Uemura, “A system for cerebral blood flow measurement using an H215O autoradiographic method and positron emission tomography,” J. Cerebral Blood Flow & Metabolism 7, 143–153 (1987). [CrossRef]  

6. Y. Yeh and H. Cummins, “Localized fluid flow measurements with an He-Ne laser spectrometer,” Appl. Phys. Lett. 4, 176–178 (1964). [CrossRef]  

7. J. D. Briers and S. Webster, “Laser speckle contrast analysis (LASCA): a nonscanning, full-field technique for monitoring capillary blood flow,” J. Biomed. Opt. 1, 174–179 (1996). [CrossRef]   [PubMed]  

8. A. K. Dunn, “Laser speckle contrast imaging of cerebral blood flow,” Ann. Biomed. Eng. 40, 367–377 (2012). [CrossRef]  

9. D. Briers, D. D. Duncan, E. Hirst, S. J. Kirkpatrick, M. Larsson, W. Steenbergen, T. Stromberg, and O. B. Thompson, “Laser speckle contrast imaging: theoretical and practical limitations,” J. Biomed. Opt. 18, 066018 (2013). [CrossRef]   [PubMed]  

10. S. Yuan, A. Devor, D. A. Boas, and A. K. Dunn, “Determination of optimal exposure time for imaging of blood flow changes with laser speckle contrast imaging,” Appl. Opt. 44, 1823–1830 (2005). [CrossRef]   [PubMed]  

11. Y. Zeng, M. Wang, G. Feng, X. Liang, and G. Yang, “Laser speckle imaging based on intensity fluctuation modulation,” Opt. Lett. 38, 1313–1315 (2013). [CrossRef]   [PubMed]  

12. D. Gabor, “A new microscopic principle,” Nature 161, 777–778 (1948). [CrossRef]   [PubMed]  

13. E. N. Leith and J. Upatnieks, “Reconstructed wavefronts and communication theory,” J. Opt. Soc. Am. A 52, 1123–1128 (1962). [CrossRef]  

14. U. Schnars and W. Jüptner, “Direct recording of holograms by a CCD target and numerical reconstruction,” Appl. Opt. 33, 179–181 (1994). [CrossRef]   [PubMed]  

15. M. Atlan, M. Gross, P. Desbiolles, É. Absil, G. Tessier, and M. Coppey-Moisan, “Heterodyne holographic microscopy of gold particles,” Opt. Lett. 33, 500–502 (2008). [CrossRef]   [PubMed]  

16. S.-H. Lee, Y. Roichman, G.-R. Yi, S.-H. Kim, S.-M. Yang, A. van Blaaderen, P. van Oostrum, and D. G. Grier, “Characterizing and tracking single colloidal particles with video holographic microscopy,” Opt. Express 15, 18275–18282 (2007). [CrossRef]   [PubMed]  

17. F. C. Cheong, B. J. Krishnatreya, and D. G. Grier, “Strategies for three-dimensional particle tracking with holographic video microscopy,” Opt. Express 18, 13563–13573 (2010). [CrossRef]   [PubMed]  

18. J. Gao, J. A. Lyon, D. P. Szeto, and J. Chen, “In vivo imaging and quantitative analysis of zebrafish embryos by digital holographic microscopy,” Biomed. Opt. Express 3, 2623–2635 (2012). [CrossRef]   [PubMed]  

19. N. Verrier, D. Alexandre, and M. Gross, “Laser Doppler holographic microscopy in transmission: application to fish embryo imaging,” Opt. Express 22, 9368–9379 (2014). [CrossRef]   [PubMed]  

20. F. Saglimbeni, S. Bianchi, A. Lepore, and R. Di Leonardo, “Three-axis digital holographic microscopy for high speed volumetric imaging,” Opt. Express 22, 13710–13718 (2014). [CrossRef]   [PubMed]  

21. J. Högbom, “Aperture synthesis with a non-regular distribution of interferometer baselines,” Astronomy and Astrophysics Supplement Series 15, 417 (1974).

22. F. Soulez, L. Denis, C. Fournier, É. Thiébaut, and C. Goepfert, “Inverse-problem approach for particle digital holography: accurate location based on local optimization,” J. Opt. Soc. Am. A 24, 1164–1171 (2007). [CrossRef]  

23. F. Soulez, L. Denis, E. Thiébaut, C. Fournier, and C. Goepfert, “Inverse problem approach in particle digital holography: out-of-field particle detection made possible,” J. Opt. Soc. Am. A 24, 3708–3716 (2007). [CrossRef]  

24. F. Le Clerc, L. Collot, and M. Gross, “Numerical heterodyne holography with two-dimensional photodetector arrays,” Opt. Lett. 25, 716–718 (2000). [CrossRef]  

25. M. Westerfield, “The Zebrafish Book: a Guide for the Laboratory use of Zebrafish (Danio rerio)” (Institute of Neuroscience. University of Oregon, 1995).

26. S. Isogai, M. Horiguchi, and B. M. Weinstein, “The vascular anatomy of the developing zebrafish: an atlas of embryonic and early larval development,” Developmental biology 230, 278–301 (2001). [CrossRef]   [PubMed]  

27. N. Warnasooriya, F. Joud, P. Bun, G. Tessier, M. Coppey-Moisan, P. Desbiolles, M. Atlan, M. Abboud, and M. Gross, “Imaging gold nanoparticles in living cell environments using heterodyne digital holographic microscopy,” Opt. Express 18, 3264–3273 (2010). [CrossRef]   [PubMed]  

28. F. Verpillat, F. Joud, P. Desbiolles, and M. Gross, “Dark-field digital holographic microscopy for 3d-tracking of gold nanoparticles,” Opt. Express 19, 26044–26055 (2011). [CrossRef]  

29. N. Verrier, D. Alexandre, G. Tessier, and M. Gross, “Holographic microscopy reconstruction in both object and image half-spaces with an undistorted three-dimensional grid,” Appl. Opt. 54, 4672–4677 (2015). [CrossRef]   [PubMed]  

30. E. Cuche, P. Marquet, and C. Depeursinge, “Spatial filtering for zero-order and twin-image elimination in digital off-axis holography,” Appl. Opt. 39, 4070–4075 (2000). [CrossRef]  

31. L. Yu and M. K. Kim, “Wavelength-scanning digital interference holography for tomographic three-dimensional imaging by use of the angular spectrum method,” Opt. Lett. 30, 2092–2094 (2005). [CrossRef]   [PubMed]  

Supplementary Material (9)

Visualization 1: MP4 (8148 KB)
Visualization 2: MP4 (589 KB)
Visualization 3: MP4 (705 KB)
Visualization 4: MP4 (70 KB)
Visualization 5: MP4 (3687 KB)
Visualization 6: MP4 (13598 KB)
Visualization 7: MP4 (1362 KB)
Visualization 8: MP4 (130 KB)
Visualization 9: MP4 (776 KB)
