Spatio-spectral characterization of broadband fields using multispectral imaging

Abstract

Spatio-temporal coupling in the field of ultrashort optical pulses is a technical enabler for applications but can result in detrimental effects such as increased on-target pulse duration and decreased intensity. Spectrally resolved spatial-phase measurements of a broadband field are demonstrated using a custom multispectral camera combined with two different wavefront sensors: a multilateral spatial shearing interferometer based on an amplitude checkerboard mask and an apodized imaged Hartmann sensor. The spatially and spectrally resolved phase is processed to quantify the commonly occurring pulse-front tilt and radial group delay, which are experimentally found to be in good agreement with models.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

The spatially extended beam of ultrashort optical pulses experiences spectrally dependent effects that can be used in a variety of applications such as pulse stretching and compression in chirped-pulse amplification (CPA) [1], pulse shaping [2], optimization of nonlinear interactions [3–5], and the generation of light distributions with controllable group velocity in the transverse or longitudinal direction [6–8]. Nonuniform spectral or temporal properties that arise from the generation, amplification, and manipulation of optical pulses can, however, be detrimental [9–12], for example, by increasing the overall duration of an optical pulse focused on target. The most common examples of spatiotemporal coupling include pulse-front tilt (PFT) and radial group delay (RGD), which correspond to spatial variations of the time of arrival of the pulse with respect to a plane perpendicular to the average wave vector, i.e., phase components linear with respect to the optical frequency. These are typically related to angular dispersion and chromatic defocus, respectively. Both effects can arise in ultrafast laser systems, for example, because of compressor misalignment in a CPA system [10], phase-matching conditions in parametric amplifiers [12], or chromatic aberrations from imaging systems [11].

A multitude of pulse characterization techniques exist [13], but most of them are not designed or implemented to characterize spatial variations in the temporal properties of an ultrashort optical pulse. In the large majority of experimental cases, spatiotemporal coupling can be characterized with a linear technique that measures the relative spatial variations of the field without fully characterizing the field at any given point [14]. The phase reconstructed by most spatiotemporal diagnostics is φ(x,y,ω) + ψ(ω) [14–19], where ψ is an unknown spectrally dependent, spatially uniform phase that can be determined, if needed, by measuring the spectral phase at a particular point in the beam with a self-referencing technique or by interfering the field with a known reference. Measurement of spatiotemporal coupling typically requires the combination of spatial resolution, e.g., from a camera or an optical fiber scanned across the beam to be characterized, and spectral resolution, e.g., from spectral filtering, a grating spectrometer, or a Fourier-transform spectrometer. One strategy to measure the spatially and spectrally dependent phase φ is to combine a wavefront sensor with a frequency-resolving device, for example by performing wavefront measurements after a spectrometer or spectrally filtering the field before the wavefront sensor [20–22]. The techniques demonstrated so far only return the wavefront as a function of one spatial variable [20] or cannot operate in a single shot because the wavefront must be measured successively at different wavelengths [21,22].

We demonstrate the use of a multispectral camera to measure spatiotemporal coupling in a spectrally broadband field. The multispectral camera simultaneously resolves the incident fluence in the spatial and spectral domains using an array of spectrally selective filters over its pixelated chip. When used as the detection camera in a wavefront sensor, it allows for the simultaneous direct measurement of experimental traces at different optical frequencies ω1, ω2,… ωK, therefore leading to determinations of the wavefront at those frequencies, i.e., the function φ(x,y,ωj), up to an arbitrary frequency-dependent function. Wavelength-resolved aberration measurements of the human eye have been performed using three monochromatic sources and a standard red-green-blue (RGB) camera, where the spectral resolution was provided by the illumination test sources because of the broad passband and significant spectral overlap of the wavelength bands [23]. Our work, based on a custom commercial camera, demonstrates the possibility of performing single-shot spatiospectral characterization of ultrafast laser systems with adequate spectral resolution. Section 2 presents the general strategy and processing for this diagnostic. Section 3 presents an experimental demonstration performed with two different types of wavefront sensors: a Hartmann sensor based on an array of apodized holes [24] and a multiwave lateral shearing interferometer based on an amplitude checkerboard mask [25]. Both techniques lead to determinations of the pulse-front tilt and radial group delay that are in very good agreement with models. Section 4 discusses possible technical improvements and extensions of the demonstrated technique.

2. Principle

2.1 Multispectral camera

Multispectral imagers can capture an image in a small number of wavelength bands, while hyperspectral imagers have a much higher spectral resolution and number of resolved bands. These imagers are used in many applications such as environmental monitoring, forensic science, and biological analysis [26]. There are many technological solutions to implement a multispectral imager, but the use of a multispectral filter array over the chip of a monochrome camera is particularly interesting, owing to the simplicity and ability to perform single-shot measurement in a common-path configuration for all wavelengths [27]. Color cameras based on the RGB Bayer filter arrangement over a monochrome chip are ubiquitous [Fig. 1(a)], but they have very low spectral resolutions and are not spectrally adapted to common broadband laser systems. One possible arrangement for a custom multispectral camera, depicted in Fig. 1(b), is a k × k array of spectral filters centered at optical wavelengths λ1, λ2,…λK (K = k²) repeated in two dimensions across the camera chip. A single camera-frame acquisition determines the image fluence, sampled every k pixels, in each frequency band defined by the filters. Figure 1(c) corresponds to the layout of the camera that has been used for experiments; it will be described in more detail in Sec. 3.1.

Fig. 1. Examples of filter arrangements for detecting spectrally resolved fluence distributions with a monochrome camera. The arrangements correspond to (a) the Bayer filter arrangement commonly used in a color red-green-blue (RGB) camera; (b) a generic k × k filter arrangement, with k = 3; and (c) the filter arrangement for the multispectral camera used in this demonstration. The fluence in gray pixels corresponds to transitions between filter pixels and has not been used. Only the filters at λ1 = 800 nm, λ2 = 865 nm, and λ3 = 930 nm overlap with the spectral density of the source used for this demonstration.

2.2 Image processing

Processing the image measured by a multispectral camera requires splitting the image into its frequency-dependent sub-images (a process commonly referred to as “demosaicing”), then applying the wavefront-reconstruction algorithm to each sub-image. For example, the centroids of the spots generated by the holes of a Hartmann wavefront sensor, measured in the frame corresponding to λj, yield local wavefront slopes that are integrated to give the wavefront φ(x,y,λj). Because the wavefront is determined only up to a constant piston term at each wavelength, one cannot determine the spectrally dependent ψ(ω), as is expected from a technique based on linear time-stationary components [13].
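
As a concrete illustration of the demosaicing step, the following minimal Python sketch splits a raw mosaic frame into its wavelength sub-images. It assumes a one-to-one correspondence between filter pixels and camera pixels with the filter pattern anchored at pixel (0, 0); the function name and array shapes are illustrative, not from the original work.

```python
import numpy as np

def demosaic(frame: np.ndarray, k: int) -> np.ndarray:
    """Split a raw k x k mosaic frame into its k*k wavelength sub-images.

    Assumes one camera pixel per filter pixel, with the filter pattern
    anchored at pixel (0, 0). Returns an array of shape
    (k*k, H // k, W // k), one sub-image per band.
    """
    sub_images = [frame[i::k, j::k] for i in range(k) for j in range(k)]
    return np.stack(sub_images)

# Example: a 2 x 2 mosaic (four bands) on a 2048 x 2048 chip
raw = np.random.rand(2048, 2048)   # stand-in for a measured camera frame
bands = demosaic(raw, k=2)         # shape (4, 1024, 1024)
```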

Coupling between the different measured camera frames, i.e., cross contributions from different bands, can occur for various technical reasons. For example, there can be a spectral overlap between the different filters, or light after a given filter pixel might be incident on more than one camera pixel because of diffraction or misalignment of the filter array relative to the camera array. Coupling between bands might have a minor effect for qualitative applications of multispectral cameras, but it should be addressed for quantitative wavefront measurements since it is expected to lead to an underestimation of spatiotemporal coupling. The camera used for this demonstration was calibrated by placing it into a spectrophotometer and measuring the camera signal in each wavelength band as a function of the spectrophotometer wavelength. Because there are K bands corresponding to K wavelengths, this process results in a square K × K matrix P that links the measured signal M to the actual signal A:

$$\begin{pmatrix} M_1 \\ M_2 \\ \vdots \\ M_K \end{pmatrix} = \begin{pmatrix} P_{11} & P_{12} & \cdots & P_{1K} \\ P_{21} & P_{22} & & \vdots \\ \vdots & & \ddots & \\ P_{K1} & \cdots & & P_{KK} \end{pmatrix} \begin{pmatrix} A_1 \\ A_2 \\ \vdots \\ A_K \end{pmatrix}. \tag{1}$$
For example, when there is only light at λ1, only the A1 coefficient is nonzero, and the signal measured in frame j is Mj = Pj1A1. In the ideal case where there is no coupling between measured frames, the matrix P is diagonal. For a camera with adequate discrimination between wavelength bands, the diagonal coefficients of P are much larger than the nondiagonal elements; the determinant of P is therefore nonzero, and P can be inverted. The inverse matrix P⁻¹ is then used to decouple the wavelength frames for each acquisition of the multispectral camera, which means that the bias on the measured values Mj introduced by the coupling between different bands is removed and the actual spectral properties of the input source Aj are determined. The matrix P⁻¹ can be directly applied to the fluence measured at all pixels in a particular primary group of k × k pixels. For example, for k = 2, the actual fluence in band j for the group of 2 × 2 pixels identified by the indices (n,m) is calculated using the four measured fluences in that group [Fig. 2(a)], following:

$$A_j(n,m) = \sum_{l=1}^{4} P_{jl}^{-1} M_l(n,m). \tag{2}$$

Fig. 2. Application of decoupling matrix for a 2 × 2 multispectral camera using (a) only data in a particular 2 × 2 group and (b) data from all adjacent pixels.

A more-symmetric decoupling operation uses fluences measured in adjacent 2 × 2 pixel groups [Fig. 2(b)], for example:

$$\begin{aligned} A_2(n,m) = {} & P_{22}^{-1} M_2(n,m) + P_{24}^{-1}\left[M_4(n,m) + M_4(n,m+1)\right]/2 \\ & + P_{21}^{-1}\left[M_1(n,m) + M_1(n+1,m)\right]/2 \\ & + P_{23}^{-1}\left[M_3(n,m) + M_3(n,m+1) + M_3(n+1,m) + M_3(n+1,m+1)\right]/4. \end{aligned} \tag{3}$$

Similar expressions can be written for A1, A3, and A4.
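
Continuing the sketch above, the decoupling of Eq. (2) amounts to applying P⁻¹ along the band axis of the demosaiced stack. The matrix values below are hypothetical (diagonally dominant, with off-diagonal coupling of the order of 5%, similar to the camera described in Sec. 3.1):

```python
import numpy as np

# Hypothetical 4 x 4 coupling matrix: diagonally dominant,
# with off-diagonal elements ~5% of the diagonal ones
P = 0.05 * np.ones((4, 4)) + 0.95 * np.eye(4)
P_inv = np.linalg.inv(P)

def decouple(measured: np.ndarray, P_inv: np.ndarray) -> np.ndarray:
    """Apply Eq. (2): A_j(n, m) = sum_l P^-1_{jl} M_l(n, m).

    `measured` has shape (K, H, W): one demosaiced frame per band.
    The same inverse matrix is applied at every pixel group.
    """
    return np.einsum('jl,lnm->jnm', P_inv, measured)

actual = decouple(bands, P_inv)    # `bands` from the demosaicing sketch
```

The more-symmetric operation of Eq. (3) is obtained in the same way, after replacing the single-group values Ml(n,m) by the averages over adjacent groups written above.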

2.3 Processing of spectrally resolved wavefronts

For a large number of resolved wavelengths K, the spectral field determined from the measured wavefronts and fluences can be Fourier transformed to the time domain to express the spatiotemporal field. This is, however, not necessary to quantify common spatiotemporal coupling effects described by low polynomial orders in the frequency ω, in particular PFT and RGD. The spatiospectral phase φ(x,y,ω) including achromatic terms can be expressed as

$$\varphi(x,y,\omega) = \left(p_x x + p_y y + q r^2\right)(\omega - \omega_0) + \frac{2\pi}{\lambda}\left[t_x x + t_y y + \frac{r^2}{2R} + Q(x,y)\right]. \tag{4}$$

The achromatic terms include tilts in the x and y directions (tx,ty), curvature (1/2R), and other terms [Q(x,y)]. These achromatic terms are removed by subtracting measurements taken at two wavelengths λi and λj. We define

$$T_{ij}(x,y) \equiv -\frac{\lambda_0}{c}\left[\frac{\dfrac{\lambda_i}{2\pi}\,\varphi(x,y,\lambda_i) - \dfrac{\lambda_j}{2\pi}\,\varphi(x,y,\lambda_j)}{\lambda_i - \lambda_j}\right] = p_x x + p_y y + q r^2. \tag{5}$$

Tij corresponds to the geometrical definition of group delay when λi approaches λj, in which case T = –(λ0/c) × (ΔOPD/Δλ), where ΔOPD is the variation in optical path difference between rays at two wavelengths separated by Δλ [28]. Measuring the wavefront at only two different wavelengths is therefore sufficient to determine the spatially resolved group delay averaged over the corresponding wavelength interval. When phase measurements are obtained at K different wavelengths, there are K(K–1)/2 combinations for which the calculation presented in Eq. (5) can be performed. More-accurate estimates of the group-delay coefficients px, py, and q across the pulse can be obtained as averages of the polynomial coefficients determined by fitting each Tij, but these combinations can also be used to determine the variations of the group delay as a function of the optical frequency, i.e., spectral terms of polynomial orders equal to 2 and higher.
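
A minimal sketch of this processing step is given below: Eq. (5) is evaluated from two reconstructed phase maps, and the group-delay coefficients are obtained by a linear least-squares fit. A piston term is included in the fit because each wavefront is only known up to a constant; all names and argument conventions are illustrative.

```python
import numpy as np

C = 2.998e8   # speed of light, m/s

def group_delay_map(phi_i, phi_j, lam_i, lam_j, lam0):
    """Eq. (5): spatially resolved group delay from phase maps (radians)
    measured at two wavelengths (meters). Returns T_ij in seconds."""
    opd_i = lam_i / (2 * np.pi) * phi_i   # optical path difference maps
    opd_j = lam_j / (2 * np.pi) * phi_j
    return -(lam0 / C) * (opd_i - opd_j) / (lam_i - lam_j)

def fit_pft_rgd(T, x, y):
    """Least-squares fit of T = px*x + py*y + q*r^2 + t0; the piston t0
    absorbs the arbitrary constant phase at each wavelength."""
    X, Y = np.meshgrid(x, y)
    A = np.column_stack([X.ravel(), Y.ravel(),
                         (X**2 + Y**2).ravel(), np.ones(X.size)])
    coeffs, *_ = np.linalg.lstsq(A, T.ravel(), rcond=None)
    px, py, q = coeffs[:3]
    return px, py, q
```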

3. Experimental demonstration

3.1 Setup

An experimental demonstration has been performed using the setup shown in Fig. 3(a). A broadband, spatially coherent field distribution is generated by combining two fiber-coupled superluminescent light-emitting diodes (SLEDs) [spectrum shown in Fig. 3(b)], followed by collimation by an off-axis spherical mirror and apodization to an ~1-cm beam. This beam is free of spatiotemporal coupling because the setup is achromatic up to that point. Controlled amounts of RGD and PFT can be introduced by propagation in an imaging system composed of singlet lenses and a wedge, respectively. The RGD introduced by the two fused-silica lenses (radius of curvature ρ = 51.5 mm) is twice that introduced by a single lens and is characterized by the coefficient

$$q = \frac{\lambda}{c\rho}\frac{\partial n}{\partial \lambda}, \tag{6}$$

where c is the speed of light and ∂n/∂λ is the dispersion at the central wavelength of the source, i.e., q = –0.85 fs/mm². The calculated coefficient for the PFT magnitude introduced by the fused-silica wedge (angle α) is
$$p = -\frac{\partial n}{\partial \lambda}\,\frac{\lambda \tan(\alpha)}{c}. \tag{7}$$
This leads to p = 0.74 fs/mm for α = 1° and p = 2.2 fs/mm for α = 2.9°. Rotation of the wedge around the beam-propagation axis introduces angular dispersion in any direction, leading to PFT of known magnitude and controllable direction.
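
As a numerical check of Eqs. (6) and (7), the short script below evaluates both coefficients from the Malitson Sellmeier fit for fused silica, using a central-difference derivative for ∂n/∂λ; the assumed central wavelength (865 nm) and the derivative step are illustrative choices, not values from the original work.

```python
import numpy as np

def n_fused_silica(lam_um):
    """Refractive index of fused silica (Malitson Sellmeier fit),
    wavelength in microns."""
    B = (0.6961663, 0.4079426, 0.8974794)
    C = (0.0684043**2, 0.1162414**2, 9.896161**2)
    n2 = 1.0 + sum(b * lam_um**2 / (lam_um**2 - c) for b, c in zip(B, C))
    return np.sqrt(n2)

c = 2.998e8       # speed of light, m/s
lam = 0.865       # assumed central wavelength of the source, microns
h = 1e-4          # step for the central-difference derivative, microns
dn_dlam = (n_fused_silica(lam + h) - n_fused_silica(lam - h)) / (2 * h) * 1e6  # 1/m

rho = 51.5e-3     # lens radius of curvature, m
q = (lam * 1e-6) / (c * rho) * dn_dlam                      # Eq. (6), s/m^2
p = -dn_dlam * (lam * 1e-6) * np.tan(np.radians(2.9)) / c   # Eq. (7), s/m

print(f"q = {q * 1e9:.2f} fs/mm^2")   # roughly -0.85 fs/mm^2
print(f"p = {p * 1e12:.2f} fs/mm")    # roughly 2.2 fs/mm
```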

Fig. 3. (a) Setup for experimental demonstration with an apodized imaged Hartmann sensor. The multiwave lateral shearing interferometer can be located where the Hartmann mask is represented. (b) Spectral density of the source and center wavelengths of the three multispectral-camera filters used.

The multispectral camera is a four-band camera designed for biomedical applications (Spectral Devices, MSC-BIO-1-A) [29]. It has four wavelength bands centered at 730 nm, 800 nm, 865 nm, and 930 nm. As shown in Fig. 3(b), only the latter three bands overlap with the spectral density of the source; consequently, these bands were indexed 1, 2, and 3, while the unused band was given the index 0. The filters have a 5-nm bandwidth and are arranged in a 2 × 2 set [Fig. 1(c)]. For this particular camera, the filter-pixel size (11 μm) is twice the camera-pixel size (5.5 μm). The filter array is aligned so that pixel centers in the filter array match pixel centers in the camera array every 11 μm, but transitions between filter pixels and contributions from adjacent filters essentially render the signal at other pixels unusable. Note that this is not a fundamental limitation of multispectral cameras, which are commercially available with a one-to-one correspondence between filter pixels and camera pixels. For this particular camera chip, which has 2048 × 2048 pixels, four 512 × 512 wavelength-resolved frames are obtained. The coupling matrix P was determined by measuring the signal in each wavelength band as a function of illumination wavelength, with the camera located inside a spectrophotometer. The nondiagonal elements were of the order of 5% of the diagonal elements. Frames were decoupled using expressions similar to that given by Eq. (3).

3.2 Demonstration with an apodized imaged Hartmann wavefront sensor

The multispectral camera has been combined with an apodized imaged Hartmann wavefront sensor [24]. This sensor operates on the principle of a mask-based Hartmann sensor, in which transparent holes in an opaque medium generate beamlets in directions that depend on the local wavefront slope. In a conventional Hartmann sensor, photodetection is performed directly after the mask with a camera, and the centroids of the spots are used to determine the local wavefront slope in two directions, from which the wavefront can be reconstructed [30]. In this demonstration, we used an achromatic imaging system (two identical achromats with a focal length equal to 20 cm) after the Hartmann mask to image the mask onto the multispectral camera. The main advantage of this setup is that it allows one to modify the longitudinal propagation distance δz simply by translating the camera or the mask. It also allows one to measure a reference calibration frame, for which the mask is exactly imaged on the camera (δz = 0), or to measure two frames corresponding to two nonzero longitudinal distances δz1 and δz2 between the mask image and the detection plane, allowing for wavefront reconstruction using δz = δz2 − δz1. Apodization of the holes using spatially dithered binary pixel distributions has been shown to benefit accuracy [24].

In this work, the holes correspond to a fourth-order super-Gaussian profile with a full width at half maximum equal to 240 μm, synthesized with 2.5-μm pixels, and are distributed with a 500-μm period in both transverse directions. Two frames were measured by the multispectral camera for two longitudinal mask positions separated by 25 mm. Each frame was demosaiced, decoupled, and then processed to yield the centroids corresponding to each mask hole. Figure 4(a) shows a close-up of the camera frame around one mask hole, while Figs. 4(b)–4(d) show the demosaiced camera frame at 865 nm for three different longitudinal positions of the mask as it is moved away from the object plane. The relative displacement observed between the two frames yields a map of wavefront slopes in the two transverse directions at each wavelength, from which the wavefront can be reconstructed, up to a constant phase. The spectrally resolved wavefronts can then be combined to yield information such as group delay versus position, which can be fitted to obtain the PFT and RGD. This demonstration, performed with one particular experimental implementation, indicates that the same approach could be used with other Hartmann-like techniques, such as the standard (nonimaged) Hartmann sensor [31] and the lenslet-based Shack–Hartmann sensor [32].
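
The sketch below illustrates the core of this processing for one wavelength band: intensity-weighted centroids are computed around each nominal hole position in the two frames, and their relative displacement divided by the mask translation gives the local wavefront slopes. Grid positions, window size, and unit handling are illustrative; the slope maps would then be integrated into a wavefront by standard least-squares methods [30].

```python
import numpy as np

def centroid(window, x0, y0):
    """Intensity-weighted centroid of a spot inside `window`, whose
    top-left corner sits at (x0, y0) in the full frame."""
    ys, xs = np.indices(window.shape)
    w = window.sum()
    return x0 + (xs * window).sum() / w, y0 + (ys * window).sum() / w

def wavefront_slopes(frame1, frame2, grid, half, dz, pitch):
    """Wavefront slopes at each Hartmann hole from two frames separated
    by a longitudinal distance dz. `grid` lists nominal (x, y) spot
    positions in pixels, `half` is the search-window half-size in
    pixels, and `pitch` converts pixels to the physical length unit
    used for dz."""
    sx, sy = [], []
    for x, y in grid:
        w1 = frame1[y - half:y + half, x - half:x + half]
        w2 = frame2[y - half:y + half, x - half:x + half]
        c1 = centroid(w1, x - half, y - half)
        c2 = centroid(w2, x - half, y - half)
        sx.append((c2[0] - c1[0]) * pitch / dz)   # dimensionless slope
        sy.append((c2[1] - c1[1]) * pitch / dz)
    return np.array(sx), np.array(sy)
```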

Fig. 4. (a) Close-up of a full camera frame around one mask hole with the mask at the object plane. [(b)–(d)] Close-up of demosaiced frames at 865 nm around one mask hole for three different longitudinal locations of the mask, corresponding to a total displacement of 50 mm away from the object plane.

The wavefronts measured at λ1, λ2, and λ3 without propagating in the singlet imaging system and wedge show the astigmatism expected from the off-axis collimation mirror (Fig. 5). Propagation in the two-lens imaging system results in an additive frequency-dependent spatially quadratic wavefront [Figs. 6(a)–6(c)]. Combining two of these wavefronts leads to the group delay as a function of x and y [Figs. 6(d) and 6(e)]. As expected, the field distribution has a radially dependent group delay. The radial group-delay coefficient is found to be −0.82 fs/mm² and –0.81 fs/mm² when fitting the group delay obtained from data at (λ1,λ2) and (λ2,λ3), respectively. These values are in good agreement with the expected value of −0.85 fs/mm². When the beam propagates into the wedge, it acquires a wavelength-dependent tilt [Figs. 7(a)–7(c)] in a direction that depends on the wedge orientation. The spatially resolved group delay is a linear function of the spatial variables (x,y), from which the magnitude and angle of p relative to the axes can be determined [Figs. 7(d)–7(f)]. As shown in Fig. 7(g), rotation of the wedge about the propagation axis of the beam changes the angle of p but not its magnitude. The average magnitude of p, retrieved over eight different orientations of the wedge corresponding to relative rotations of 45°, is 2.04 fs/mm, which is in reasonable agreement with the expected value (2.20 fs/mm). The pulse-front tilt introduced by a 1° wedge was characterized identically and led to similar absolute discrepancies. The measured coefficient is smaller than the expected coefficient, as was also observed when quantifying radial group delay; the discrepancy might result from residual coupling between different frames. It could also be caused by propagation effects in the achromatic optical system that images the Hartmann mask on the multispectral camera. While this imaging system is convenient for calibration purposes, implementation of a Hartmann-based wavefront sensor with a mask close to the photodetection plane would remove potential issues introduced by the imaging system, while making the device more compact.

Fig. 5. Experimental characterization of the field after the off-axis spherical mirror. [(a)–(c)] Wavefronts measured at λ1, λ2, and λ3, respectively.

Fig. 6. Experimental characterization of the field after propagation in the two-singlet imaging system introducing RGD. [(a)–(c)] Wavefronts measured at λ1, λ2, and λ3, respectively; [(d) and (e)] spatially resolved group delay calculated by combining data at λ1 and λ2 and data at λ2 and λ3, respectively, using Eq. (5).

Fig. 7. Experimental characterization of the field after propagation in a 2.9° fused-silica wedge. [(a)–(c)] Wavefronts measured at λ1, λ2, and λ3, respectively, for a wedge introducing angular dispersion in the x direction. [(d)–(f)] Spatially resolved group delay for three different orientations of the wedge using Eq. (5). (g) Coefficients of the linear fit of the spatially resolved group delay versus x and y for eight different orientations of the wedge (markers) and expected value for the PFT (red line).

3.3 Demonstration with a multiwave lateral shearing wavefront sensor

Spatiospectral characterization was also demonstrated with a multiwave lateral shearing wavefront sensor [25,33]. An incident field transmitted through a checkerboard amplitude mask mounted in front of a camera diffracts into multiple replicas of itself, which are laterally sheared with respect to one another after propagation away from the mask. Fourier analysis of the resulting interferogram produces phase-derivative maps in the two orthogonal transverse directions, from which the wavefront is reconstructed [30]. It is straightforward to extend this scheme to multispectral wavefront sensing by using a multispectral imager instead of a regular monochrome camera.

An example of a raw image is shown in Fig. 8(a), where a checkerboard mask with period p = 240 μm is used. The raw image is decoupled into its constituent spectral components [Figs. 8(b)–8(d)]. Because of the Talbot effect, the interference contrast is not uniformly retained over the entire bandwidth. The contrast is controlled by adjusting the distance Z between the mask and the sensor. The distance is set such that the best contrast is obtained at the center wavelength (865 nm), whereas the data at 800 and 930 nm show signs of degraded contrast. Despite the contrast degradation, phase gradients are extracted with sufficient accuracy because the phase information is encoded in the location of the fringe features. The spatial resolution in one transverse direction is given by the spatial shear between orders generated by the grating in the detection plane [25], which is equal to λZ/p = 245 μm. As in the case of the Hartmann sensor, wavefronts measured at different wavelengths were obtained and combined to yield the spatially resolved group delay in the field distribution, from which PFT and RGD can be obtained via fitting. We show only a subset of the obtained results, owing to their similarity with those presented in the previous subsection.
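
A minimal sketch of the Fourier demodulation step is shown below: the interferogram is Fourier transformed, one first-order side band (whose location is set by the mask period and geometry) is isolated, and the argument of its inverse transform gives the wrapped phase difference between sheared replicas along that direction. The carrier position and filter width are free parameters of this illustration; repeating the operation on the orthogonal side band and integrating the two unwrapped gradient maps yields the wavefront [30].

```python
import numpy as np

def sheared_phase(interferogram, carrier, halfwidth):
    """Takeda-style fringe analysis: isolate one first-order side band
    of the shearing interferogram and return the wrapped phase map,
    which encodes the phase difference between sheared replicas.

    `carrier` = (u, v) side-band offset from the DC term, in FFT pixels;
    `halfwidth` = half-size of the square band-pass window.
    """
    F = np.fft.fftshift(np.fft.fft2(interferogram))
    ny, nx = F.shape
    cy, cx = ny // 2 + carrier[1], nx // 2 + carrier[0]
    window = np.zeros_like(F)
    window[cy - halfwidth:cy + halfwidth, cx - halfwidth:cx + halfwidth] = 1.0
    return np.angle(np.fft.ifft2(np.fft.ifftshift(F * window)))
```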

Fig. 8. (a) Close-up of a full camera frame; [(b)–(d)] corresponding decoupled frames at 800 nm, 865 nm, and 930 nm.

Figure 9(a) shows lineouts of the measured wavefronts at the center of the beam after propagation in the imaging system. As expected, the wavefronts at the different wavelengths differ by a quadratic term. The averaged spatially resolved group delay obtained from the wavefronts measured at λ1 and λ2 following Eq. (5) is parabolic, as expected [Fig. 9(b)]. The RGD deduced from the quadratic coefficients is –0.81 fs/mm², which agrees with the expected value (–0.85 fs/mm²) to within 5%. The PFT introduced by a 1° fused-silica wedge has also been measured [Fig. 9(c)]. The wedge is rotated about the beam axis to introduce pulse-front tilt in an arbitrary direction with a fixed magnitude of 0.74 fs/mm (red line), and measurements were taken for eight orientations of the wedge (black markers). The average measured PFT coefficient, 0.75 fs/mm, is in excellent agreement with the expected value. These results demonstrate that combining a multispectral camera with a multiwave lateral shearing interferometer is a viable approach to spatiospectral characterization.

Fig. 9. Experimental characterization of the field after propagation in the two-singlet imaging system and a 1° fused-silica wedge. (a) Wavefront lineouts at λ1, λ2, and λ3 after removal of linear terms. (b) Spatially resolved group delay after the two-lens imaging system calculated by combining data at λ1 and λ2 using Eq. (5). (c) Coefficients of the linear fit of the spatially resolved group delay versus x and y, for eight different orientations of the 1° wedge (markers), and expected value for the PFT (red line).

4. Discussion

Spatiospectral characterization using a wavefront sensor combined with a multispectral camera, as demonstrated here, has several practical advantages and can be extended in a variety of ways. A multispectral camera can be used to directly measure the fluence of a field distribution in different wavelength bands. This is a useful tool when working with broadband fields, for example, to diagnose wavelength-dependent beam profiles. In wavefront sensors such as the ones used for this demonstration, the local signal is proportional to the beam fluence; consequently, the wavelength-resolved beam profile is obtained simultaneously with the wavelength-resolved wavefront. This enables one to characterize more-complex spatiotemporal fields with spatially varying spectral density. Other wavefront sensors can be combined with a multispectral camera. The demonstrations presented in Secs. 2 and 3 show that this approach should be applicable with Hartmann sensors, Shack–Hartmann sensors, and various multiwave shearing interferometers. The use of a multispectral camera with other wavefront-metrology strategies, such as optical differentiation wavefront sensing [34] and phase diversity [35], is most likely possible but requires further studies. Because all these wavefront sensors are self-referenced and do not require a mutually coherent reference pulse, either provided as an independent pulse or generated from the test pulse itself, such a spatiospectral diagnostic is straightforward to implement. Because there is no need for spatial, spectral, or temporal scanning, single-shot operation is straightforward: a single acquisition suffices to reconstruct the frequency-resolved wavefront, and an accurate a priori calibration can be performed if required, for example using a collimated or reference spherical source [36]. This enables one to characterize low-repetition-rate laser systems for which techniques requiring multiple acquisitions are not practical and can be significantly impaired by shot-to-shot variations. Single-shot operation also allows for the characterization of on-shot effects at full energy, such as self-phase modulation of a high-energy beam and spatiotemporal coupling resulting from heating of a diffraction grating by a high-average-power beam.

The number of measurement spectral bands can be increased. For example, multispectral cameras with a 4 × 4 arrangement of filters are commercially available, spanning a range of wavelengths that encompasses the typical spectrum of broadband Ti:sapphire laser systems [37] and DKDP-based optical parametric chirped-pulse–amplification systems [38]. While only two wavelengths are required to quantify the average spatially resolved group delay in the field, from which pulse-front tilt and radial group delay can be derived, higher sampling density in the wavelength domain enables one to characterize higher-order variations in group delay, e.g., terms in ω² in the spatiospectral phase φ(x,y,ω). Finer sampling in the spectral domain, and correspondingly in the time domain, allows for the description of more-complex fields [39].
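
With wavefronts available at K wavelengths, one simple way to access such higher-order terms is a per-pixel polynomial fit of the phase versus frequency, as in the hypothetical sketch below; spatially uniform contributions [the unknown ψ(ω)] remain undetermined.

```python
import numpy as np

def fit_spectral_phase(phis, omegas, order=2):
    """Per-pixel polynomial fit of phi(x, y, omega) versus omega.

    `phis` has shape (K, H, W), one unwrapped phase map per frequency;
    K must be at least order + 1. Returns coefficient maps of shape
    (order + 1, H, W), highest order first (numpy.polyfit convention):
    the linear map is the spatially resolved group delay, and the
    quadratic map captures its frequency dependence.
    """
    K, H, W = phis.shape
    coeffs = np.polyfit(omegas, phis.reshape(K, -1), order)
    return coeffs.reshape(order + 1, H, W)
```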

Wavelength-resolved digital holography is a proven single-shot characterization technique that can sample the test field at more than 20 wavelengths [15]. Its estimated space–time-bandwidth product (equal to the number of independently resolved points in the spatiotemporal domain) is of the order of 10⁵. Other single-shot techniques for spatiotemporal-coupling characterization include multiple-slit spatiotemporal interferometry [40] and wavelength-resolved far-field mapping [41], but these are limited to low-order variations of the spatiospectral phase. With the current four-wavelength multispectral camera and a Hartmann mask sampling the entire 11.3-mm-square camera chip every 0.5 mm, the calculated space–time-bandwidth product is of the order of 2000 [4 × (11.3/0.5) × (11.3/0.5)]. When using the camera with the spatial shearing interferometer, the number of resolved spatial points in one direction is the ratio of the chip size to the shear, and the calculated space–time-bandwidth product is of the order of 8500 [4 × (11.3/0.245) × (11.3/0.245)]. A multispectral camera with a larger number of wavelength bands would allow for a simple increase in the space–time-bandwidth product. The feature size in the Hartmann mask and the period of the checkerboard mask were chosen in this experimental demonstration to accommodate the lower sampling obtained with the pixel arrangement shown in Fig. 1(c). A one-to-one correspondence between filter pixels and camera pixels would allow for a fourfold increase in the space–time-bandwidth product using feature sizes reduced by a factor of 2.

5. Conclusions

We have demonstrated the characterization of frequency-resolved wavefronts and spatiotemporal coupling using the combination of a custom multispectral camera and two types of wavefront sensors: an apodized imaged Hartmann sensor and a multiwave lateral shearing interferometer. Pulse-front tilt and radial group delay have been experimentally determined and are in good agreement with expected values for wedges and an imaging system built with singlet lenses. This strategy for spatiotemporal coupling characterization is particularly well adapted to broadband laser systems since it does not require the generation of a reference, is intrinsically common path, and is capable of single-shot operation.

Funding

Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944, University of Rochester, and the New York State Energy Research and Development Authority.

Acknowledgment

The authors thank J. Carson and M. Najiminaini, from Spectral Devices Inc., for fruitful discussion about the multispectral camera and providing the spectrophotometer camera calibration data.

Disclosures

The support of DOE does not constitute an endorsement by DOE of the views expressed in this article. This report was prepared as an account of work sponsored by an agency of the U.S. Government. Neither the U.S. Government nor any agency thereof, nor any of their employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the U.S. Government or any agency thereof. The views and opinions of authors expressed herein do not necessarily state or reflect those of the U.S. Government or any agency thereof.

References

1. I. Walmsley, L. Waxer, and C. Dorrer, “The role of dispersion in ultrafast optics,” Rev. Sci. Instrum. 72(1), 1–29 (2001).

2. A. M. Weiner, “Femtosecond pulse shaping using spatial light modulators,” Rev. Sci. Instrum. 71(5), 1929–1960 (2000).

3. O. E. Martinez, “Achromatic phase matching for second harmonic generation of femtosecond pulses,” IEEE J. Quantum Electron. 25(12), 2464–2468 (1989).

4. H. Vincenti and F. Quéré, “Attosecond lighthouses: How to use spatiotemporally coupled light fields to generate isolated attosecond pulses,” Phys. Rev. Lett. 108(11), 113904 (2012).

5. G. Zhu, J. van Howe, M. Durst, W. Zipfel, and C. Xu, “Simultaneous spatial and temporal focusing of femtosecond pulses,” Opt. Express 13(6), 2153–2159 (2005).

6. J.-C. Chanteloup, E. Salmon, C. Sauteret, A. Migus, P. Zeitoun, A. Klisnick, A. Carillon, S. Hubert, D. Ros, P. Nickles, and M. Kalachnikov, “Pulse-front control of 15-TW pulses with a tilted compressor, and application to the subpicosecond traveling-wave pumping of a soft x-ray laser,” J. Opt. Soc. Am. B 17(1), 151–157 (2000).

7. A. Sainte-Marie, O. Gobert, and F. Quéré, “Controlling the velocity of ultrashort light pulses in vacuum through spatio-temporal couplings,” Optica 4(10), 1298–1304 (2017).

8. D. H. Froula, D. Turnbull, A. S. Davies, T. J. Kessler, D. Haberberger, J. P. Palastro, S.-W. Bahk, I. A. Begishev, R. Boni, S. Bucht, J. Katz, and J. L. Shaw, “Spatiotemporal control of laser intensity,” Nat. Photonics 12(5), 262–265 (2018).

9. M. M. Wefers and K. A. Nelson, “Space-time profiles of shaped ultrafast optical waveforms,” IEEE J. Quantum Electron. 32(1), 161–172 (1996).

10. C. Fiorini, C. Sauteret, C. Rouyer, N. Blanchot, S. Seznec, and A. Migus, “Temporal aberrations due to misalignments of a stretcher-compressor system and compensation,” IEEE J. Quantum Electron. 30(7), 1662–1670 (1994).

11. Z. Bor, “Distortion of femtosecond laser pulses in lenses,” Opt. Lett. 14(2), 119–121 (1989).

12. J. Bromage, C. Dorrer, and J. D. Zuegel, “Angular-dispersion-induced spatiotemporal aberrations in noncollinear optical parametric amplifiers,” Opt. Lett. 35(13), 2251–2253 (2010).

13. I. A. Walmsley and C. Dorrer, “Characterization of ultrashort electromagnetic pulses,” Adv. Opt. Photonics 1(2), 308–437 (2009).

14. C. Dorrer and I. A. Walmsley, “Simple linear technique for the measurement of space-time coupling in ultrashort optical pulses,” Opt. Lett. 27(21), 1947–1949 (2002).

15. P. Gabolde and R. Trebino, “Single-shot measurement of the full spatio-temporal field of ultrashort pulses with multi-spectral digital holography,” Opt. Express 14(23), 11460–11467 (2006).

16. P. Bowlan, P. Gabolde, M. A. Coughlan, R. Trebino, and R. J. Levis, “Measuring the spatiotemporal electric field of ultrashort pulses with high spatial and spectral resolution,” J. Opt. Soc. Am. B 25(6), A81–A92 (2008).

17. G. Pariente, V. Gallet, A. Borot, O. Gobert, and F. Quéré, “Space–time characterization of ultra-intense femtosecond laser beams,” Nat. Photonics 10(8), 547–553 (2016).

18. S.-W. Bahk, C. Dorrer, R. G. Roides, and J. Bromage, “Chromatic-aberration diagnostic based on a spectrally resolved lateral-shearing interferometer,” Appl. Opt. 55(9), 2413–2417 (2016).

19. S.-W. Bahk, C. Dorrer, and J. Bromage, “Chromatic diversity: a new approach for characterizing spatiotemporal coupling of ultrashort pulses,” Opt. Express 26(7), 8767–8777 (2018).

20. E. Rubino, D. Faccio, L. Tartara, P. K. Bates, O. Chalus, M. Clerici, F. Bonaretti, J. Biegert, and P. Di Trapani, “Spatiotemporal amplitude and phase retrieval of space-time coupled ultrashort pulses using the Shackled-FROG technique,” Opt. Lett. 34(24), 3854–3856 (2009).

21. C. P. Hauri, J. Biegert, U. Keller, B. Schaefer, K. Mann, and G. Marowski, “Validity of wave-front reconstruction and propagation of ultrabroadband pulses measured with a Hartmann-Shack sensor,” Opt. Lett. 30(12), 1563–1565 (2005).

22. S. L. Cousin, J. M. Bueno, N. Forget, D. R. Austin, and J. Biegert, “Three-dimensional spatiotemporal pulse characterization with an acousto-optic pulse shaper and a Hartmann-Shack wavefront sensor,” Opt. Lett. 37(15), 3291–3293 (2012).

23. P. Jain and J. Schwiegerling, “RGB Shack–Hartmann wavefront sensor,” J. Mod. Opt. 55(4−5), 737–748 (2008).

24. C. Dorrer, A. Kalb, P. Fiala, S. W. Bahk, A. Sharma, and K. Gibney, “Investigation of an apodized imaged Hartmann wavefront sensor,” Appl. Opt. 57(25), 7266–7275 (2018).

25. S.-W. Bahk and C. Dorrer, “Wavefront sensing using a checkerboard amplitude mask,” in Imaging and Applied Optics, OSA Technical Digest (online) (Optical Society of America, 2013), Paper CM3C.4.

26. M. J. Khan, H. S. Khan, A. Yousaf, K. Khurshid, and A. Abbas, “Modern trends in hyperspectral image analysis: A review,” IEEE Access 6, 14118–14129 (2018).

27. P.-J. Lapray, X. Wang, J.-B. Thomas, and P. Gouton, “Multispectral filter arrays: recent advances and practical implementation,” Sensors 14(11), 21626–21659 (2014).

28. S.-W. Bahk, J. Bromage, and J. D. Zuegel, “Offner radial group delay compensator for ultra-broadband laser beam transport,” Opt. Lett. 39(4), 1081–1084 (2014).

29. Spectral Devices Inc., London, Ontario, Canada, N6G 4X8.

30. W. H. Southwell, “Wave-front estimation from wave-front slope measurements,” J. Opt. Soc. Am. 70(8), 998–1009 (1980).

31. D. Malacara-Hernández and D. Malacara-Doblado, “What is a Hartmann test?” Appl. Opt. 54(9), 2296–2301 (2015).

32. B. C. Platt and R. Shack, “History and principles of Shack-Hartmann wavefront sensing,” J. Refract. Surg. 17(5), S573–S577 (2001).

33. J. Primot, “Three-wave lateral shearing interferometer,” Appl. Opt. 32(31), 6242–6249 (1993).

34. J. Qiao, Z. Mulhollan, and C. Dorrer, “Optical differentiation wavefront sensing with binary pixelated transmission filters,” Opt. Express 24(9), 9266–9279 (2016).

35. S.-W. Bahk, J. Bromage, I. A. Begishev, C. Mileham, C. Stoeckl, M. Storm, and J. D. Zuegel, “On-shot focal-spot characterization technique using phase retrieval,” Appl. Opt. 47(25), 4589–4597 (2008).

36. A. Chernyshov, U. Sterr, F. Riehle, J. Helmcke, and J. Pfund, “Calibration of a Shack-Hartmann sensor for absolute measurements of wavefronts,” Appl. Opt. 44(30), 6419–6425 (2005).

37. S. Backus, C. G. Durfee III, M. M. Murnane, and H. C. Kapteyn, “High power ultrafast lasers,” Rev. Sci. Instrum. 69(3), 1207–1223 (1998).

38. V. V. Lozhkarev, G. I. Freidman, V. N. Ginzburg, E. V. Katin, E. A. Khazanov, A. V. Kirsanov, G. A. Luchinin, A. N. Mal’shakov, M. A. Martyanov, O. V. Palashov, A. K. Poteomkin, A. M. Sergeev, A. A. Shaykin, and I. V. Yakovlev, “Compact 0.56 petawatt laser system based on optical parametric chirped pulse amplification in KD*P crystals,” Laser Phys. Lett. 4(6), 421–427 (2007).

39. Z. Li and N. Miyanaga, “Simulating ultra-intense femtosecond lasers in the 3-dimensional space-time domain,” Opt. Express 26(7), 8453–8469 (2018).

40. Z. Li, N. Miyanaga, and J. Kawanaka, “Single-shot real-time detection technique for pulse-front tilt and curvature of femtosecond pulsed beams with multiple-slit spatiotemporal interferometry,” Opt. Lett. 43(13), 3156–3159 (2018).

41. A. Börzsönyi, L. Mangin-Thro, G. Chériaux, and K. Osvay, “Two-dimensional single-shot measurement of angular dispersion for compressor alignment,” Opt. Lett. 38(4), 410–412 (2013).
