Optica Publishing Group

Damage modeling and statistical analysis of optics damage performance in MJ-class laser systems

Open Access

Abstract

Modeling the lifetime of a fused silica optic is described for a multiple-beam, MJ-class laser system. This entails combining optic processing data with laser shot data to account for the complete history of optic processing and shot exposure. Integrating with online inspection data allows the construction of a performance metric that describes how an optic performs with respect to the model. This methodology helps validate the damage model, enables strategic planning, and identifies potential hidden parameters that affect an optic's performance.

© 2014 Optical Society of America

1. Introduction

Laser-induced damage in optics has been intensively studied over the last several decades, driven by the development of megajoule (MJ)-class lasers such as the National Ignition Facility (NIF) [1, 2] for inertial confinement fusion (ICF) experiments. However, damage experiments have primarily focused on using smaller-aperture (mm2–cm2) laser systems to probe the dependence of damage on laser irradiance in optics [3–7]. It can be argued that these studies have been a success in that NIF not only delivered more than its 1.8-MJ (and 500-terawatt) goal in 2012 but has since maintained the capability to deliver these high-energy shots routinely.

This accomplishment was not easily achieved, and many obstacles had to be overcome. One of the most difficult was the change in perception of damage from a “deterministic” process to a “stochastic” one. Early studies of laser-induced damage in optics were often mired in measuring damage thresholds, with the perception that below the threshold no damage would be expected. As a result, full-sized optics, once installed, began to damage at test fluences at which witness samples had shown no damage. This required changing the initiation metric from damage threshold to damage density (ρ(ϕ)) [4], which recognizes that the damage threshold values we measure are actually the minimum detectable damage density (MDDD) dictated by the beam volume tested [8]. As data accumulated for laser-induced damage growth, it also became apparent that the rate of damage growth is better described by probability distributions [9, 10]. As a result, empirical models developed from small-aperture offline laser systems have been shown capable of predicting multi-shot damage growth results on a full-size, MJ-class laser system using our damage model tool (OpticsX) [10].

This understanding helped drive a new paradigm for operating high-energy laser systems, which are designed to accommodate laser-induced damage through a carefully developed series of recycling strategies [11]. This requires timely online inspection of the optics [12, 13] and the use of spare optics to switch out endangered ones. This loop strategy [11] has been very successful, as NIF has steadily increased the number of shots taken since becoming fully commissioned in 2009. Over that time, hundreds of full-size (43 x 43 cm2) NIF optics have been successfully repaired [11]. The challenge for modeling laser-induced damage in support of this recycling strategy is accurately modeling and effectively monitoring the optics’ performance in light of the large amount of data associated with each optic. Figure 1 outlines the many functionalities that this damage model supports, from verifying that the process controls and quality assurance put in place deliver adequate optic damage performance, to validating that the damage model provides sufficient accuracy to confidently predict the recycle rate (and therefore the procurement logistics) for different shot/operation campaigns.

Fig. 1 The damage model in support of the optics loop cycle requires a number of functionalities, such as the ability to verify that process controls are in place to produce adequate optics performance and the ability to predict the recycle rate for different operation configurations.

We should recognize that additional parameters, such as system cleanliness, optics processing, and so forth, can evolve and affect the damage performance of the optics. This adds an additional layer of complexity: the damage model must be able to evolve to monitor the optics’ performance over time. In this paper, we describe our approach to monitoring and evaluating the performance of optics installed on NIF. Our approach consists of using our damage model (OpticsX) and relevant databases to create an optics performance table (OPT) to systematically track, analyze, and monitor optic performance, accounting for varying shot histories, optic qualities, and environmental conditions (see Fig. 2). The OPT can support supply planning, as it provides assurance that optics are performing as expected. It can directly impact damage understanding by verifying that the model is accurate and identifying where the outliers are, and it can help with operations by providing metrics that can be evaluated against various system parameters should performance unexpectedly change because of an adjustment in operating procedures or system configuration.

Fig. 2 Flow chart of the process of generating the optics performance table (OPT).

2. Damage prediction model

NIF has an extensive optics database that records the processing and movement of all of its optics. For each optic in our inventory, we use a series of work-order routing steps and installation orders to break up an optic’s history into a series of “birth” cycles using the optics production reporting and tracking tool. Each cycle runs from when an optic is received from the vendor or is refinished, through its first installation and use. This series of start and end dates can then be used to gather laser shot information from our laser performance model (LPOM) system to assemble the series of laser shots to which an optic has been exposed. The LPOM parameters, which consist of laser energy, laser pulse shape, beam area, etc., are used as input parameters to our damage model (OpticsX), which predicts the number of initiations and the size of the damage sites given a series of laser fluences (see Fig. 2). OpticsX uses experimentally measured fluence-dependent damage density, ρ(ϕ), from small-aperture test stations [4] to predict damage performance. OpticsX calculates the number of initiations, N_init, per optic given the laser fluence probability density function (pdf), f(ϕ), and the empirically derived fluence-dependent damage density, ρ(ϕ), as follows:

N_init = A ∫ ρ(ϕ) f(ϕ) dϕ,
where A is the area of the optic. Because various pulse shapes are routinely used for different purposes, the pulse shape dependence is taken into consideration by scaling the fluence with respect to pulse shape [14]. As a result, the ϕ used for initiation calculations is the damage-equivalent fluence (e.g., equivalent to a 3-ns Gaussian width). This is straightforward for a single shot, but when the optic is shot many times, the fluence distribution has to account for the accumulated effects of shot-to-shot contrast. This is what we describe as the Max-of-N effect, and it is crucial to modeling optics lifetime because shot-to-shot contrast variations can substantially increase the number of initiations and therefore reduce the optic’s lifetime. One way to visualize this effect is to imagine the hot spot of the laser beam (i.e., fluence at high contrast) moving around the optic from shot to shot; eventually it will have covered a majority of the optic surface and, as a result, substantially raised the accumulated mean fluence. From the point of view of the optic’s defect precursors, these “moving” hot spots increase the likelihood that the higher fluence will coincide with the location of precursors, thereby increasing the number of damage sites with shot number. However, as the optic’s defect precursors are depleted by the laser, we should also see a saturation of the number of initiations vs. shots. For multiple-shot systems, the fluence distribution is replaced by the Max-of-N fluence distribution fmax(ϕ,N), where N is the number of shots [15, 16]. We did not include any potential effect of damage fatigue, since each optic sees at most hundreds of shots and the effect is not significant.
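The initiation integral and the Max-of-N substitution can be sketched numerically. In this Python sketch, the Gaussian fluence pdf, the power-law damage density, and every constant (mean fluence, 12% contrast, integration range) are illustrative assumptions rather than NIF-measured values; the Max-of-N pdf n·F(ϕ)^(n−1)·f(ϕ) is the standard maximum-of-N-draws distribution applied in simplified pointwise form.

```python
import math

def fluence_pdf(phi, mean, sigma):
    """Gaussian sketch of the single-shot fluence pdf f(phi)."""
    z = (phi - mean) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def fluence_cdf(phi, mean, sigma):
    """Corresponding cumulative distribution F(phi)."""
    return 0.5 * (1.0 + math.erf((phi - mean) / (sigma * math.sqrt(2.0))))

def max_of_n_pdf(phi, mean, sigma, n):
    """pdf of the max of n i.i.d. fluence draws: n * F(phi)**(n-1) * f(phi)."""
    return n * fluence_cdf(phi, mean, sigma) ** (n - 1) * fluence_pdf(phi, mean, sigma)

def damage_density(phi):
    """Hypothetical power-law rho(phi) in sites/cm^2 (illustrative only)."""
    return 1.0e-8 * phi ** 6

def expected_initiations(area_cm2, mean, sigma, n_shots, phi_hi=30.0, steps=3000):
    """N_init = A * integral of rho(phi) * f_max(phi, N) d(phi), trapezoid rule."""
    dphi = phi_hi / steps
    total = 0.0
    for i in range(steps + 1):
        phi = i * dphi
        w = 0.5 if i in (0, steps) else 1.0
        total += w * damage_density(phi) * max_of_n_pdf(phi, mean, sigma, n_shots)
    return area_cm2 * total * dphi

area = 43.0 * 43.0  # full-size NIF optic area, cm^2
# sigma = 12% contrast of an assumed 8 J/cm^2 mean fluence
one_shot = expected_initiations(area, mean=8.0, sigma=0.96, n_shots=1)
hundred = expected_initiations(area, mean=8.0, sigma=0.96, n_shots=100)
# Max-of-N effect: repeated exposure raises the effective peak fluence,
# so the expected initiation count grows with shot number.
```

With an increasing ρ(ϕ), the Max-of-N distribution always yields more expected initiations than a single shot, which is the saturation-before-depletion behavior described above.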

The damage size at initiation depends on the pulse duration of the initiation shot, but it is usually ~10 μm for ns-range pulses [17]. Each initiated damage site is then grown as follows:

D_n(ϕ) = D_{n−1} e^{α(ϕ)},
where α is the growth coefficient, ϕ is the laser fluence, and n is the shot index. What we have learned is that although the growth coefficient depends on the laser fluence and the size of the damage site, for a given shot fluence and site size α is stochastic in nature. The growth coefficient is found to follow a Weibull distribution [9] with probability density function g(α) given by:
g(α; λ, k) = (k/λ)(α/λ)^{k−1} exp(−(α/λ)^k),
where k and λ are the shape and scale parameters of the distribution. We have found empirical dependences of these parameters on laser fluence, pulse duration, and damage site size [6, 9, 18]. OpticsX uses a Monte-Carlo simulation to predict damage growth [10].
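A minimal Monte-Carlo sketch of this growth model in Python: each shot draws a growth coefficient α from a Weibull distribution and multiplies the site diameter by e^α. The shape/scale values, shot count, and trial count are illustrative assumptions, not the measured NIF parameters.

```python
import math
import random

def simulate_growth(d0_um=10.0, n_shots=50, shape=2.0, scale=0.05, seed=0):
    """Monte-Carlo growth of one damage site: D_n = D_{n-1} * exp(alpha),
    with alpha drawn per shot from a Weibull(shape, scale) distribution.
    shape/scale are illustrative, not the empirically fitted NIF values."""
    rng = random.Random(seed)
    d = d0_um
    for _ in range(n_shots):
        alpha = rng.weibullvariate(scale, shape)  # args: (scale, shape)
        d *= math.exp(alpha)
    return d

# Repeat trials to see the shot-to-shot stochasticity of the final size.
sizes = [simulate_growth(seed=s) for s in range(200)]
grown = sum(1 for d in sizes if d > 100.0)  # sites grown past 100 um
```

Because α is non-negative, every trial ends at or above the ~10 μm starting size, while the spread of final sizes reflects the stochastic growth coefficient.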

Figure 3 shows the simulation result for a NIF final optic installed for 384 days with 199 shots taken. The input to the OpticsX model consists of the series of equivalent laser fluences ϕ, ranging from below 1 J/cm2 to over 11 J/cm2, and equivalent Gaussian widths τ, ranging from less than 1 ns to 17 ns, as well as some assumptions on beam area and beam contrast, as these parameters are not measured directly. The OpticsX model uses these inputs to construct a fluence probability density distribution f(ϕ) for each shot and the Max-of-N effect of multiple shots to calculate the expected number of initiations [15, 16]. The output of the model consists of the expected number of initiations and grown sites (defined as sites grown to over 100 μm). We use 100 μm because the accuracy of the online data is significantly better at sizes above that. The actual observed values are also plotted for comparison, showing that the predicted results are within several sites of the observed values. The observed values are based on extensive analysis of data from NIF’s final optics damage inspection (FODI) tool [13], which has the sensitivity to detect micron-sized defects but can also produce a large number of false positives. The analysis algorithm uses advanced machine learning techniques to determine whether a detection is actually a valid, physical site (i.e., not a reflection or a particle on top of the surface). We also use simple rules to determine whether it is a damage site, such as whether its size grows with laser exposure, in which case we term it a “grown site.” The predictions are surprisingly accurate considering the ambiguity of some of the input parameters. A slightly smaller assumed beam contrast (well within the range of the system) would yield the same number of initiated sites as observed (~35 damage sites).

Fig. 3 Equivalent shot fluence ϕ at 3ns Gaussian (top), equivalent Gaussian pulse width τ (middle), and the predicted number of initiated sites and grown sites vs. shots (bottom) for an installed optic. The observed initiated sites (o) and grown sites (o) after the shot sequence are also plotted.

We use two metrics to simplify the performance analysis: one for initiation and one for growth. The damage initiation metric (DIM) is defined as the difference between the observed initiations (FODI-observed initiations) and the OpticsX predictions over the optic’s lifetime. Similarly, the damage growth metric (DGM) is defined as the difference between the observed grown sites (FODI-observed initiations that have surpassed 100 μm) and the OpticsX predictions. For example, the simulation shown in Fig. 3 would have DIM = −7 and DGM = 2. Note that a positive result means the optic performs worse than our baseline prediction, and a negative result means the optic performs better than our prediction.
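The two metrics reduce to simple differences. A Python sketch, reproducing the DIM = −7 and DGM = 2 of the Fig. 3 example; only the ~35 observed initiations is stated in the text, so the predicted and grown-site counts below are invented to make the differences come out right.

```python
def damage_metrics(obs_init, pred_init, obs_grown, pred_grown):
    """DIM/DGM: observed minus predicted; positive means the optic
    performed worse than the baseline model prediction."""
    return obs_init - pred_init, obs_grown - pred_grown

# ~35 observed initiations (from the text); predicted and grown-site
# counts are hypothetical, chosen to reproduce DIM = -7, DGM = 2.
dim, dgm = damage_metrics(obs_init=35, pred_init=42, obs_grown=5, pred_grown=3)
```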

3. Optics performance analysis

An optics performance table (OPT) is created by combining the results of the OpticsX model with data from various operations databases for the entire inventory of optics (see Fig. 2). For this study, we focus on a critical final optic for NIF: the wedged focusing lens (WFL) [1]. The WFL is fabricated from fused silica and is precision processed and AR coated to meet the high NIF specifications for finish and figure. Each optic’s lifetime runs from when it is first “born” (i.e., initial finish or refinish) until it is removed from the system. During an optic’s lifetime, it can be moved from beam to beam and can also be taken out for recycling for damage site mitigation [11, 19]. As long as these additional steps do not replenish the damage precursors, we can consider these recycle processes to have little or no effect on the damage performance of the optic. Nevertheless, parameters that relate to the optic’s material and processing, as well as installation and laser parameters, are collected and stored in the OPT (see Table 1) along with the performance metrics (DIM, DGM). It is worth noting that these damage metrics can be used to gauge the accuracy as well as the precision of the model. The proximity of these metrics to zero indicates the accuracy of the model. For example, the initiation metric is benchmarked against an empirically extrapolated damage density curve based on a sub-aperture test sample [4], and the extrapolation might not be equally accurate across fluences. The variability of these metrics indicates the precision of the model, and this is the most important result in terms of the applicability of the model: large variability yields large uncertainties in our predictions, while small variability gives us confidence that our model has accounted for the key factors that influence damage performance. Remember, our empirical damage understanding comes mostly from offline testing facilities, and it is not possible to entirely replicate the environment of a laser facility such as NIF. By carefully tracking these seemingly innocuous parameters (see Table 1), we can validate their significance by finding correlations with respect to our damage metrics.
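Screening a tracked OPT parameter for significance can be as simple as computing a correlation coefficient against the damage metric, column by column. A self-contained Python sketch; the parameter values and DIM values are made up for illustration.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between a tracked OPT parameter and DIM."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Invented example: number of recycles vs. DIM for six optics.
recycles = [0, 1, 2, 3, 4, 5]
dims = [2, 8, 15, 18, 25, 31]
r = pearson_r(recycles, dims)  # strong positive r would flag "recycle"
```

A parameter with a strong correlation to DIM (as "recycle" turns out to have in Section 3.1) is a candidate hidden variable worth investigating.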

Table 1. Parameters and Example of Features Collected in OPT

In Fig. 4, the cumulative distribution function (CDF) of DIM and DGM is shown for >200 distinct WFL optics that have been processed to have nominally the same damage performance using the advanced mitigation process (AMP) [20]. The model assumes that the optics are not affected by the recycling process. The result shows that almost all of the optics performed worse than predicted (i.e., DIM > 0). The mean DIM was 30 with a standard deviation of 47, implying that on average an optic had 30 more sites than our prediction, with very large variations from optic to optic. This means that not only was our damage initiation model significantly off, but we were not consistently accounting for the variation in damage performance. We see similar behavior in the damage growth metric, although with a smaller mean and standard deviation. This is not surprising, because the grown damage sites are always a subset of the initiated damage sites. For the remainder of this work, we focus mainly on the initiation metric (DIM), as this is the main driver for system performance. This result implies that a critical parameter is not being properly taken into consideration, the most obvious candidate being the potential effect of recycling on the performance of WFL optics.
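The empirical CDFs shown throughout this section, and the mean/standard-deviation summaries quoted with them, can be computed directly from the OPT's DIM column. A Python sketch with invented DIM values:

```python
import math

def ecdf(values):
    """Sorted (value, cumulative fraction) points of the empirical CDF."""
    xs = sorted(values)
    n = len(xs)
    return [(x, (i + 1) / n) for i, x in enumerate(xs)]

def mean_std(values):
    """Sample mean and standard deviation of a DIM column."""
    n = len(values)
    m = sum(values) / n
    var = sum((v - m) ** 2 for v in values) / (n - 1)
    return m, math.sqrt(var)

dims = [12, 30, -5, 47, 28, 60, 41, 3, 25, 59]  # invented DIM values
points = ecdf(dims)    # x/y pairs for a Fig. 4-style CDF plot
m, s = mean_std(dims)  # summary statistics quoted alongside the CDF
```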

Fig. 4 Empirical cumulative distribution function (CDF) of DIM (left) and DGM (right) for installed WFLs.

3.1 Material and processing

A NIF WFL optic is removed from online service and recycled when a pre-determined number of damage sites exceed a pre-determined size. The recycle process uses lasers to mitigate the existing damage sites so that they no longer grow with laser exposure; the procedure includes the following sequence:

  • 1. Chemical processing to strip and recoat the anti-reflection coating
  • 2. Laser damage site mitigation
  • 3. Installation processing

Because the optic is not subjected to any grinding or polishing, it had been assumed that this process would not interrupt the lifetime of an optic as a refinishing or etching process would be expected to. By tracking how often an optic has been recycled for damage site mitigation, we can break down its performance as a function of whether it has been through the recycle loop. Figure 5a shows the optics performance (DIM) CDF for the WFLs installed online as a function of whether they were recycled or not. It is clear from the plot that the optics that have been recycled consistently have a worse DIM than those that have not. There are two possible explanations for the correlation between recycling and the damage initiation metric. One possibility is that the recycle process as a whole degrades initiation performance by changing ρ(ϕ) itself, so that subsequent reinstallation and exposure of the optic to laser fluence produces more damage sites than the original ρ(ϕ) would predict. A second possibility is that the recycle process as a whole resets the optic’s lifetime just as refinishing would, implying that the recycling procedure replenishes the precursor state of the optic as if it were new. Subsequent reinstallation and exposure to laser fluence would then produce no benefit for optics whose precursors have already been initiated (i.e., the rate of initiated sites with shot number starts to slow after the maximum energy has been reached) but would start initiating from the new precursor states.

Fig. 5 Empirical cumulative distribution function (CDF) of optics performance (DIM) for installed WFLs, evaluated without resetting precursor states (5a, left) and with resetting precursor states (5b, right) whenever the optic is recycled.

These hypotheses can be investigated by changing the definition of an optic’s lifetime from when it is first “born” (i.e., initial finish or refinish) to include any pass through the recycle facility, i.e., the precursor states are replenished when the optic undergoes the recycling procedure. In this analysis, an optic that went through the recycle facility three times would have three separate lifetimes. This significantly expands our statistics, as we now have ~1,400 optic lifetimes for >200 optic parts. The same damage modeling (OpticsX) is carried out, and the results are assembled into a new OPT. Figure 5b shows a series of optic performance CDFs for WFLs installed online, with one CDF per number of recycles (i.e., 0 to 9). The results in Fig. 5b show very little difference in an optic’s performance (i.e., DIM) as a function of the recycle loop for the installed WFLs; this implies that our second hypothesis is most likely correct: recycling replenishes damage precursors.
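Splitting an optic's shot history into per-recycle lifetimes is a simple partitioning of the shot sequence at the recycle events. A Python sketch, with shot fluences and recycle points invented for the example:

```python
def split_into_lifetimes(shot_fluences, recycle_after):
    """Split an optic's shot history into separate 'lifetimes', resetting
    the precursor state (and hence the Max-of-N fluence) at each recycle.
    recycle_after: shot indices after which the optic was recycled."""
    lifetimes, current = [], []
    cut = set(recycle_after)
    for i, phi in enumerate(shot_fluences):
        current.append(phi)
        if i in cut:
            lifetimes.append(current)
            current = []
    if current:
        lifetimes.append(current)
    return lifetimes

# Hypothetical history: 7 shots, recycled after shots 2 and 4.
history = [4.2, 6.1, 8.3, 5.0, 7.7, 9.1, 6.4]
parts = split_into_lifetimes(history, recycle_after=[2, 4])
# parts -> [[4.2, 6.1, 8.3], [5.0, 7.7], [9.1, 6.4]]
```

Each resulting sub-list is then modeled as an independent lifetime, which is how one optic recycled three times contributes three entries to the ~1,400-lifetime OPT.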

A detailed analysis of a typical WFL (which has been recycled 6 times) is plotted in Fig. 6, which shows the difference in prediction accuracy when the model resets the precursor state, and therefore the Max-of-N fluence, whenever the optic is taken offline for recycling (marked in the figure as gray lines). Figure 6 clearly shows that the simulation that reset the precursor states (i.e., treating the optic as a new optic) accurately predicted the number of damage sites observed online, whereas the simulation that did not reset the precursor state predicted less than a third of the damage sites. Although the two simulations have identical inputs (i.e., shot fluences ϕ), the Max-of-N (MaxN) fluence differs dramatically between them. As is evident in Fig. 6, the simulation without resetting the Max-of-N fluence (MaxN ϕ) saturates after the third recycle (third gray line, shot #87); correspondingly, the number of initiations also saturates at that point. For the next three recycles, although the shot fluence accumulates, the shots are usually fairly low and rarely exceed the Max-of-N fluence; as a result, few additional initiations were expected after shot #87. When the Max-of-N fluence (along with the precursor state) is reset after each recycle, the number of expected initiations jumps at each successive recycle as new precursors are initiated. This is especially evident after three recycles, when the optic’s precursor sites would be nearly exhausted were it not for the resetting of the precursor state.

Fig. 6 Damage equivalent fluence vs. shot (left) and damage simulation results (right) for the same WFL, with two simulations: one that assumes no resetting of the precursor state and one that resets the precursor state (i.e., treats the optic as new) whenever the optic is recycled (gray lines).

The measured damage sites presented in Fig. 6 come from online inspections over each of the recycle loops and are plotted at the end of each recycle loop, since NIF does not inspect every optic after every shot. There is a considerable lag of observed damage vs. the simulation result, since the online damage inspection only labels a potential detection as a damage site once it has reached a large enough size to have good sizing accuracy and has been seen in enough inspections to plot a growth trajectory. Therefore, the exact curve of damage sites vs. shot is not as accurate, and we should focus more on the number of damage sites at the end of an optic’s lifetime. This result could have important implications for operations. The ability of the OPT not only to identify recycling as a surprisingly critical factor in determining optic performance but also to provide compelling evidence (see Fig. 5) illustrates this methodology’s value for feature discovery.

3.2 Laser contrast

After accounting for the influence of recycling, our results have an average DIM of ~11 with a standard deviation of ~23 (see Fig. 5b). This shows that while the experimentally derived ρ(ϕ) is fairly accurate, there are still variations in the DIM that are unaccounted for. Some of these variations can be attributed to the assumption we make about the beam contrast in our input. Currently, we do not have nearfield images of all of the 3ω beams, and as a result we assume that the contrast for the beam at 3ω is ~12%. However, diagnostics from the optical sensor output (after the main amplifier, before the transport mirrors and final optics) have shown that the 1ω contrast of NIF beams can be anywhere from 9–15%, based on calibration shots where all the beams were set nominally the same. We have also studied results from our 2007 precision diagnostic system (PDS) campaign, where a single NIF beam had high-resolution nearfield cameras installed for both the 1ω and 3ω beams, and saw a trend of ~3% enhancement going from 1ω to 3ω [15, 16]. Using this result, we can optimize the damage model by allowing the contrast to vary from 12 to 18% to account for variability in beam performance. The results (see Fig. 7) change the DIM substantially; the distribution is much sharper, with a substantial skew. The mean of the optimized DIM is now 7.6.
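Treating the unmeasured 3ω contrast as a free parameter within its plausible 12–18% range amounts to a one-dimensional scan that minimizes |DIM| per optic. A Python sketch, where the contrast-to-initiations model is a made-up stand-in for a full OpticsX run:

```python
def best_contrast(observed, predict, lo=0.12, hi=0.18, steps=25):
    """Scan the unmeasured 3-omega beam contrast over [lo, hi] and keep
    the value that minimizes |DIM| = |observed - predicted(contrast)|."""
    best_c, best_dim = None, None
    for i in range(steps + 1):
        c = lo + (hi - lo) * i / steps
        dim = observed - predict(c)
        if best_dim is None or abs(dim) < abs(best_dim):
            best_c, best_dim = c, dim
    return best_c, best_dim

# Stand-in model: initiations rising steeply with contrast (illustrative
# only, not an OpticsX calculation).
model = lambda c: 20.0 * (c / 0.12) ** 4
contrast, dim = best_contrast(observed=35.0, predict=model)
```

The optimized contrast absorbs part of the residual DIM, which is why the optimized distribution in Fig. 7 is sharper than the fixed-contrast one.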

Fig. 7 Empirical cumulative distribution function (CDF) of DIM for installed WFLs using fixed contrast and optimized contrast, for all optics or only new optics.

A better metric for evaluating the accuracy of our model is the percentage of optics that are within 10 sites of the measured mean. This makes sense because our damage initiation model accounts for damage precursors but not for other possible mechanisms, such as hot-spot intensification (from existing damage sites or phase objects in upstream optics) or contamination in the system itself, which can also lead to damage initiations. Damage from these mechanisms is nearly indistinguishable from our precursor-induced damage sites by our inspection diagnostic. For an optic of such large aperture, 10 sites is a reasonable tolerance to attribute to these mechanisms. Using this metric, our model is only 44% accurate at first (see Fig. 5a), meaning we were able to successfully predict damage for 44% of the optics that we modeled, a disappointing result. However, after we include the Max-of-N resetting for optics recycling, our accuracy rises to 58%, and after we factor in the ambiguity of the laser contrast (see Fig. 7), the accuracy reaches 72% (see Table 2). This accuracy increases to 84% when we include only an optic’s first installation (i.e., new optics). This could mean that although each recycle in general only replenishes the precursors, each additional handling and processing step could have a small additive effect and should be investigated further. This example illustrates a different aspect of using the OPT: determining the strength of dependence on an input parameter that is ambiguous (i.e., not measured). Although the accuracy of the damage model increases when the contrast is allowed to vary, we have not proved that the damage on the worse-performing optics comes from increased contrast of those shots, only that if the assumed range of contrast is valid, it would account for an additional 14% in accuracy.
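The accuracy metric is simply the fraction of optics whose DIM falls within ±10 sites. A Python sketch with invented DIM values:

```python
def model_accuracy(dims, tolerance=10):
    """Fraction of optics whose |DIM| is within the tolerance, i.e. the
    fraction of optics whose damage count the model predicted successfully."""
    return sum(1 for d in dims if abs(d) <= tolerance) / len(dims)

dims = [-3, 5, 12, 8, -15, 2, 40, 9, -1, 6]  # invented DIM values
acc = model_accuracy(dims)                   # 7 of 10 within +/-10 -> 0.7
```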

Table 2. Damage Performance Accuracy

3.3 Model verification

Now that we have validated our damage model, we can use the OPT to look back at process control and the performance of our quality assurance (QA) process. NIF fused silica blanks have been purchased from various vendors and AMP processed [20] under strict quality specifications over the years. Many of these specifications are aimed at maintaining damage performance; however, considering the differences in shot history over the years and the lack of damage testing on full-size production WFLs, it would have been difficult to assess the damage performance of the optics over time. With the OPT, we can easily plot the performance of optics that have been similarly processed over the years (see Fig. 8a). The results show that the performance of the WFLs has been very consistent over the years, with the exception of perhaps a small percentage (~15%) processed in the very early stages of AMP development. Similarly, when the performance of optics installed on beams with a polarization rotator (PR) is compared to those without a PR, the performance is nearly identical, in agreement with our assumption (see Fig. 8b). This example illustrates another aspect of the OPT: cross-validating that factors we believe should not contribute significantly to damage performance indeed do not.

Fig. 8 Empirical cumulative distribution function (CDF) of DIM for installed WFLs as a function of the year installed on NIF (left) and of whether the polarization state of the beam is rotated (PR) or not (No PR) (right).

Although each of NIF’s 192 beamlines is designed to be exactly the same apart from the exact beam path and orientation in the NIF chamber, there could be differences in performance from beamline to beamline that could have consistent and permanent effects on damage performance. We can track these potential differences by constructing a 2D color map (see Fig. 9) showing the performance of AMP WFLs (i.e., DIM) on each beamline as a function of time for the past 48 months. What becomes immediately clear is an improvement in the performance of the optics from the first 24 months to the last 24 months. There is some evidence of a few beamlines that seem to perform consistently poorly, but there is more evidence of a correlation with time, evidenced by a band of horizontal ripples in the 35–42 month period. This could point to a dependency on facility operations, such as shot type or frequency of shots. Continued tracking of DIM over time can help ensure the quality of the optics being used as well as the integrity of the laser performance and supporting systems.

Fig. 9 Color map of contrast-optimized DIM (optDIM) of AMP WFLs as a function of beamline and time (in months).

4. Conclusion

Damage modeling and prediction for an MJ-class laser system such as NIF is a complicated task because of the complexity of the facility, the number of beams, and the size of the optics involved. Because the models use empirical results from offline facilities with apertures ~2 orders of magnitude smaller and with different operating environments, it is important to be able to monitor and verify the damage model with respect to day-to-day operations as well as changes to the facility or processes. Using our damage model along with an integrated database, we were able to create an optics performance table (OPT) to evaluate system performance, verify damage models, and discover hidden parameters (such as recycling) that impact the system. This has allowed us to refine our model and increase its accuracy from 44% to potentially over 80%, which validates our understanding of laser-induced damage and imparts confidence to plan and project performance in different operating regimes.

Acknowledgments

The authors would like to acknowledge our wonderful colleagues at NIF for their contributions: Kris Fury for editing; T. Suratwala, M. Nostrand, and P. Whitman for helpful discussions; W. Carr, D. Cross, M. Negres, M. Norton, and the OSL team for all the offline damage data and rules that are the heart of the damage model; and finally M. Spaeth for setting us on this path so many years ago. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and funded through the LLNL office of LDRD. (LLNL-JRNL-660594)

References and links

1. P. Wegner, J. Auerbach, T. Biesiada, S. Dixit, J. Lawson, J. Menapace, T. Parham, D. Swift, P. Whitman, and W. Williams, “NIF final optics system: frequency conversion and beam conditioning,” Proc. SPIE 5341, 180–189 (2004).

2. C. A. Haynam, P. J. Wegner, J. M. Auerbach, M. W. Bowers, S. N. Dixit, G. V. Erbert, G. M. Heestand, M. A. Henesian, M. R. Hermann, K. S. Jancaitis, K. R. Manes, C. D. Marshall, N. C. Mehta, J. Menapace, E. Moses, J. R. Murray, M. C. Nostrand, C. D. Orth, R. Patterson, R. A. Sacks, M. J. Shaw, M. Spaeth, S. B. Sutton, W. H. Williams, C. C. Widmayer, R. K. White, S. T. Yang, and B. M. Van Wonterghem, “National Ignition Facility laser performance status,” Appl. Opt. 46(16), 3276–3303 (2007). [CrossRef]   [PubMed]  

3. M. A. Norton, L. W. Hrubesh, Z. L. Wu, E. E. Donohue, M. D. Feit, M. R. Kozlowski, D. Milam, K. P. Neeb, W. A. Molander, A. M. Rubenchik, W. D. Sell, and P. Wegner, “Growth of laser initiated damage in fused silica at 351 nm,” Proc. SPIE 4347, 468 (2001).

4. C. W. Carr, M. D. Feit, M. C. Nostrand, and J. J. Adams, “Techniques for qualitative and quantitative measurement of aspects of laser-induced damage important for laser beam propagation,” Meas. Sci. Technol. 17(7), 1958–1962 (2006). [CrossRef]  

5. M. A. Norton, A. V. Carr, C. W. Carr, E. E. Donohue, M. D. Feit, W. G. Hollingsworth, Z. Liao, R. A. Negres, A. M. Rubenchik, and P. J. Wegner, “Laser damage growth in fused silica with simultaneous 351 nm and 1053 nm irradiation,” Proc. SPIE 7132, 71321H (2008).

6. R. A. Negres, M. A. Norton, Z. M. Liao, D. A. Cross, J. D. Bude, and C. W. Carr, “The effect of pulse duration on the growth rate of laser-induced damage sites at 351 nm on fused silica surfaces,” Proc. SPIE 7504, 750412 (2009). [CrossRef]  

7. J. Bude, P. Miller, S. Baxamusa, N. Shen, T. Laurence, W. Steele, T. Suratwala, L. Wong, W. Carr, D. Cross, and M. Monticelli, “High fluence laser damage precursors and their mitigation in fused silica,” Opt. Express 22(5), 5839–5851 (2014). [CrossRef]   [PubMed]  

8. Z. M. Liao, R. Roussell, J. J. Adams, M. Runkel, W. T. Frenk, J. Luken, and C. W. Carr, “Defect population variability in deuterated potassium di-hydrogen phosphate crystals,” Opt. Mater. Express 2(11), 1612–1623 (2012). [CrossRef]  

9. R. A. Negres, Z. M. Liao, G. M. Abdulla, D. A. Cross, M. A. Norton, and C. W. Carr, “Exploration of the multiparameter space of nanosecond-laser damage growth in fused silica optics,” Appl. Opt. 50(22), D12–D20 (2011). [CrossRef]   [PubMed]  

10. Z. M. Liao, G. M. Abdulla, R. A. Negres, D. A. Cross, and C. W. Carr, “Predictive modeling techniques for nanosecond-laser damage growth in fused silica optics,” Opt. Express 20(14), 15569–15579 (2012). [CrossRef]   [PubMed]  

11. J. Folta, M. Nostrand, J. Honig, N. Wong, F. Ravizza, P. Geraghty, M. Taranowski, G. Johnson, G. Larkin, D. Ravizza, J. Peterson, B. Welday, and P. Wegner, “Mitigation of laser damage on National Ignition Facility optics in volume production,” Proc. SPIE 8885, 88850Z (2013).

12. A. Conder, J. Chang, L. Kegelmeyer, M. Spaeth, and P. Whitman, “Final Optics Damage Inspection (FODI) for the National Ignition Facility,” Proc. SPIE 7797, 77970P (2010).

13. L. M. Kegelmeyer, R. Clark, R. R. Leach Jr, D. McGuigan, V. M. Kamm, D. Potter, J. T. Salmon, J. Senecal, A. Conder, M. Nostrand, and P. K. Whitman, “Automated optics inspection analysis for NIF,” Fusion Eng. Des. 87(12), 2120–2124 (2012). [CrossRef]  

14. C. W. Carr, J. B. Trenholme, and M. L. Spaeth, “Effect of temporal pulse shape on optical damage,” Appl. Phys. Lett. 90(4), 041110 (2007). [CrossRef]  

15. Z. M. Liao, J. Huebel, J. Trenholme, K. Manes, and C. W. Carr, “Modeling max-of-N fluence distribution using measured shot-to-shot beam contrast,” Appl. Opt. 50(20), 3547–3552 (2011). [CrossRef]   [PubMed]  

16. Z. M. Liao, J. Hubel, J. B. Trenholme, and C. W. Carr, “Modeling max-of-n fluence distribution for optics lifetime,” in Laser-Induced Damage in Optical Materials: 2011, G. J. Exarhos, V. E. Gruzdev, J. A. Menapace, D. Ristau, and M. J. Soileau, eds. (2011).

17. C. W. Carr, M. J. Matthews, J. D. Bude, and M. L. Spaeth, “The effect of laser pulse duration on laser-induced damage in KDP and SiO2,” Proc. SPIE 6403, 64030K (2007).

18. R. A. Negres, G. M. Abdulla, D. A. Cross, Z. M. Liao, and C. W. Carr, “Probability of growth of small damage sites on the exit surface of fused silica optics,” Opt. Express 20(12), 13030–13039 (2012). [CrossRef]   [PubMed]  

19. I. L. Bass, V. G. Draggoo, G. M. Guss, R. P. Hackel, and M. A. Norton, “Mitigation of laser damage growth in fused silica NIF optics with a galvanometer scanned CO2 laser,” Proc. SPIE 6261, 62612A (2006).

20. T. I. Suratwala, P. E. Miller, J. D. Bude, W. A. Steele, N. Shen, M. V. Monticelli, M. D. Feit, T. A. Laurence, M. A. Norton, C. W. Carr, and L. L. Wong, “HF-Based Etching Processes for Improving Laser Damage Resistance of Fused Silica Optical Surfaces,” J. Am. Ceram. Soc. 94(2), 416–428 (2011). [CrossRef]  


Figures (9)

Fig. 1 The damage model supporting the optics loop cycle requires a number of capabilities, such as the ability to verify that process controls are in place to produce adequate optics performance and the ability to predict recycle rates for different operating configurations.
Fig. 2 Flow chart of the process of generating the optics performance table (OPT).
Fig. 3 Equivalent shot fluence ϕ at 3 ns Gaussian (top), equivalent Gaussian pulse width τ (middle), and the predicted number of initiated sites and grown sites vs. shots (bottom) for an installed optic. The observed initiated sites (o) and grown sites (o) after the shot sequence are also plotted.
Fig. 4 Empirical cumulative distribution function (CDF) plots of DIM (left) and DGM (right) for installed WFLs.
Fig. 5 Empirical cumulative distribution function (CDF) plots of optics performance (DIM) for installed WFLs, evaluated without resetting precursor states (5a, left) and with resetting precursor states (5b, right) whenever the optic is recycled.
Fig. 6 Damage-equivalent fluence vs. shot (left) and damage simulation results (right) for the same WFL, with two simulations: one that assumes no resetting of the precursor state and one that resets the precursor state (i.e., treats the optic as new) whenever the optic is recycled (gray lines).
Fig. 7 Empirical cumulative distribution function (CDF) plots of DIM for installed WFLs using fixed contrast and optimized contrast, for all optics or only new optics.
Fig. 8 Empirical cumulative distribution function (CDF) plots of DIM for installed WFLs as a function of the year installed on NIF (left) and of whether the polarization state of the beam is rotated (PR) or not (No PR) (right).
Fig. 9 Color map of contrast-optimized DIM (optDIM) of AMP WFL as a function of the beamline it is on and the time (in months).

Tables (2)


Table 1 Parameters and Example of Features Collected in OPT


Table 2 Damage Performance Accuracy

Equations (3)


$$N_{\text{init}} = A \int \rho(\phi)\, f(\phi)\, d\phi, \qquad (1)$$

$$D_n(\phi) = D_{n-1}\, e^{\alpha(\phi)}, \qquad (2)$$

$$g(\alpha; \lambda, k) = \frac{k}{\lambda}\left(\frac{\alpha}{\lambda}\right)^{k-1} \exp\!\left(-\left(\frac{\alpha}{\lambda}\right)^{k}\right). \qquad (3)$$
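Equations (2) and (3) together describe stochastic damage growth: on each shot a site's diameter is multiplied by e^α, with the growth coefficient α drawn from a Weibull distribution. The following is a minimal Monte Carlo sketch of that process; all parameter values (λ, k, initial diameter, shot count) are hypothetical placeholders for illustration, not the calibrated values used in the paper's OpticsX model.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical parameters -- illustrative only, not calibrated NIF values.
lam, k = 0.3, 1.5      # Weibull scale and shape for the growth coefficient alpha
n_sites = 1000         # number of damage sites to simulate
n_shots = 20           # number of laser shots
d0 = 10e-6             # initial site diameter (m)

# Eq. (2): each shot multiplies a site's diameter by exp(alpha), where
# Eq. (3): alpha is drawn per site, per shot, from a Weibull(lam, k) PDF.
diam = np.full(n_sites, d0)
for _ in range(n_shots):
    alpha = lam * rng.weibull(k, size=n_sites)  # NumPy's weibull is unit-scale
    diam *= np.exp(alpha)

print(f"median site diameter after {n_shots} shots: {np.median(diam) * 1e6:.0f} um")
```

Because α ≥ 0, sites never shrink under this model; the spread of the resulting diameter distribution reflects the shot-to-shot variability that motivates treating growth as a probability distribution rather than a deterministic rate.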