Acoustic and plasma sensing of laser ablation via deep learning

Open Access

Abstract

Monitoring laser ablation when using high power lasers can be challenging due to plasma obscuring the view of the machined sample. Whilst the appearance of the generated plasma is correlated with the laser ablation conditions, extracting useful information is extremely difficult due to the highly nonlinear processes involved. Here, we show that deep learning can enable the identification of laser pulse energy and a prediction for the appearance of the ablated sample, directly from camera images of the plasma generated during single-pulse femtosecond ablation of silica. We show that this information can also be identified directly from the acoustic signal recorded during this process. This approach has the potential to enhance real-time feedback and monitoring of laser materials processing in situations where the sample is obscured from direct viewing, and hence could be an invaluable diagnostic for laser-based manufacturing.

Published by Optica Publishing Group under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

1. Introduction

Lasers are used in a variety of scientific and industrial areas, such as microscopy [1–3], spectroscopy [4,5], fusion [6], 3D mapping [7], telecommunications [8,9], cutting [10,11], marking [12–14], welding [15,16] and deposition [17,18], owing to their versatility, speed, efficiency, and precision [19]. The global industrial laser market was valued at $5.7 billion in 2018 [20], and is projected to experience a compound annual growth rate (CAGR) of 9.1% from 2022 to 2030 [21], with the main driving factor being the capability for shaping and cutting materials with high precision. As such, techniques focussed on improving the prediction and real-time monitoring of the laser parameters and the sample appearance during materials processing have the potential to be significantly valuable to both academia and industry. However, the plasma plume and associated light emitted during laser ablation can make it difficult to directly observe the workpiece during machining [22,23].

The plasma light observed when using high peak power and high average power lasers to ablate materials is caused by the conversion of material into plasma via ionization [24]. Whilst ultrashort pulsed lasers have the capability to ablate materials precisely with little or no heat-affected zone (HAZ) [25,26], they too produce plasmas. In laser ablation at relatively high intensity using ultrashort pulses (≥ 10¹² Wcm⁻²), the laser-material interaction is predominantly multiphoton ionization [27]. The ability to predict the laser power, and to produce an image of an ablated region that would otherwise be obscured by plasma, could therefore benefit real-time laser processing. However, modelling such phenomena is difficult due to the highly nonlinear processes involved in laser ablation.

Deep learning is a type of artificial intelligence that has seen increasing use over the past five years across many research domains, owing to advances in graphics card capability and in algorithms [28–30]. Deep learning is a data-driven approach that enables solutions to complex tasks to be identified through the processing of large amounts of data. Convolutional neural networks (CNNs) are a class of deep neural network that has been successful in image recognition tasks [31], and they have been used in the field of lasers for a variety of laser-based tasks [32], such as laser-based spectroscopy [33] and particulate sensing [34]. Specifically for laser manufacturing, CNNs have been used for laser ablation monitoring [35,36], laser welding [37] and laser powder bed fusion [38].

A variant of the CNN known as the conditional generative adversarial network (cGAN) [39] has been used for a wide range of image-to-image translations [40]. This technique has proven effective at synthesizing photos from sketches [41], colourising images [42], and transforming scattering patterns into images [43]. Related to laser machining, cGANs have been used for generating laser intensity patterns from fibres [44], laser welding [45], modelling laser-machined shapes [46], and microstructure prediction in laser sintering [47].

Previous results have demonstrated plasma sensing for real-time composition monitoring in laser additive manufacturing [48] and for non-contact monitoring of laser welding [49], as well as acoustic monitoring of laser welding [50,51] and laser ablation [52]. Here, we show how deep learning can be applied to identify laser machining parameters and sample appearance directly from images of the plasma (and from acoustic spectra) recorded during laser machining.

2. Experimental methods

2.1 Setup

A schematic of the experimental setup is shown in Fig. 1. A Light Conversion Pharos SP was used to produce 190 fs, 1 mJ pulses with a central wavelength of 1030 nm. Pulses were focussed using a 20× Nikon objective onto the surface of a coverslip (500 micron thick silica, glued to a 0.75 mm glass slide) to give a laser fluence of approximately 98 mJcm⁻² (with attenuation of the laser pulse energy scaling the laser fluence on the sample accordingly). The sample was attached to a Thorlabs XYZ motorized translation stage to allow automated translation of the sample within the laser focus. A Basler acA4112-20uc camera (4096 × 3000, RGB) was used to image the surface of the ablated coverslip (imaged along the laser axis), whilst a Basler daA1920-160uc camera (1914 × 1200, RGB) was used to image the emitted plasma, with the camera orientated perpendicular to the laser axis. A microphone (Adafruit USB microphone, 22.2 mm × 18.3 mm × 7.0 mm, 22.1 kHz) was placed approximately 2 millimeters beneath the coverslip, positioned just outside of the laser path to avoid any damage. Single pulses from the laser were triggered using Python software to enable the synchronization of image capture and microphone recording.

Fig. 1. Schematic of the experimental setup and corresponding examples of experimentally collected images and acoustic spectra for a single laser pulse incident on the target sample.

2.2 Data collection and processing

Data collection was automated using Python code so that camera images were recorded before, during and after the arrival of a single laser pulse on the sample; the white-light source illuminating the sample was blocked by a shutter for the camera images recorded during the laser pulse. Audio data was recorded for a total of 3 seconds, starting 1 second before each single laser pulse was triggered, and the top-down (surface) and side-view (plasma) camera images were recorded with integration times of 1200 ms and 300 ms, respectively. This per-pulse dataset (audio, plasma image and surface image) was collected once for every position in a square array, such that each laser pulse was incident on a fresh, unablated surface. A single laser pulse was fired at the sample surface each time the sample was moved by 75 microns in the X or Y direction in an 8 × 8 grid, so that data collection resulted in a square array of 64 ablated circles. Each pulse had a randomly chosen pulse energy from 0.2 mJ to 1 mJ in steps of 0.01 mJ (randomized to avoid bias in the images due to surface topography etc.). This 8 × 8 grid of data collection was repeated three times in fresh, unablated regions of the sample surface, producing three sets of data (ablated surface image, plasma image and acoustic signal) for each pulse energy. Collecting each per-pulse dataset took approximately 8 seconds, owing to the programmed synchronisation. The surface and plasma images were cropped and resized to 512 × 512 pixels. A minimal sketch of such an acquisition loop is shown below.
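As a rough illustration of the acquisition logic described above, the following Python sketch mirrors the grid scan, the randomized pulse energy, and the per-pulse synchronization. The helper functions (`move_stage`, `start_audio_recording`, `fire_single_pulse`, `grab_frame`) are hypothetical stand-ins for the instrument-control calls, not the authors' actual code.

```python
import random
import time

GRID = 8                       # 8 x 8 array of ablation sites
STEP_UM = 75                   # stage step between sites (microns)
ENERGIES_MJ = [round(0.2 + 0.01 * i, 2) for i in range(81)]  # 0.2 to 1.0 mJ

# Hypothetical instrument-control stubs, included only so the sketch runs.
def move_stage(x_um, y_um): pass
def start_audio_recording(duration_s): return "audio-handle"
def fire_single_pulse(energy_mj): pass
def grab_frame(view, exposure_ms): return f"{view}-frame"

def acquire_site(ix, iy):
    energy = random.choice(ENERGIES_MJ)            # randomized pulse energy
    move_stage(ix * STEP_UM, iy * STEP_UM)         # fresh, unablated surface
    audio = start_audio_recording(duration_s=3.0)  # non-blocking; 3 s total
    time.sleep(1.0)                                # pulse fires 1 s into the recording
    fire_single_pulse(energy)
    plasma = grab_frame("side", exposure_ms=300)   # side-view plasma image
    surface = grab_frame("top", exposure_ms=1200)  # top-down surface image
    return energy, audio, plasma, surface

dataset = [acquire_site(ix, iy) for ix in range(GRID) for iy in range(GRID)]
```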

The acoustic signal collected by the microphone was saved as a ‘.wav’ file, then loaded and converted to the spectral domain using MATLAB, via the Audio Toolbox ‘audioFeatureExtractor’ function. Since the microphone recorded for 3 seconds, the laser-induced signal was isolated using peak detection to find the temporal position at which the pulse was incident on the sample surface. Once the peak was found, the signal was cropped to 42.3 ms with the peak centred in the time axis, and the frequencies were split into 94 bands from 11.4 kHz to 22.1 kHz (lower frequencies were removed due to low signal-to-noise levels), with the spectrum array resized to a square array of 512 × 512 and duplicated into 3 channels to give a 512 × 512 RGB image.
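Although the spectral processing was done in MATLAB via ‘audioFeatureExtractor’, an analogous pipeline can be sketched with NumPy/SciPy. The file name, sampling rate (44.1 kHz is assumed) and spectrogram parameters below are illustrative, and the exact 94-band split of the Audio Toolbox is not reproduced.

```python
import numpy as np
from scipy.io import wavfile
from scipy.ndimage import zoom
from scipy.signal import spectrogram

rate, audio = wavfile.read("pulse_0001.wav")       # assumed mono recording
audio = audio.astype(np.float64)

# Locate the ablation click as the sample of maximum absolute amplitude,
# then crop a 42.3 ms window centred on it.
peak = int(np.argmax(np.abs(audio)))
half = int(0.0423 * rate / 2)
window = audio[max(peak - half, 0): peak + half]

# Short-time spectrum, keeping only the 11.4-22.1 kHz region used in the paper.
f, t, S = spectrogram(window, fs=rate, nperseg=256, noverlap=192)
keep = (f >= 11.4e3) & (f <= 22.1e3)
S = np.log1p(S[keep])

# Resize to 512 x 512 and duplicate into 3 channels to form an RGB "image".
S = zoom(S, (512 / S.shape[0], 512 / S.shape[1]), order=1)
rgb = np.repeat(S[..., None], 3, axis=2)
```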

2.3 Neural networks

A schematic of the application of the neural networks used in this work is presented in Fig. 2. Two CNNs with a regression output were used (purple and grey boxes in Fig. 2): one associated with plasma images and one associated with acoustic spectra images. Both neural networks consisted of 18 layers, with a data input size of 512 × 512 × 3 (the plasma or acoustic spectra images) and a regression output layer, hence giving a numerical output. The neural network architecture is drawn in more detail in Fig. 3(a), showing the decrease in the spatial dimensions of the image as it is fed through the neural network filters to give a 1 × 1 × 1 output. The training parameters for both neural networks consisted of a minibatch size of 2, an initial learn rate of 0.0001, the ADAM optimizer [53], and a learn rate drop factor of 0.1 with a drop period of 50; both were trained for 100 epochs. A total of 228 images were used in the training of each network, with testing carried out on 15 images for each neural network.
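For illustration, the sketch below shows a convolutional regression network of this kind in PyTorch. The layer count, channel widths and pooling choices are assumptions for the sketch (the authors implemented their 18-layer networks in MATLAB); the minibatch size, optimizer and initial learn rate match those stated above.

```python
import torch
import torch.nn as nn

class EnergyRegressor(nn.Module):
    """Illustrative CNN mapping a 512 x 512 x 3 image to one scalar (pulse energy)."""
    def __init__(self):
        super().__init__()
        chans = [3, 16, 32, 64, 128, 256]
        blocks = []
        for cin, cout in zip(chans[:-1], chans[1:]):
            blocks += [nn.Conv2d(cin, cout, 3, stride=2, padding=1),
                       nn.BatchNorm2d(cout),
                       nn.ReLU(inplace=True)]
        self.features = nn.Sequential(*blocks)          # 512 -> 16 spatial
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1),
                                  nn.Flatten(),
                                  nn.Dropout(0.5),      # dropout layer, cf. Sec. 3.1
                                  nn.Linear(256, 1))    # single regression output

    def forward(self, x):
        return self.head(self.features(x))

model = EnergyRegressor()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)     # ADAM, initial learn rate 0.0001
loss_fn = nn.MSELoss()                                  # regression loss
pred = model(torch.randn(2, 3, 512, 512))               # minibatch size 2, as in the paper
```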

Fig. 2. Concept diagram of the application of the four neural networks used in this work, showing the use of plasma images and acoustic spectra for predicting the laser pulse energy and for predictive visualization of the appearance of the laser ablated samples.

Fig. 3. Schematic of the architectures for the (a) CNN and (b) cGAN used for this work.

Two cGANs were trained using the “pix2pix” architecture [40] for transforming images of either plasma or acoustic spectra into microscope images of ablated surfaces (blue and orange boxes in Fig. 2). The architecture of the generator part of the cGAN is shown in more detail in Fig. 3(b). The cGAN utilizes a U-Net structure, first reducing the size of the input image (plasma image or spectra) and then, unlike the regression CNN, which reduces the output to a single value, upscaling the image back to the dimensions of the input. The input to the neural network is therefore either a plasma image or an acoustic spectra image of size 512 × 512 × 3, and the output is a generated surface image of size 512 × 512 × 3, which is then compared to an actual image by a discriminator network. The generator’s U-Net architecture has a contracting path (shown in cyan) and an expansive path (shown in grey). Green arrows represent down-sampling convolutions, orange arrows represent up-sampling convolutions, and red arrows represent skip connections, through which the feature maps from the contracting path are combined with the feature maps of the corresponding layer in the expansive path. The multi-channel feature maps are represented by coloured rectangular boxes, with the dimensions of each map indicated inside the box and the number of channels below it. Both networks were trained using the same parameters: a minibatch size of 2, generator and discriminator learn rates of 2 × 10⁻⁴, an L1-to-GAN loss ratio of 100:1, and the ADAM optimizer, for 10 epochs. A total of 234 pairs of images were used in the training of each neural network, with testing using 3 plasma images and 3 acoustic spectra.
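A compact PyTorch sketch of a pix2pix-style U-Net generator is given below to make the skip-connection pattern concrete. The three-level depth and channel counts are illustrative assumptions, not the authors' exact network; see Isola et al. [40] for the full architecture.

```python
import torch
import torch.nn as nn

def down(cin, cout):   # green arrows: down-sampling convolutions
    return nn.Sequential(nn.Conv2d(cin, cout, 4, 2, 1),
                         nn.BatchNorm2d(cout), nn.LeakyReLU(0.2))

def up(cin, cout):     # orange arrows: up-sampling convolutions
    return nn.Sequential(nn.ConvTranspose2d(cin, cout, 4, 2, 1),
                         nn.BatchNorm2d(cout), nn.ReLU())

class UNetGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.d1, self.d2, self.d3 = down(3, 64), down(64, 128), down(128, 256)
        self.u3, self.u2 = up(256, 128), up(256, 64)   # inputs doubled by skips
        self.u1 = nn.Sequential(nn.ConvTranspose2d(128, 3, 4, 2, 1), nn.Tanh())

    def forward(self, x):
        s1 = self.d1(x)                                # contracting path
        s2 = self.d2(s1)
        s3 = self.d3(s2)
        y = self.u3(s3)                                # expansive path
        y = self.u2(torch.cat([y, s2], dim=1))         # red arrows: skip connection
        return self.u1(torch.cat([y, s1], dim=1))      # skip back to full size

# 512 x 512 x 3 in, 512 x 512 x 3 out; in training the generator loss would
# combine an L1 term and an adversarial term weighted 100:1, as stated above.
fake = UNetGenerator()(torch.randn(1, 3, 512, 512))
```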

No data augmentation was carried out. The neural networks were trained using MATLAB on a Microsoft Windows 10 workstation with 3× NVIDIA A4050 GPUs (20 GB each), an Intel Xeon Gold 5222 CPU @ 3.80 GHz, and 192 GB RAM.

3. Results and discussion

3.1 Pulse energy prediction

Figure 4 shows predictions (black circles) for the laser pulse energies when inputting (a) plasma images and (b) acoustic spectra into their respective neural networks. A linear best fit is also shown. The R² of the linear best fit for the energies predicted from the plasma images was 0.997, while that for the acoustic spectra was 0.957. The standard deviation for all predicted energies was 0.13 mJ when using plasma images, and 0.30 mJ when using acoustic spectra. The root-mean-square error (RMSE) between the predicted and actual energies was approximately 0.02 mJ for the plasma images and 0.05 mJ for the acoustic spectra. The RMSE was calculated as follows,

$$RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n} |A_i - P_i|^2}$$
where $A_i$ is the actual energy, $P_i$ is the predicted energy, and $n$ is the number of test data points. It is evident that the predictions of the incident laser pulse energy were more accurate when using the plasma images than when using the acoustic spectra. To help quantify the accuracy, we also use a 3-sigma method [54], a statistical approach that bounds the data points lying within three standard deviations of the mean of a normal distribution (i.e., 99.7%). This method has similarly been applied in laser-induced breakdown spectroscopy [55,56]. The lower and upper 3-sigma limits were −0.029 mJ and 0.048 mJ for the plasma images, and −0.16 mJ and 0.12 mJ for the acoustic spectra.
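As a worked illustration of these metrics, the following sketch computes the RMSE of the equation above and the 3-sigma limits for a set of predictions; the numerical values are placeholders, not data from the paper.

```python
import numpy as np

# Placeholder test-set values; substitute the network outputs in practice.
actual = np.array([0.30, 0.50, 0.70, 0.90])
predicted = np.array([0.31, 0.48, 0.72, 0.88])

residuals = predicted - actual
rmse = np.sqrt(np.mean(np.abs(residuals) ** 2))   # the RMSE equation above

# 3-sigma limits: mean residual +/- three standard deviations, bounding
# ~99.7% of errors under a normal-distribution assumption.
mu, sigma = residuals.mean(), residuals.std()
lower, upper = mu - 3 * sigma, mu + 3 * sigma
print(f"RMSE = {rmse:.3f} mJ, 3-sigma limits = [{lower:.3f}, {upper:.3f}] mJ")
```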

Fig. 4. Actual and predicted pulse energy for test images associated with (a) plasma images and (b) acoustic spectra.

By passing images through the dropout layer of each neural network, it is possible to identify which features in the images are most strongly activated. Figure 5 shows the channel with the maximum activation for one image at each of the pulse energies tested in Fig. 4, for (a) plasma images and (b) acoustic spectra, whereby the activation map (shown with a jet colourmap) is overlaid onto the image at 50% alpha. The features of the plasma plume in Fig. 5(a) near the surface and along the edge of the plume are most strongly activated in predicting the output. In addition, it appears that the structure of the plume along the laser axis is also considered by the network, since the activation region extends further along the plume with increasing pulse energy. For the acoustic spectra in Fig. 5(b), the activation of different bands depends strongly on the pulse energy, with the lowest pulse energy image (0.3 mJ) having stronger activations at lower frequencies and the highest pulse energy image (0.9 mJ) having stronger activations at higher frequencies.
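One way to produce such overlays is sketched below in PyTorch and Matplotlib, assuming `model` is the regression network sketched earlier and `image` is a 3 × 512 × 512 float tensor: hook the final feature maps, select the most strongly activated channel, and blend it over the input with a jet colourmap at 50% alpha. The MATLAB activation workflow used by the authors would differ in detail.

```python
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt

model.eval()
acts = {}
hook = model.features.register_forward_hook(
    lambda m, i, o: acts.__setitem__("maps", o.detach()))
model(image.unsqueeze(0))                        # forward pass to capture maps
hook.remove()

maps = acts["maps"][0]                           # C x H x W feature maps
strongest = maps[maps.sum(dim=(1, 2)).argmax()]  # most strongly activated channel
amap = F.interpolate(strongest[None, None], size=(512, 512), mode="bilinear")[0, 0]
amap = (amap - amap.min()) / (amap.max() - amap.min() + 1e-8)  # normalize 0-1

plt.imshow(image.permute(1, 2, 0).numpy())       # base image
plt.imshow(amap.numpy(), cmap="jet", alpha=0.5)  # jet colourmap, 50% alpha
plt.axis("off")
plt.show()
```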

Fig. 5. Examples of activation intensity for (a) plasma images and (b) acoustic spectra sent through their respective neural network dropout layer.

3.2 Ablated surface image generation

Figure 6 shows the results of predicting the appearance of the surface after the pulse has been incident, directly from (a) side-view plasma images and (b) acoustic spectra. The neural network is shown to be capable of reconstructing the approximate shape of the surface structure, with larger holes generated for larger pulse energies, in agreement with the actual (i.e., experimentally collected) images. The structural similarity index measure (SSIM), which quantifies the similarity between two images (taking a value of 1 if the images are identical and 0 if there is no similarity), was calculated for the generated images compared with the actual ablated images. The SSIM for the 0.36 mJ, 0.68 mJ and 0.97 mJ images generated from the side-view plasma was 0.94, 0.94 and 0.92, respectively. For the images generated from the acoustic spectra, the SSIM was 0.92, 0.90 and 0.90 for pulse energies of 0.36 mJ, 0.68 mJ and 0.97 mJ, respectively. It is evident that the plasma becomes larger with increasing pulse energy, and its shape also changes. The acoustic spectra also vary between energies, with the most intense frequencies differing across all three spectra in Fig. 6(b). It is also clear that, for 0.68 mJ, the reconstruction from the acoustic spectra in (b) is not as accurate as that for the other pulse energies.
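For reference, SSIM can be computed with scikit-image (version 0.19 or later for the `channel_axis` argument); the random arrays below are placeholders for a generated/actual image pair.

```python
import numpy as np
from skimage.metrics import structural_similarity

# Placeholders standing in for a 512 x 512 x 3 generated/actual image pair.
actual = np.random.randint(0, 256, (512, 512, 3), dtype=np.uint8)
generated = np.random.randint(0, 256, (512, 512, 3), dtype=np.uint8)

# SSIM is 1.0 for identical images and near 0 for unrelated ones.
score = structural_similarity(actual, generated, channel_axis=2, data_range=255)
print(f"SSIM: {score:.2f}")
```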

Fig. 6. Generated images and actual images of laser ablated surface for (a) plasma images and (b) acoustic spectra from different laser pulse energies.

4. Conclusion

In conclusion, we have shown the ability of neural networks to predict the incident laser pulse energy of an ablated region directly from either images of the plasma emitted upon ablation or the associated acoustic spectra. We have also shown that neural networks have the potential to predict the appearance of ablated regions on the sample directly from plasma or acoustic data. This approach could be invaluable in the laser materials processing industry in situations where viewing the workpiece during machining is restricted due to the high brightness of the plasma created during ablation.

Funding

Engineering and Physical Sciences Research Council (EP/P027644/1, EP/T026197/1, EP/W028786/1).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are available in Ref. [57].

References

1. A. Canette and R. Briandet, “MICROSCOPY | Confocal Laser Scanning Microscopy,” in Encyclopedia of Food Microbiology (2nd Edition), C. A. Batt and M. Lou Tortorello, eds., (Academic Press, 2014), pp. 676–683.

2. A. Jeffrey et al., “Tutorial review—Applications of confocal laser scanning microscopy in in-situ mapping,” Analyst 118(1), 1–9 (1993). [CrossRef]

3. W. Denk, J. H. Strickler, and W. W. Webb, “Two-photon laser scanning fluorescence microscopy,” Science 248(4951), 73–76 (1990). [CrossRef]  

4. D. Fernandes Andrade, E. R. Pereira-Filho, and D. Amarasiriwardena, “Current trends in laser-induced breakdown spectroscopy: a tutorial review,” Appl. Spectrosc. Rev. 56(2), 98–114 (2021). [CrossRef]  

5. P. Campbell, I. D. Moore, and M. R. Pearson, “Laser spectroscopy for nuclear structure physics,” Prog. Part. Nucl. Phys. 86, 127–180 (2016). [CrossRef]

6. M. Dunne, “A high-power laser fusion facility for Europe,” Nat. Phys. 2(1), 2–5 (2006). [CrossRef]  

7. M. Flood, “Laser altimetry: From science to commercial lidar mapping,” Photogramm. Eng. Remote Sens. 67(11), 1209–1217 (2001).

8. A. Bellemare, “Continuous-wave silica-based erbium-doped fibre lasers,” Prog. Quantum Electron. 27(4), 211–266 (2003). [CrossRef]  

9. R. J. Mears and S. R. Baker, “Erbium fibre amplifiers and lasers,” Opt. Quantum Electron. 24(5), 517–538 (1992). [CrossRef]  

10. A. Wetzig, P. Herwig, J. Hauptmann, R. Baumann, P. Rauscher, M. Schlosser, T. Pinder, and C. Leyens, “Fast laser cutting of thin metal,” Procedia Manuf. 29, 369–374 (2019). [CrossRef]

11. A. N. Fuchs, M. Schoeberl, J. Tremmer, and M. F. Zaeh, “Laser cutting of carbon fiber fabrics,” Phys. Procedia 41, 372–380 (2013). [CrossRef]  

12. S. Valette, P. Steyer, L. Richard, B. Forest, C. Donnet, and E. Audouard, “Influence of femtosecond laser marking on the corrosion resistance of stainless steels,” Appl. Surf. Sci. 252(13), 4696–4701 (2006). [CrossRef]  

13. J. Diaci, D. Bračun, A. Gorkič, and J. Možina, “Rapid and flexible laser marking and engraving of tilted and curved surfaces,” Opt. Lasers Eng. 49(2), 195–199 (2011). [CrossRef]

14. C. Velotti, A. Astarita, C. Leone, S. Genna, F. M. C. Minutolo, and A. Squillace, “Laser marking of titanium coating for aerospace applications,” Procedia CIRP 41, 975–980 (2016). [CrossRef]  

15. T. Tamaki, W. Watanabe, and K. Itoh, “Laser micro-welding of transparent materials by a localized heat accumulation effect using a femtosecond fiber laser at 1558 nm,” Opt. Express 14(22), 10460–10468 (2006). [CrossRef]  

16. J. Górka, W. Suder, M. Kciuk, and S. Stano, “Assessment of the Laser Beam Welding of Galvanized Car Body Steel with an Additional Organic Protective Layer,” Materials 16(2), 670 (2023). [CrossRef]  

17. J. A. Grant-Jacob, S. J. Beecher, J. J. Prentice, D. P. Shepherd, J. I. Mackenzie, and R. W. Eason, “Pulsed laser deposition of crystalline garnet waveguides at a growth rate of 20 µm per hour,” Surf. Coat. Technol. 343(15), 7–10 (2018). [CrossRef]  

18. E. Morintale, C. Constantinescu, and M. Dinescu, “Thin films development by pulsed laser-assisted deposition,” Physics AUC 20(1), 43–56 (2010).

19. M. Sparkes and W. M. Steen, “‘Light’ industry: an overview of the impact of lasers on manufacturing,” in Advances in Laser Materials Processing, pp. 1–22 (2018).

20. Fortune Business Insights, “Industrial Lasers Market Size, Share & Industry Analysis, By Product (CO2 Laser, Solid State Laser, Diode/Excimer Laser, Fiber Laser and Others), By Application (Macro Processing, Micro Processing, Marking & Engraving), and Regional Forecast, 2019-2026.”

21. Grand View Research, “Laser Processing Market Size, Share & Trends Analysis Report By Product (Gas, Solid-state, Fiber), By Process (Material Processing, Marking & Engraving, Micro-processing), By Application, And Segment Forecasts, 2022 - 2030.”

22. Y. Kawahito, N. Matsumoto, M. Mizutani, and S. Katayama, “Characterisation of plasma induced during high power fibre laser welding of stainless steel,” Sci. Technol. Weld. Joining 13(8), 744–748 (2008). [CrossRef]  

23. J. Greses, P. A. Hilton, C. Y. Barlow, and W. M. Steen, “Plume attenuation under high power Nd: YAG laser welding,” in International Congress on Applications of Lasers & Electro-Optics (Laser Institute of America, 2002), 2002(1), p. 47727.

24. A. Bogaerts and Z. Chen, “Effect of laser parameters on laser ablation and laser-induced plasma formation: A numerical modeling investigation,” Spectrochim. Acta, Part B 60(9-10), 1280–1307 (2005). [CrossRef]  

25. C. Momma, S. Nolte, B. N. Chichkov, F. v Alvensleben, and A. Tünnermann, “Precise laser ablation with ultrashort pulses,” Appl. Surf. Sci. 109-110, 15–19 (1997). [CrossRef]  

26. S. Nolte, C. Momma, H. Jacobs, A. Tünnermann, B. N. Chichkov, B. Wellegehausen, and H. Welling, “Ablation of metals by ultrashort laser pulses,” J. Opt. Soc. Am. B 14(10), 2716–2722 (1997). [CrossRef]  

27. X. Zhao and Y. C. Shin, “Femtosecond laser ablation of aluminum in vacuum and air at high laser intensity,” Appl. Surf. Sci. 283, 94–99 (2013). [CrossRef]  

28. A. HajiRassouliha, A. J. Taberner, M. P. Nash, and P. M. F. Nielsen, “Suitability of recent hardware accelerators (DSPs, FPGAs, and GPUs) for computer vision and image processing algorithms,” Signal Process. Image Commun. 68, 101–119 (2018). [CrossRef]

29. Y. Sun, N. B. Agostini, S. Dong, and D. Kaeli, “Summarizing CPU and GPU Design Trends with Product Data,” (n.d.).

30. Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature 521(7553), 436–444 (2015). [CrossRef]  

31. A. Krizhevsky, I. Sutskever, and G. E. Hinton, “Imagenet classification with deep convolutional neural networks,” in Advances in Neural Information Processing Systems (2012), pp. 1097–1105.

32. B. Mills and J. Grant-Jacob, “Lasers that learn: the interface of laser machining and machine learning,” IET Optoelectron. 15(5), 207–224 (2021). [CrossRef]  

33. C.-S. Ho, N. Jean, C. A. Hogan, L. Blackmon, S. S. Jeffrey, M. Holodniy, N. Banaei, A. A. E. Saleh, S. Ermon, and J. A. Dionne, “Rapid identification of pathogenic bacteria using Raman spectroscopy and deep learning,” Nat. Commun. 10(1), 4927 (2019). [CrossRef]  

34. J. A. Grant-Jacob, S. Jain, Y. Xie, B. S. Mackay, M. D. T. McDonnell, M. Praeger, M. Loxham, D. J. Richardson, R. W. Eason, and B. Mills, “Fibre-optic based particle sensing via deep learning,” JPhys Photonics 1(4), 044004 (2019). [CrossRef]  

35. Y. Xie, D. J. Heath, J. A. Grant-Jacob, B. S. Mackay, M. D. T. McDonnell, M. Praeger, R. W. Eason, and B. Mills, “Deep learning for the monitoring and process control of femtosecond laser machining,” JPhys Photonics 1(3), 035002 (2019). [CrossRef]  

36. B. Mills, D. J. Heath, J. A. Grant-Jacob, Y. Xie, and R. W. Eason, “Image-based monitoring of femtosecond laser machining via a neural network,” JPhys Photonics 1(1), 015008 (2018). [CrossRef]  

37. D. Ma, P. Jiang, L. Shu, and S. Geng, “Multi-sensing signals diagnosis and CNN-based detection of porosity defect during Al alloys laser welding,” J. Manuf. Syst. 62, 334–346 (2022). [CrossRef]

38. L. Scime and J. Beuth, “A multi-scale convolutional neural network for autonomous anomaly detection and classification in a laser powder bed fusion additive manufacturing process,” Addit. Manuf. 24, 273–286 (2018). [CrossRef]

39. I. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, and Y. Bengio, “Generative Adversarial Networks,” Commun. ACM 63(11), 139–144 (2020). [CrossRef]  

40. P. Isola, J.-Y. Zhu, T. Zhou, and A. A. Efros, “Image-to-Image Translation with Conditional Adversarial Networks,” in 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (IEEE, 2017), pp. 5967–5976.

41. Y.-C. Liu, W.-C. Chiu, S.-D. Wang, and Y.-C. F. Wang, “Domain-Adaptive generative adversarial networks for sketch-to-photo inversion,” in 2017 IEEE 27th International Workshop on Machine Learning for Signal Processing (MLSP) (2017), pp. 1–6.

42. S. Salve, T. Shah, V. Ranjane, and S. Sadhukhan, “Automatization of Coloring Grayscale Images Using Convolutional Neural Network,” in 2018 Second International Conference on Inventive Communication and Computational Technologies (ICICCT) (2018), pp. 1171–1175.

43. J. A. Grant-Jacob, M. Praeger, M. Loxham, R. W. Eason, and B. Mills, “Lensless imaging of pollen grains at three-wavelengths using deep learning,” Environ. Res. Commun. 2(7), 075005 (2020). [CrossRef]  

44. B. Mills, J. A. Grant-Jacob, M. Praeger, R. W. Eason, J. Nilsson, and M. N. Zervas, “Single step phase optimisation for coherent beam combination using deep learning,” Sci. Rep. 12(1), 5188 (2022). [CrossRef]  

45. C. Liu, J. Shen, S. Hu, D. Wu, C. Zhang, and H. Yang, “Seam tracking system based on laser vision and CGAN for robotic multi-layer and multi-pass MAG welding,” Eng. Appl. Artif. Intell. 116, 105377 (2022). [CrossRef]

46. M. D. T. McDonnell, J. A. Grant-Jacob, Y. Xie, M. Praeger, B. S. Mackay, R. W. Eason, and B. Mills, “Modelling laser machining of nickel with spatially shaped three pulse sequences using deep learning,” Opt. Express 28(10), 14627–14637 (2020). [CrossRef]  

47. J. Tang, X. Geng, D. Li, Y. Shi, J. Tong, H. Xiao, and F. Peng, “Machine learning-based microstructure prediction during laser sintering of alumina,” Sci. Rep. 11(1), 10724 (2021). [CrossRef]  

48. L. Song, W. Huang, X. Han, and J. Mazumder, “Real-time composition monitoring using support vector regression of laser-induced plasma for laser additive manufacturing,” IEEE Trans. Ind. Electron. 64(1), 633–642 (2017). [CrossRef]  

49. L. Li, D. J. Brookfield, and W. M. Steen, “Plasma charge sensor for in-process, non-contact monitoring of the laser welding process,” Meas. Sci. Technol. 7(4), 615–626 (1996). [CrossRef]  

50. H. Gu and W. W. Duley, “A statistical approach to acoustic monitoring of laser welding,” J. Phys. D: Appl. Phys. 29(3), 556–560 (1996). [CrossRef]  

51. A. Sun, E. Kannatey-Asibu Jr, and M. Gartner, “Sensor systems for real-time monitoring of laser weld quality,” J. Laser Appl. 11(4), 153–168 (1999). [CrossRef]  

52. C. Stauter, P. Gérard, J. Fontaine, and T. Engel, “Laser ablation acoustical monitoring,” Appl. Surf. Sci. 109-110, 174–178 (1997). [CrossRef]  

53. D. P. Kingma and J. Ba, “Adam: A method for stochastic optimization,” arXiv, arXiv:1412.6980 (2014). [CrossRef]  

54. G. L. Long and J. D. Winefordner, “Limit of Detection: A Closer Look at the IUPAC Definition,” Anal. Chem. 55(07), 712A–724A (1983). [CrossRef]

55. A. P. Rao, P. R. Jenkins, J. D. Auxier, M. B. Shattan, and A. K. Patnaik, “Development of advanced machine learning models for analysis of plutonium surrogate optical emission spectra,” Appl. Opt. 61(7), D30–D38 (2022). [CrossRef]  

56. B. T. Manard, E. M. Wylie, and S. P. Willson, “Analysis of Rare Earth Elements in Uranium Using Handheld Laser-Induced Breakdown Spectroscopy (HH LIBS),” Appl. Spectrosc. 72(11), 1653–1660 (2018). [CrossRef]  

57. J. A. Grant-Jacob, B. Mills, and M. N. Zervas, “Dataset to support the publication ‘Acoustic and plasma sensing of laser ablation via deep learning’,” University of Southampton (2023), https://doi.org/10.5258/SOTON/D2609.

