Optica Publishing Group

Neural-network-assisted in situ processing monitoring by speckle pattern observation

Open Access

Abstract

We propose a method to monitor the progress of laser processing using laser speckle patterns. Laser grooving and percussion drilling were performed using femtosecond laser pulses. The speckle patterns from a processing point were monitored with a high-speed camera and analyzed with a deep neural network. The deep neural network enabled us to extract multiple pieces of information from the speckle pattern without the need for an analytical formulation. The trained neural network was able to predict the ablation depth with an uncertainty of 2 μm, as well as the material under processing, which will be useful for composite material processing.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Laser-based micro-structuring is a powerful tool for precision manufacturing and is considered a key element for future automated manufacturing. The main physical phenomenon behind laser-based micro-structuring is laser ablation: the removal of surface material by rapid energy injection with laser pulses [1,2]. In particular, the use of ultrashort pulses enables material removal with negligible heat degradation. As such, much research has been devoted to properly controlling and understanding this process.

An important parameter in laser ablation is the rate of material removal, or the ablation rate. The ablation rate depends on various factors, such as material properties, surface morphology, pulse energy, and the laser repetition rate [3–7]. For precision manufacturing and automated process optimization, real-time monitoring of this processing rate is of great importance. Specifically, monitoring the cumulative ablation rate, i.e., the processed depth or volume, is important for hole drilling or groove processing.

Various methods have been developed to monitor the processing status in a non-contact way, including acoustic and plasma detection [8–11]. In particular, laser interferometry is a powerful technique to quantify the progress of laser processing. In laser interferometry, a monitoring beam is divided into two paths: one for the actual monitoring, and another as a reference. The monitoring beam is projected onto a processing point, and its reflection is directed back into an interferometer. The interference signal between this beam and the reference is measured, where the signal strength depends on the relationship between the optical path lengths of the two beams. By properly tracking changes in this signal during processing, it is possible to extract real-time processed-depth information. Intensive studies have been performed to increase the precision and acquisition rate of laser interferometry setups, as well as to make the overall system convenient to install. Currently, there are two major implementations, differentiated by their choice of reference beam path: cross-interferometry and self-mixing. In cross-interferometry, a separate reference arm is placed in the interferometer, which enables the determination of absolute distance in exchange for a comparatively large installation footprint [12–16]. In self-mixing interferometry, the intact surface outside the ablation region is used as the reference path [17–19]. Although the self-mixing method does not require a reference arm, the net interference oscillations must be counted to quantify the total ablated depth, limiting its use to situations with gradual depth changes, such as percussion drilling.

As an alternative method to optically monitor the laser processing status, we focus on utilizing the spatial coherence of a probe light source instead of only the temporal coherence as in the aforementioned interferometry techniques. When spatially coherent light is scattered by a rough surface and projected onto a screen, the scattered light on the screen exhibits a highly irregular intensity profile, known as a laser speckle pattern. The speckle pattern arises from interference of wavefronts scattered at different positions of the surface. As such, the speckle pattern carries a trove of information on the material surface condition. It has been used for various measurement and diagnostic applications, such as in surface roughness measurements [20], flow measurements [21,22], and strain measurements [23]. Furthermore, techniques for three-dimensional reconstruction from the speckle pattern have been developed, realizing sub-millimeter precision [24,25]. As speckle-pattern analysis only requires illumination and an observation camera, it should be straightforward to implement into laser processing setups. Moreover, the two-dimensional nature of the speckle pattern observation should provide additional information compared to conventional one-point interferometric measurements. The major challenge here then becomes extracting meaningful physical characteristics from the complex speckle pattern.
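The formation of a speckle pattern from random surface-height phase shifts, as described above, can be illustrated with a short numerical sketch. This is a toy far-field model (random phase inside a circular aperture, propagated with a Fourier transform), not the paper's actual optical configuration:

```python
import numpy as np

def speckle_pattern(n=256, seed=0):
    """Toy far-field speckle: a rough surface imparts a random phase on a
    coherent beam; the Fourier-plane intensity is the speckle pattern.
    Illustrative sketch only, not the paper's optical model."""
    rng = np.random.default_rng(seed)
    # Random surface height -> random phase across the illuminated spot
    phase = rng.uniform(0.0, 2.0 * np.pi, size=(n, n))
    # Circular aperture modeling the finite illumination spot
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    aperture = (x**2 + y**2) < (n // 8) ** 2
    field = aperture * np.exp(1j * phase)
    # Far field (Fourier plane) -> intensity is the observed speckle
    intensity = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    return intensity / intensity.max()

img = speckle_pattern()
# Fully developed speckle has intensity contrast (std/mean) close to 1
contrast = img.std() / img.mean()
```

The intensity contrast (standard deviation over mean) of such a pattern is close to 1, the classic signature of fully developed speckle from a surface rougher than the illumination wavelength.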

In this study, we demonstrated the successful extraction of multiple pieces of information from three sequential frames of speckle patterns measured in situ during laser processing. Figure 1 shows the concept of our method. We illuminate the processed surface with a coherent light source and observe the reflected speckle pattern with a high-speed camera. Traditionally, the quantification of speckle patterns required an application- and measurement-specific analytical formulation. Here, we use neural networks to practically overcome this difficulty. Deep learning is an emerging technology that maps multi-dimensional vectors to other multi-dimensional vectors with neural networks trained on large numbers of datasets [26–32]. A neural network consists of a vast number of combinations of nonlinear functions, each with a few tuning parameters. In total, there are typically hundreds of thousands of tuning parameters or more, and these parameters are adjusted in the training process to better reproduce the output labels from the input data. This technique has been applied to a wide range of fields, including image recognition, image synthesis, and natural language processing. In speckle-pattern analysis, for example, the classification and reconstruction of images from speckle patterns that have passed through scattering media or multimode fibers have been demonstrated [29,30,32]. In this study, we use a large number of experimental datasets to train a neural network to approximate the complex nonlinear functions required to extract meaningful physical values from a speckle image. We show that a deep neural network can use speckle patterns to extract multiple pieces of information about the processing status during laser processing, such as the ablated material type, depth, and volume. The simple setup should be a strong candidate for next-generation process monitoring technology.


Fig. 1. Process monitoring using laser speckle with a deep neural network.


2. Experimental setup

The experimental setup is shown in Fig. 2. We used plates of aluminum, copper, and nickel as samples. The roughness Ra of the plates before processing was measured to be typically around 1 μm. The sample was placed on a two-axis stage, which moved at a constant velocity during groove processing. Laser pulses for rapid surface engraving were provided by a Ti:sapphire regenerative amplifier with a wavelength of 800 nm, a pulse duration of 35 fs, and a repetition rate of 1 kHz. The pulse energy of the laser pulses was adjusted by a motor-driven half-wave plate and a polarizing beam splitter. The laser pulses were then focused onto the sample surface through a lens with a focal length of 150 mm; the spot size on the target was approximately 15 μm. A mechanical shutter was used to switch the laser irradiation on and off. A monitoring laser beam was provided by a laser diode with a wavelength of 633 nm. The monitoring beam was spatially filtered through a 50-μm pinhole, and then coaxially merged with the femtosecond laser pulses through a dichroic mirror. Its spot size on the target was 20 μm, which was 1.5 times larger than that of the processing beam. Laser speckle patterns from the target surface were recorded with a high-speed camera through a collector lens with a focal length of 50 mm. The lens was placed between the target and the high-speed camera so that the Fourier plane of the lens coincided with the detector plane. A laser line filter at a wavelength of 632.8 nm was inserted between the lens and the camera. The high-speed camera recorded a speckle pattern for each femtosecond laser pulse. The camera was synchronized with a 1-kHz TTL signal from the regenerative amplifier. The camera was also synchronized with the mechanical shutter; the first nine frames were recorded without laser irradiation due to the delayed response of the mechanical shutter. The exposure time was set to 500 μs; this exposure was also synchronized with the monitoring laser diode.
The acquired image frames were sent to the computer in real time through CoaXPress cables. The pulse energies were varied from 10 to 60 μJ, and the scan speeds were varied from 1.0 to 3.5 mm/s.


Fig. 2. Experimental setup. The abbreviations are as follows: PH: pinhole, DG: delay generator, MS: mechanical shutter, DM: dichroic mirror, MC: micro-computer, HC: high-speed camera, and SPF: short pass filter.


The depth profiles of the grooves were measured using a confocal 3D microscope with a 50× objective lens. A typical depth profile is shown in Fig. 3(a), where the laser pulses were swept along the horizontal axis. The cross-section of the depth profile along the laser trace is shown in Fig. 3(b). When illuminated by the monitoring diode, the macroscopic groove shape and the microscopic groove surface roughness combine to form the total laser speckle pattern. Figure 3(c) shows typical sequential images of the observed speckle patterns. As the processing proceeds, the changes in the surface structure and the monitoring position induce changes in the laser speckle patterns. The diffraction angles for the left and right edges of the images are 30 and 60 degrees, respectively.


Fig. 3. (a) Three-dimensional depth profile of a laser-processed groove. (b) Cross-section of the depth profile. (c) Typical laser speckle images of three sequential frames.


3. Data preparation and training

The resolution of the acquired images was 4080 × 480 pixels, which was reduced to 255 × 25 pixels by box averaging and cropping. The brightness and contrast of each image were normalized between 0 and 1. Due to this normalization, only the spatial patterns of the images were taken into account in the following analysis. The neural network consists of multiple convolutional neural network blocks, as depicted in Fig. 4. The rectified linear unit was used as the activation function. We used TensorFlow 2 as the deep learning framework. Three sequential image frames were used as input data, and the measured value and material vector were used as label data. Here, three sequential image frames refer to the consecutive (n−2)th, (n−1)th, and nth image frames, where the measured value of the nth frame is used as the label data (see Appendix Fig. 8). We found that using multiple sequential frames as input improved the validation loss. The material vector was represented as a one-hot vector of material species. The loss function was composed of the root-mean-square error of the ablated depth and the cross-entropy of the material vector. The value ranges of the label data used in training were 0–35 μm for aluminum, 0–20 μm for copper, and 0–8 μm for nickel. The differences in the value ranges for each material are due to differences in the ablation depths for the same processing parameters. The neural network was trained with 13,000 datasets that included all three materials. The batch size was 256, and the Adam algorithm was used for optimization.
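The image reduction described above can be sketched as follows. The 16× binning factor and the centered crop are assumptions; the text states only the input and output resolutions:

```python
import numpy as np

def preprocess(frame, bin_factor=16, out_shape=(255, 25)):
    """Reduce a raw 4080x480 camera frame to 255x25 by box averaging,
    then min-max normalize to [0, 1] so only the spatial pattern remains.
    The bin factor and crop placement are assumptions."""
    h, w = frame.shape
    bh, bw = h // bin_factor, w // bin_factor
    # Box averaging: mean over non-overlapping bin_factor x bin_factor blocks
    binned = frame[:bh * bin_factor, :bw * bin_factor]
    binned = binned.reshape(bh, bin_factor, bw, bin_factor).mean(axis=(1, 3))
    # Crop to the target size (centered here)
    r0 = (bh - out_shape[0]) // 2
    c0 = (bw - out_shape[1]) // 2
    crop = binned[r0:r0 + out_shape[0], c0:c0 + out_shape[1]]
    # Normalize brightness and contrast between 0 and 1
    return (crop - crop.min()) / (crop.max() - crop.min())

# Synthetic 12-bit camera frame standing in for a real acquisition
raw = np.random.default_rng(1).uniform(0, 4095, size=(4080, 480))
img = preprocess(raw)
```

With the 16× factor, 4080 × 480 bins down to 255 × 30, after which the width is cropped to 25; the normalization discards absolute intensity, matching the statement that only spatial patterns enter the analysis.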


Fig. 4. Structure of the neural network. CNN: Convolutional neural network. ReLU: Rectified linear unit. FC: Fully connected network. K: Kernel size. S: Stride size.

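The two-part loss described in Section 3 can be sketched numerically; the equal 1:1 weighting of the depth and material terms is an assumption, as the paper does not state the relative weights:

```python
import numpy as np

def combined_loss(depth_pred, depth_true, logits_pred, onehot_true):
    """Sketch of the two-part loss: root-mean-square error on the ablated
    depth plus softmax cross-entropy on the one-hot material vector.
    The equal weighting of the two terms is an assumption."""
    rmse = np.sqrt(np.mean((depth_pred - depth_true) ** 2))
    # Numerically stable log-softmax over the material logits
    z = logits_pred - logits_pred.max(axis=1, keepdims=True)
    log_softmax = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    cross_entropy = -np.mean(np.sum(onehot_true * log_softmax, axis=1))
    return rmse + cross_entropy

# Example: perfect depth prediction and confident correct classification
loss = combined_loss(
    np.array([5.0, 10.0]), np.array([5.0, 10.0]),
    np.array([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0]]),
    np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]),
)
```

For perfect predictions both terms approach zero; a depth error in micrometers and a misclassification each push the loss up, so a single optimizer (Adam, in the paper) can train both heads at once.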

4. Results

Upon input of three sequential images, the trained neural network predicts the ablation depth and the target material. Figure 5 shows the validation results of the trained neural network. Validation data were obtained separately from the training data. The 32–34 combinations of pulse energy and scan speed for the three materials resulted in 98 datasets. Each dataset consisted of 10 input data, which were obtained with the same processing parameters and associated with the corresponding label data. In total, 980 datasets were used for the validation. The ablation depth estimated by the neural network is plotted as a function of the experimentally measured ablation depth. Good correspondence was found between estimated and measured values, with less than 2 μm uncertainty for more than 80% of the data points. Most of the datasets showed less than 1 μm uncertainty below an ablation depth of 5 μm. Two factors can contribute to the discrepancy between the measured and predicted data. First, the laser-induced surface roughness, as seen in Figs. 3(a)-3(b), creates an ambiguity in depth determination; this uncertainty was in the range of a few micrometers. Second, deformation of the groove shape occurred during laser processing, possibly due to debris redeposition and re-solidification. This factor is negligible for copper, whereas it is more pronounced for aluminum. We note that the prediction took 0.09 ms on a standard commercially available GPU; this suggests that, with proper configuration of the camera data-acquisition board and GPU memory, real-time feedback of the laser processing should be possible. Note that the speckle pattern itself is determined by the geometric structure of the material's surface and does not depend on the physical properties of the material. Thus our method can be applied to any material of interest.
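The 2 μm criterion above amounts to counting the fraction of validation points whose prediction error falls within tolerance; a minimal sketch with illustrative numbers (not the paper's data):

```python
import numpy as np

def within_tolerance(pred, meas, tol=2.0):
    """Fraction of points with |predicted - measured| ablation depth
    within tol micrometers. The numbers below are illustrative only."""
    pred = np.asarray(pred, dtype=float)
    meas = np.asarray(meas, dtype=float)
    return np.mean(np.abs(pred - meas) <= tol)

# Hypothetical predicted vs. measured depths in micrometers
pred = np.array([4.8, 10.5, 21.0, 33.0])
meas = np.array([5.0, 12.0, 20.0, 28.0])
frac = within_tolerance(pred, meas)
```

In the paper's validation, this fraction exceeds 0.8 at tol = 2 μm.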


Fig. 5. Predicted ablation depth vs. measured ablation depth for aluminum, copper, and nickel.


In addition, the speckle patterns can simultaneously predict the material type being processed. Figure 6 shows the predicted logits of the materials, where the logit is defined as log p − log(1 − p), with p the probability of the input belonging to that material. A high logit value indicates a high probability of the input being a certain material, and vice versa for a low value. The horizontal axis indicates the indices of the validation datasets, where the correct material type for a particular index range is indicated by the overlaid horizontal arrows. As shown in the figure, the predicted material vector agrees well with the material under processing. The nearly 10 dB separation between the most probable material and the others shows that we can accurately predict the material under processing. The good material separation across various ablation depths suggests that each groove has a material-specific surface morphology, that this information is imprinted in the speckle pattern, and that the deep neural network can successfully extract it.
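The logit definition used in the text is straightforward to compute; a minimal sketch:

```python
import numpy as np

def logit(p):
    """Logit as defined in the text: log p - log(1 - p), where p is the
    probability of the input belonging to a given material."""
    p = np.asarray(p, dtype=float)
    return np.log(p) - np.log(1.0 - p)

# p = 0.5 gives logit 0 (undecided); p -> 1 gives large positive values,
# p -> 0 gives large negative values
vals = logit([0.5, 0.9, 0.999])
```

The logit maps probabilities from (0, 1) onto the whole real line, which is why a large gap between the top material's logit and the others corresponds to a near-certain classification.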


Fig. 6. (a) Logits magnitude for various input frames for aluminum, copper, and nickel.


Our method can also be applied to hole drilling. Figure 7 shows the predicted ablation volume as a function of the measured ablation volume in percussion drilling. In this experiment, we used 500-μm-thick polished silicon plates as the target material. The same experimental setup was used, without moving the stage during laser irradiation. Holes were drilled with various combinations of pulse energy and number of pulses. The number of pulses ranged from 50 to 300 with a step size of 50, controlled by the mechanical shutter. The pulse energies were varied from 1 to 10 μJ. The ablated volumes of the drilled holes were measured using the 3D confocal microscope. Here, we used the ablation volume as the prediction target instead of the ablation depth because the measurement uncertainty of the percussion-drilled depth is rather large for silicon. We prepared the label data by interpolating the ablated volume as a function of pulse energy and number of irradiated pulses. The image frames for training and validation were prepared independently. Almost 100,000 datasets were used for training and 1,000 datasets for validation. The good correspondence between the predicted and actual values suggests that our method is also applicable to drilling. The discrepancy between actual and predicted ablation volumes above 500 mm³ is likely due to the experimental configuration: in this percussion drilling experiment, part of the light scattered at the bottom of deeper holes could not reach the sensor plane of the high-speed camera (see Fig. 2). The discrepancy at larger ablation volumes would be suppressed if the speckle pattern were measured at a smaller diffraction angle.
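The label interpolation over the (pulse energy, pulse count) grid can be sketched with bilinear interpolation. The paper does not specify the interpolation scheme, and the grid values below are toy data, not measurements:

```python
import numpy as np

def interpolate_volume(energies, n_pulses, volumes, e_query, n_query):
    """Bilinear interpolation of measured ablated volume over the
    (pulse energy, pulse count) grid, to generate label data for
    intermediate processing conditions. The scheme is an assumption."""
    # Locate the grid cell containing the query point
    i = np.clip(np.searchsorted(energies, e_query) - 1, 0, len(energies) - 2)
    j = np.clip(np.searchsorted(n_pulses, n_query) - 1, 0, len(n_pulses) - 2)
    # Fractional position inside the cell
    te = (e_query - energies[i]) / (energies[i + 1] - energies[i])
    tn = (n_query - n_pulses[j]) / (n_pulses[j + 1] - n_pulses[j])
    # Weighted mean of the four surrounding grid points
    return ((1 - te) * (1 - tn) * volumes[i, j]
            + te * (1 - tn) * volumes[i + 1, j]
            + (1 - te) * tn * volumes[i, j + 1]
            + te * tn * volumes[i + 1, j + 1])

# Toy grid: pulse energies in uJ, pulse counts 50..300 as in the experiment;
# the volumes array is a placeholder (volume proportional to energy x pulses)
energies = np.array([1.0, 5.0, 10.0])
pulses = np.array([50.0, 100.0, 150.0, 200.0, 250.0, 300.0])
vol = np.outer(energies, pulses)
v = interpolate_volume(energies, pulses, vol, 3.0, 75.0)
```

For the toy bilinear surface above, interpolation at (3 μJ, 75 pulses) reproduces the product exactly; real measured grids would of course deviate from this idealized form.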


Fig. 7. Deep-neural-network based prediction of ablation volume from laser speckle pattern during percussion drilling on a silicon substrate.


5. Conclusion

We developed a method to monitor the progress of laser processing by in situ speckle-pattern analysis using a neural network. The neural network can predict the ablation depth of a groove with an uncertainty of 2 μm. Our method is easy to install compared to traditional interferometric monitoring techniques. We also show that, because the speckle pattern encodes information from the full illuminated surface, the method can predict other properties of the processed material as well, such as the material type being processed. This would be especially useful for composite material processing. Moreover, the prediction time of less than 0.1 ms is promising for real-time feedback in complex processing procedures. Altogether, the simplicity, versatility, and overall accuracy of the method should make it a strong candidate for future integrated monitoring systems.

Appendix


Fig. 8. Typical input data after the preparation procedure described in Section 3. N represents the frame number during a data acquisition process. Copper was used as a target material.


Funding

New Energy and Industrial Technology Development Organization (TACMI Project).

Disclosures

The authors declare no conflicts of interest.

References

1. B. N. Chichkov, C. Momma, S. Nolte, F. von Alvensleben, and A. Tunnermann, “Femtosecond, picosecond and nanosecond laser ablation of solids,” Appl. Phys. A 63(2), 109–115 (1996). [CrossRef]  

2. B. Y. Mueller and B. Rethfeld, “Nonequilibrium electron–phonon coupling after ultrashort laser excitation of gold,” Appl. Surf. Sci. 302, 24–28 (2014). [CrossRef]  

3. S. Nolte, C. Momma, H. Jacobs, A. Tunnermann, B. N. Chichkov, B. Wellegehausen, and H. Welling, “Ablation of metals by ultrashort laser pulses,” J. Opt. Soc. Am. B 14(10), 2716–2722 (1997). [CrossRef]  

4. M. Hashida, A. F. Semerok, O. Gobert, G. Petite, Y. Izawa, and J. F. Wagner, “Ablation threshold dependence on pulse duration for copper,” Appl. Surf. Sci. 197-198, 862–867 (2002). [CrossRef]  

5. A. Ancona, F. Röser, K. Rademaker, J. Limpert, S. Nolte, and A. Tunnermann, “High speed laser drilling of metals using a high repetition rate, high average power ultrafast fiber CPA system,” Opt. Express 16(12), 8958–8968 (2008). [CrossRef]  

6. S. Tani and Y. Kobayashi, “Pulse-by-pulse depth profile measurement of femtosecond laser ablation on copper,” Appl. Phys. A 124(3), 265 (2018). [CrossRef]  

7. H. Mustafa, D. T. A. Matthews, and G. R. B. E. Römer, “Influence of the pulse duration at near-infrared wavelengths on the laser-induced material removal of hot-dipped galvanized steel,” J. Laser Appl. 32(2), 022015 (2020). [CrossRef]  

8. C. E. Yeack, R. L. Melcher, and H. E. Klauser, “Transient photoacoustic monitoring of pulsed laser drilling,” Appl. Phys. Lett. 41(11), 1043–1044 (1982). [CrossRef]  

9. A. Sun, E. Kannatey-Asibu Jr, and M. Gartner, “Sensor systems for real-time monitoring of laser weld quality,” J. Laser Appl. 11(4), 153–168 (1999). [CrossRef]  

10. A. Stournaras and G. Chryssolouris, “On acoustic emissions in percussion laser drilling,” J. Opt. Soc. Am. B 46(5-8), 611–620 (2010). [CrossRef]  

11. A. Stournaras, K. Salonitis, and G. Chryssolouris, “Optical emissions for monitoring of the percussion laser drilling process,” J. Opt. Soc. Am. B 46(5-8), 589–603 (2010). [CrossRef]  

12. P. J. L. Webster, L. G. Wright, K. D. Mortimer, B. Y. Leung, J. X. Z. Yu, and J. M. Fraser, “Automatic real-time guidance of laser machining with inline coherent imaging,” J. Laser Appl. 23(2), 022001 (2011). [CrossRef]  

13. P. J. L. Webster, L. G. Wright, Y. Ji, C. M. Galbraith, A. W. Kinross, C. Van Vlack, and J. M. Fraser, “Automatic laser welding and milling with in situ inline coherent imaging,” Opt. Lett. 39(21), 6217–6220 (2014). [CrossRef]  

14. J. J. Blecher, C. M. Galbraith, C. Van Vlack, T. A. Palmer, J. M. Fraser, P. J. L. Webster, and T. DebRoy, “Real time monitoring of laser beam welding keyhole depth by laser interferometry,” Sci. Technol. Weld. Joining 19(7), 560–564 (2014). [CrossRef]  

15. Y. Ji, A. W. Grindal, P. J. L. Webster, and J. M. Fraser, “Real-time depth monitoring and control of laser machining through scanning beam delivery system,” J. Phys. D: Appl. Phys. 48(15), 155301 (2015). [CrossRef]  

16. C. Stadter, M. Schmoeller, L. von Rhein, and M. F. Zaeh, “Real-time prediction of quality characteristics in laser beam welding using optical coherence tomography and machine learning,” J. Laser Appl. 32(2), 022046 (2020). [CrossRef]  

17. F. P. Mezzapesa, V. Spagnolo, A. Ancona, and G. Scamarcio, “Detection of ultrafast laser ablation using quantum cascade laser-based sensing,” Appl. Phys. Lett. 101(17), 171107 (2012). [CrossRef]  

18. F. P. Mezzapesa, T. Sibillano, F. Di Niso, A. Ancona, P. M. Lugarà, M. Dabbicco, and G. Scamarcio, “Real time ablation rate measurement during high aspect-ratio hole drilling with a 120-ps fiber laser,” Opt. Express 20(1), 663–671 (2012). [CrossRef]  

19. A. G. Demir, B. Previtali, A. Magnani, A. Pesatori, and M. Norgia, “Application of self-mixing interferometry for depth monitoring in the ablation of TiN coatings,” J. Laser Appl. 27(S2), S28005 (2015). [CrossRef]  

20. D. Léger, E. Mathieu, and J. C. Perrin, “Optical Surface Roughness Determination Using Speckle Correlation Technique,” Appl. Opt. 14(4), 872–877 (1975). [CrossRef]  

21. T. D. Dudderar and P. G. Simpkins, “Laser speckle photography in a fluid medium,” Nature 270(5632), 45–47 (1977). [CrossRef]  

22. A. K. Dunn, H. Bolay, M. A. Moskowitz, and D. A. Boas, “Dynamic Imaging of Cerebral Blood Flow Using Laser Speckle,” J. Cereb. Blood Flow Metab. 21(3), 195–201 (2001). [CrossRef]  

23. I. Yamaguchi, “A laser-speckle strain gauge,” J. Phys. E: Sci. Instrum. 14(11), 1270–1273 (1981). [CrossRef]  

24. H. Yamamoto and M. Takeda, “Fourier-transform speckle profilometry: three-dimensional shape measurements of diffuse objects with large height steps and/or spatially isolated surfaces,” Appl. Opt. 33(34), 7829–7837 (1994). [CrossRef]  

25. M. Dekiff, P. Berssenbrügge, B. Kemper, C. Denz, and D. Dirksen, “Three-dimensional data acquisition by digital correlation of projected speckle patterns,” Appl. Phys. B 99(3), 449–456 (2010). [CrossRef]  

26. A. Krizhevsky, I. Sutskever, and G. E. Hinton, “ImageNet Classification with Deep Convolutional Neural Networks,” Commun. ACM 60(6), 84–90 (2017). [CrossRef]  

27. Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature 521(7553), 436–444 (2015). [CrossRef]  

28. L. Tian, Y. Xue, and Y. Li, “Deep speckle correlation: a deep learning approach toward scalable imaging through scattering media,” Optica 5(10), 1181–1190 (2018). [CrossRef]  

29. A. Turpin, I. Vishniakou, and J. D. Seelig, “Light scattering control in transmission and reflection with neural networks,” Opt. Express 26(23), 30911–30929 (2018). [CrossRef]  

30. O. L. Muskens, P. R. Wiecha, R. French, and U. Kürüm, “Deep learning enabled real time speckle recognition and hyperspectral imaging using a multimode fiber array,” Opt. Express 27(15), 20965–20979 (2019). [CrossRef]  

31. A. Ozcan, G. Barbastathis, and G. Situ, “On the use of deep learning for computational imaging,” Optica 6(8), 921–943 (2019). [CrossRef]  

32. F. Kalichman, O. Levitas, R. Jacobson, Z. Kalyzhner, and Z. Zalevsky, “Photonic human identification based on deep learning of back scattered laser speckle patterns,” Opt. Express 27(24), 36002–36010 (2019). [CrossRef]  


