
Data-driven multi-joint waveguide bending sensor based on time series neural network

Open Access

Abstract

Because of their bulky interrogation devices, traditional fiber-optic sensing systems are mainly wired and equipped only on large facilities. However, advances in neural network algorithms and flexible materials have broadened their application scenarios to bionics. In this paper, a multi-joint waveguide bending sensor based on color-dyed filters is designed to detect bending angles, directions and positions. The sensors are fabricated from soft silicone rubber by a casting method. In addition, the relevant optical properties of the sensor materials are characterized to clarify the principles of the sensor design. Time series neural networks are used to predict the bending position and angle quantitatively. The results confirm that the waveguide sensor, demodulated by the data-driven neural network algorithm, performs well and can be used in engineering applications.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Flexible sensing plays a vital role in soft robotics [1,2], fluid perception [3], human motion tracking [4] and medical devices [5]. With the development and application of flexible materials, sensors based on patch-type capacitors or resistors can no longer meet the demand for softness [6], because such sensors are usually heavy and highly rigid, which restricts or significantly affects the characteristics of soft devices [2,7]. Fiber-optic shape sensors (FOSS) were first proposed at the end of the last century [8]. Compared with conventional strain measurement, FOSSs are soft, lightweight, immune to electromagnetic interference and resistant to corrosion [9-11]. The most commercialized and widely used FOSS is based on Fiber Bragg Gratings (FBGs), which have been applied in structural health monitoring (SHM) of infrastructure, temperature monitoring and so on [12,13]. For fiber-optic sensors based on grating structures, the manufacturing process involves embedding precise gratings in optical fiber and is complex and costly. In addition, it requires corresponding signal interrogation and high-precision spectral detection equipment [14]. Bulky and expensive optical interrogation devices can be applied neither to real-time measurement nor to in-situ bending sensing in small soft bionics.

On one hand, researchers have tried to reduce the complexity of the optical circuits as well as the demodulation equipment by designing customized optical waveguides based on light intensity variations to achieve soft sensing. In 2016, Zhao et al. integrated multiple optical waveguides into a flexible robotic hand to detect the bending of fingers and the shape of the object touched by the fingertips [1]. Teeple et al. embedded soft optical waveguides in a soft robotic gripper as curvature and contact sensors for underwater use [15], but their sensors can only detect a single action. In 2020, Bai et al. designed a distributed fiber-optic sensor (DFOS) based on a color-dyed filter and realized multi-joint action detection [13]. However, they did not quantitatively demodulate the sensing data because of its complexity. On the other hand, the miniaturization of optical spectrometers has progressed continuously along three lines: miniaturized dispersive optics, tunable or arrayed narrowband filters, and Fourier-transform-based systems [16]. Waveguides and integrated optics have contributed to the rapid development of these traditional lab-on-chip methods [17]. Nevertheless, lab-on-chip systems are still costly and not mature, and thermal problems have a greater impact on such highly sensitive systems. In recent years, with the broader application and maturity of neural networks, advanced algorithms have been employed in optics to analyze and demodulate sensing data, simplifying the hardware [14]. Neural networks provide an alternative solution in optics, such as optical imaging [18-20], microscopy resolution enhancement [21,22], and multimode interference recognition [14], which significantly improves the efficiency of signal demodulation and analysis. Traditionally, sensing data are demodulated with a linear mapping between the obtained signals and the variables under test. Data-driven neural network models, however, exploit the nonlinear relationship between variables, which broadens the available sensor interrogation methods [23]. For temporal sensing data, time series neural networks can extract relevant temporal features from the collected data to build the model. In addition, abundant data can effectively reduce the influence of environmental and other factors on the model. Time series neural networks have been widely used and verified in time series anomaly detection.

In this paper, a prototype of a flexible waveguide bending sensor based on an RGB (red, green and blue) color detector is fabricated. The RGB values are interrogated by a data-driven model based on a time series neural network algorithm, which connects the two groups of nonlinearly correlated variables and replaces the bulky interrogation system. The working principles of the sensor are introduced in Section 2. Section 3 describes the manufacturing process of the waveguide, the experimental setup, and the structure of the time series neural network. Section 4 presents a preliminary test and validates the sensor properties qualitatively. After that, the flexible waveguide sensor is integrated into soft holders for modeling and testing, and the data are demodulated and analyzed quantitatively by a time series neural network. Finally, conclusions are drawn and future work is identified.

2. Working principles

The waveguide sensor in this paper is designed around color filters, realizing distributed sensing by placing differently dyed blocks in the optical waveguide [13,24]. The absorption of light by a substance includes universal and selective absorption. Universal absorption is largely independent of wavelength, weak, and almost constant: when a light beam passes through, only the intensity changes, not the color or wavelength. Conversely, under selective absorption the absorption is strong for specific wavelengths and changes dramatically with wavelength, so broad-spectrum light is filtered into colored light after passing through. For colored transparent media, the transmittance varies with the imaginary part of the complex refractive index. The absorption coefficient can be described as follows [25]:

$$\alpha = \frac{4\pi k}{\lambda}$$
which is derived from the complex refractive index:
$$\tilde{n} = n + ik$$
where $k$ is the imaginary part of the complex refractive index, which governs selective absorption, and $\lambda$ is the light wavelength. The intensity of the transmitted light can be calculated by the Beer-Lambert law [26]:
$$I = I_0 e^{-\alpha z}$$
where $I$ and $I_0$ denote the transmitted and incident light intensities and $z$ is the transmission distance. The longer the light propagates, the lower the transmitted intensity.
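As a quick numerical illustration of the two expressions above, the Python sketch below computes the absorption coefficient from an assumed imaginary refractive index and evaluates the resulting intensity decay along the waveguide; the values of k, the wavelength and the distances are illustrative assumptions, not measured properties of the dyed PDMS.

```python
# A minimal numerical sketch: absorption coefficient from the imaginary
# refractive index k, then Beer-Lambert decay over distance z.
# All numbers here are illustrative assumptions.
import numpy as np

k = 1e-7                     # imaginary part of the complex refractive index (assumed)
wavelength = 630e-9          # red light, in metres (assumed)
alpha = 4 * np.pi * k / wavelength     # absorption coefficient, 1/m

I0 = 1.0                               # incident intensity (normalized)
z = np.linspace(0.0, 0.12, 7)          # propagation distances up to 120 mm
I = I0 * np.exp(-alpha * z)            # transmitted intensity along the waveguide
print(np.round(I, 3))
```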

Colorless transparent substances generally absorb little visible light, while colored transparent substances absorb visible light strongly and selectively. When visible light propagates in a colorless transparent waveguide core, its spectrum should remain constant under ideal conditions. However, if a dyed block, for example red, is set in the waveguide, as shown in Fig. 1, part of the transmitted light is absorbed and the proportion of red light increases due to the selective absorption effect. When the waveguide is bent at the dyed block's position, the upper part of the waveguide is stretched and the lower part is compressed, which changes the distribution of light and its propagation path in the core. As marked in Fig. 1(b), when the waveguide is bent at the dyed position, a large amount of the red light entering the red-dyed block escapes from the waveguide. Moreover, some light is reflected inside the dyed block and attenuated owing to the difference in refractive index between the core and cladding materials. In addition to the transmission loss, the red-light intensity therefore decreases significantly during waveguide bending. The bending angle can thus be predicted by analyzing the content of each light color in the output with a suitable interrogation method. If multiple dyed blocks with different colors are set in a single waveguide, it is possible to determine both the bending angle and the bending position of the waveguide by observing the color variations of the output light.


Fig. 1. The propagation path of visible light in a red-dyed waveguide: (a) straight waveguide; (b) waveguide bent at the red-dyed block.


3. Material and methods

3.1 Design and manufacture

The waveguide is manufactured according to the process flow and dimensions shown in Fig. 2(a). The dyed blocks can be freely arranged in the waveguide. In our work, considering the size of the three connected drive servos in Fig. 3, the length of the waveguide is set to 120 mm. The cross-section of the waveguide core is also flexible, but several factors should be considered before scaling it. For example, a larger cross-section leads to more light attenuation, while a smaller cross-section makes the sensor hard to fabricate and prone to breakage. Here, considering the size of the LED chip, the waveguide cross-section is set to a 3.5 mm × 3.5 mm square, which can cover the LED chip and collect all of its light. In addition, the dimensions of the dyed blocks have a significant influence on the color proportions in the transmitted light. According to previous work and our experimental results, the dyed PDMS block is set to a 2 mm × 2 mm × 6 mm cuboid [13,27]. To study the influence of the DFOS dimensions on sensor performance, we measure the intensity variations of transmitted light traveling through dyed blocks of different thicknesses during waveguide bending, as shown in Fig. 2(b). The thicker the dyed block, the less light travels through it; however, larger variations during bending are better for sensing. Therefore, the thickness of the dyed block is set to 2 mm. Under other conditions, the dimensions should be reduced if the light intensity is insufficient or more dyed blocks need to be added.


Fig. 2. Waveguide design, manufacture and tests: (a) Dimensions in millimeters (left) and manufacturing process (right) of the waveguide core. (b) Normalized intensity of red-dyed PDMS blocks of different thicknesses during waveguide bending. (c) Power attenuation of the transparent waveguide during transmission.



Fig. 3. Experimental bending system for data collection.


The right part of Fig. 2(a) shows that the manufacturing of a waveguide sensor involves four steps. First, the mold for the transparent part is fabricated with a 3D printer (Form 3, Formlabs). Then, the two components of PDMS (polydimethylsiloxane, SYLGARD 184) are mixed in a 10:1 ratio and poured into the mold. After degassing in a vacuum, the PDMS is cured in an 80 °C oven for 2 hours and then demolded. Finally, the dyed PDMS is injected into the cavities of the cured PDMS and cured again in an 80 °C oven for 2 hours, which completes the fabrication of the waveguide core. The manufacturing process of the waveguide cladding is similar, with the cladding made from another type of silicone rubber (DragonSkin 10). After manufacture, we conducted a test to measure the transmission loss of the waveguide. The transmitted light is measured at 20 mm, 40 mm, 60 mm, 80 mm, 100 mm and 120 mm, respectively, and referenced to the output of a 20 mm waveguide with no dyed block. As Fig. 2(c) shows, the attenuation of light intensity in the transparent waveguide is linearly related to the waveguide length, at approximately 1.2 dB/cm.
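The ∼1.2 dB/cm figure can be reproduced from such length-sweep data by converting each transmitted power to decibels relative to the 20 mm reference and fitting a straight line; the power values in the sketch below are illustrative placeholders, not the measured data.

```python
# A minimal sketch of extracting the attenuation in dB/cm from a length sweep.
# Power values relative to the 20 mm reference are illustrative assumptions.
import numpy as np

lengths_cm = np.array([2, 4, 6, 8, 10, 12])
power = np.array([1.00, 0.58, 0.33, 0.19, 0.11, 0.063])   # assumed relative powers

loss_db = -10 * np.log10(power / power[0])                # loss relative to 20 mm reference
slope, intercept = np.polyfit(lengths_cm - lengths_cm[0], loss_db, 1)
print(f"attenuation ≈ {slope:.2f} dB/cm")
```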

3.2 Experimental setup

Figure 3 shows the structure and connection diagram of the entire experimental system, in which the three-joint waveguide is driven by three servos. The system includes a light source (WS2812B), an RGB color detector (TCS3472) and three servos (SM40BL). The visible light source WS2812B is a programmable chip integrating red, green and blue LEDs. The RGB color detector is a three-channel color sensor that detects the R, G and B values separately and communicates with the controller over the Inter-Integrated Circuit (I2C) bus. The servos, produced by FeeTech, provide high-resolution position feedback, so both the training data and the labels can be collected by this system. The servos also require an external power supply and a driver board (URT-1) for communication and operation. The whole system is controlled by a microcontroller board (Arduino Nano 33 BLE Sense). The inset at the top left of Fig. 3 shows a photo of the three-joint bending structure.
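For completeness, a hedged sketch of host-side data logging is given below. It assumes the microcontroller streams one comma-separated line of three servo angles and three RGB readings per sample over USB serial; the port name, baud rate and line format are assumptions rather than details taken from the paper.

```python
# A minimal host-side logging sketch, assuming the microcontroller prints
# "angle1,angle2,angle3,R,G,B" once per sample over USB serial (assumed format).
import csv
import serial  # pyserial

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as ser, \
        open("bending_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["angle1", "angle2", "angle3", "R", "G", "B"])
    for _ in range(15000):                        # dataset size used in the paper
        line = ser.readline().decode(errors="ignore").strip()
        fields = line.split(",")
        if len(fields) == 6:                      # skip malformed or empty lines
            writer.writerow(fields)
```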

3.3 Time series neural network

Time series neural networks have proven applicable in fields such as speech recognition [28], text generation [29] and anomaly detection [30,31]. Compared with convolutional neural networks (CNNs), a time series neural network combines the output of the previous step with the current data to extract time-dependent features [32,33]. In this paper, we use the Deep Learning Toolbox in Matlab, which trains dynamic neural networks to solve nonlinear time series problems. The nonlinear input-output network model predicts the output y(t) from the input data x(t), as shown in Fig. 4. The RGB values obtained from the color detector are fed into a two-layer model, and the bending angles of the waveguide sensor at the three positions are used as training labels and then predicted. The Matlab application provides three training methods: Levenberg-Marquardt, Bayesian Regularization and Scaled Conjugate Gradient. Among these iterative techniques, Levenberg-Marquardt runs fast, Bayesian Regularization trains slowly but generalizes better, and Scaled Conjugate Gradient is memory-efficient for large problems [34,35].
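The paper's interrogation model is built with Matlab's nonlinear input-output time series network; as a rough, framework-agnostic analogue, the PyTorch sketch below maps a short window of RGB readings to the three joint angles. The window length, hidden size and training settings are illustrative assumptions, not the authors' configuration.

```python
# A minimal sketch (not the authors' Matlab model) of a time series regression
# network mapping a sliding window of RGB readings to three joint angles.
import torch
import torch.nn as nn

class RGBAngleNet(nn.Module):
    def __init__(self, n_inputs=3, hidden=32, n_outputs=3):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_outputs)

    def forward(self, x):             # x: (batch, window, 3) RGB sequence
        out, _ = self.lstm(x)         # out: (batch, window, hidden)
        return self.head(out[:, -1])  # predict the angles from the last time step

model = RGBAngleNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Synthetic placeholder batch: RGB windows of length 10 and angle labels in degrees.
x_batch = torch.randn(16, 10, 3)
y_batch = torch.randn(16, 3)
for _ in range(5):                    # a few illustrative training steps
    opt.zero_grad()
    loss = loss_fn(model(x_batch), y_batch)
    loss.backward()
    opt.step()
```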


Fig. 4. Algorithm structure of time series neural network.


4. Results

4.1 Material property

PDMS is an organic polymeric material widely used in many fields, such as soft robotics and 3D printing [2,36,37]. It is a common material for soft optical waveguides owing to its low transmission loss for visible light [38]. In the visible spectrum, PDMS, with a refractive index of about 1.42, is used as the waveguide core, and DragonSkin 10, with a refractive index of about 1.41, as the waveguide cladding [13,39]. With this core and cladding, total internal reflection occurs because of the difference in refractive index. The transmittance of dyed PDMS is measured to quantify the absorption.

As shown in Fig. 5(a), the measuring system consists of a visible laser source, a visible spectrometer, an integrating sphere, a displacement stage and holders. The attenuator adjusts the intensity of the input light, and the integrating sphere reduces the minor errors caused by uneven distribution or beam shift of the incident light on the detector during measurement. The inset in Fig. 5(a) shows the separate PDMS blocks used for the transmittance measurements: red, green, blue, yellow and transparent, each 1 mm thick. The PDMS blocks are held on the displacement stage so that light can travel through them. Figure 5(b) shows the normalized transmittances, calculated by comparing the incident light with the transmitted light. Within the visible spectrum, light passing through a PDMS block dyed with a given color decays least at the wavelengths corresponding to that color.
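The normalization in Fig. 5(b) amounts to a wavelength-by-wavelength division of the transmitted spectrum by the incident (reference) spectrum; the sketch below illustrates this with placeholder spectra for a red-dyed block.

```python
# A minimal sketch of the transmittance normalization. The spectra here are
# placeholder arrays for a red-dyed block, not measured data.
import numpy as np

wavelengths = np.linspace(400, 700, 301)                              # nm
incident = np.ones_like(wavelengths)                                  # reference spectrum (assumed flat)
transmitted = 0.1 + 0.8 * np.exp(-((wavelengths - 630) / 40) ** 2)    # illustrative red-dyed response

transmittance = transmitted / incident        # wavelength-by-wavelength ratio
transmittance /= transmittance.max()          # normalized, as plotted in Fig. 5(b)
```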


Fig. 5. (a) Experimental setup for the measurement of dyed PDMS transmittance. (b) Dyed PDMS blocks and their normalized transmittance. (c) LED light source (WS2812B chip), its spectral power distribution, and the transmittance of DragonSkin 10.


The light source used in practice is the WS2812B, as shown in Fig. 5(c). Because of differences in supply voltage, the output intensity of the light source could differ slightly from the datasheet; therefore, its spectral power distribution is also measured with the same experimental setup, and the result is shown in Fig. 5(c). In addition, the transmittance of the waveguide cladding is measured with the WS2812B and the RGB color detector used in the practical application. This transmittance, shown in the inset of Fig. 5(c), is normalized by comparing the light intensity with and without the cladding.

To verify the mechanical repeatability of the experimental setup, a long-term repetitive experiment is carried out. The servos drive a waveguide with a single red-dyed block and a waveguide with three dyed blocks to bend between ±30°, respectively, while the color detector collects the RGB signals. As shown in Fig. 6, both groups of signals remain stable during tests lasting over 20 minutes, indicating the robustness of the system.


Fig. 6. Repetitive experiments for the waveguide sensors: (a) waveguide with a red-dyed block; (b) waveguide with RGB dyed blocks.


4.2 Qualitative analysis and validation

Before the application of the sensor, the waveguides are qualitatively tested and analyzed. Waveguides with no dyed block, with a single dyed block at the middle joint, and with multiple dyed blocks are tested under the servos' drive, respectively. The servos bend the waveguides at the dyed positions, with the bending angle varying between ±30°, as shown in Fig. 7 and Fig. 8.


Fig. 7. Waveguide bent at the middle joint: (a) bending angle driven by the servo. Waveguide bending without a dyed block: (b) RGB values; (c) RGB color ratios. Waveguide bending with a red-dyed block: (d) RGB values; (e) RGB color ratios. Waveguide with four dyed blocks: (f) RGB values; (g) RGB color ratios.



Fig. 8. Waveguide bending at joints with RGB dyed blocks: (a) bending angles; (b) RGB values; (c) RGB color ratios.


For single bending, Fig. 7(a) shows that the servo bends the clear waveguide at the middle joint between ±30°. For the waveguide without dyed blocks, the detected RGB values are shown in Fig. 7(b) and the calculated color ratios in Fig. 7(c). Owing to the power distribution of the light source shown in Fig. 5(c), the blue light intensity is about twice that of the red light and three times that of the green light. In Fig. 7(b), the light intensity decreases as the bending angle increases. Theoretically, the RGB color ratios should not change in a waveguide without dyed blocks; however, Fig. 7(c) shows a slight fluctuation of the ratios during bending. This is because the waveguide cladding reflects and absorbs the RGB colors differently during transmission and reflection at the core-cladding junction.
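The color ratios plotted in Figs. 7 and 8 can be obtained by dividing each detected channel by the total detected intensity, which makes the ratios largely insensitive to overall brightness loss; the raw counts in the sketch below are illustrative, not measured values.

```python
# A minimal sketch of the color-ratio calculation: each channel divided by the
# total detected intensity. The raw counts are illustrative assumptions.
import numpy as np

rgb = np.array([[1200, 400, 2300],      # straight waveguide (assumed counts)
                [1050, 360, 1900]])     # bent at the middle joint (assumed counts)
ratios = rgb / rgb.sum(axis=1, keepdims=True)
print(np.round(ratios, 3))              # relative R, G, B content of the output light
```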

Figure 7(d) and 7(e) show how the RGB values and ratios of a waveguide with a single red-dyed block change with the bending angle. Since only one red-dyed block is set in the waveguide, the red light intensity is greater than that of the blue and green light, as shown in Fig. 7(d). As mentioned in Section 2, bending the waveguide leads to more light leakage at the bending position and a larger reduction of the red light. Furthermore, at the bending position, blue light leaks much more than red and green because of the transmittance of the waveguide cladding. Therefore, Fig. 7(e) shows that the RGB color ratios vary as the waveguide bends at the middle joint.

Figure 7(f) and 7(g) describe the intensity changes for a waveguide with four color-dyed blocks: red, green, blue and yellow. Yellow light is composed of green and red light, and the yellow-dyed block works as a color filter: just as the red block affects red light, red and green light travel through the yellow block at a fixed ratio. The yellow-dyed block lies at the middle joint in this bending experiment. Because of the heavy transmission loss and the four absorbing dyed blocks, the curves in Fig. 7(f) are not smooth and the intensity variations during bending are small. However, when the waveguide bends, the relative color ratios in Fig. 7(g) change significantly, because bending leads to leakage of light of different colors. The three dyed blocks other than yellow do not affect the sensing results if they are not bent. However, when more than three differently colored dyed blocks are used for sensing simultaneously, the signals become complicated and the color detector needs to be upgraded to one with more wavelength channels, such as the spectral sensor ACS7341, which has 11 channels for multi-spectral sensing.

A waveguide with three dyed blocks is bent alternately by the three servos; the three lines in Fig. 8(a) represent the bending angles of the waveguide. Figure 8(b) records the variations of the RGB values from the color detector, and Fig. 8(c) shows the corresponding color ratios. As the waveguide is bent, the light intensity at different wavelengths is reduced by leakage. Figure 8(b) shows that the intensity of the red light is greater than those of the blue and green light, which is related to the waveguide material, the color and transparency of the dyed blocks, the flatness of the waveguide's end surfaces, and the light coupling method [40]. Comparing Figs. 8(a) and (b), the first bend of the waveguide occurs at the red-dyed position (Servo-1), which causes a significant drop in red; the intensities of the blue and green light also decrease because of bending leakage. Afterward, bending at the green-dyed position (Servo-2) leads to a large reduction in green, although its color ratio increases, as shown in Fig. 8(c). Finally, the blue light intensity changes only slightly during bending at the blue position (Servo-3), because the blue-dyed block is farthest from the color detector: light passing through the blue-dyed block must also travel through the green- and red-dyed blocks to reach the detector. In addition, the transmission loss over this distance further reduces the light intensity.

In addition to the trends of the curves in Figs. 7 and 8, some small fluctuations and harmonics remain despite careful manufacturing and experimentation. Krauss et al. have listed several sources of manufacturing inaccuracy, including material mixing ratios, light source and detector placement, tiny air bubbles in waveguides and the smoothness of the waveguide surface [41]. For the waveguide in this paper, the colorfulness of the dyed blocks, the stability during data collection and the quality of the light source also influence the sensing results. Fortunately, through adequate training and modeling with the data-driven neural network algorithm, the vast majority of environmental influences and inaccuracies can be identified and excluded.

4.3 Modelling and test

In this section, a single-bend waveguide sensor and a multi-joint waveguide sensor are embedded in soft holders and tested independently. During the experiments, the angles obtained from the feedback servos and the output RGB values of the sensors are collected and used for sensor modeling. Afterwards, additional data are collected and predicted with the sensing model. The predicted and actual values are compared, and their correlation coefficient ${\rho _{X,Y}}$ is calculated:

$$\rho_{X,Y} = \frac{\mathrm{cov}(X,Y)}{\sigma_X \sigma_Y}$$

where $\mathrm{cov}(X,Y)$ is the covariance of X and Y, $\sigma_X$ is the standard deviation of X, and $\sigma_Y$ is the standard deviation of Y.
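The expression above is the standard Pearson correlation coefficient; the short sketch below evaluates it directly and checks the result against numpy's built-in implementation for a pair of illustrative actual/predicted angle sequences.

```python
# A minimal check of the correlation coefficient on illustrative angle arrays.
import numpy as np

actual = np.array([0.0, 5.2, 10.1, -8.4, 21.0, -15.3])       # placeholder angles (deg)
predicted = np.array([0.3, 4.8, 9.6, -7.9, 19.7, -14.8])     # placeholder predictions

rho = np.cov(actual, predicted)[0, 1] / (actual.std(ddof=1) * predicted.std(ddof=1))
print(rho, np.corrcoef(actual, predicted)[0, 1])              # the two values agree
```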

For the single-bend waveguide sensor, Servo-2 at the middle joint is driven to a random angle within 30° in both directions. During the experiments, 15,000 data points are collected, including the bending angles (labels) and the RGB values (data). The model is then built with the time series neural network: 70% of the data are used for training with the Levenberg-Marquardt method, and the remaining 30% are used for validation and test. The resulting correlation coefficients for training, validation and test are 0.9911, 0.9784 and 0.9790, respectively. Additionally, another 2,000 data points, spanning about 100 seconds, are collected for an application test. Before this test, the waveguide sensor is dismounted from the holders and twisted to introduce some environmental disturbance. The results are shown in Fig. 9(a): the red line indicates a sequence of random bending angles of Servo-2 within 100 seconds, and the corresponding predicted values are plotted as the blue dotted line. In general, the predicted values follow the actual values well. The correlation coefficient for the application test is 0.9743, which is sufficient for engineering applications. The largest prediction errors occur at sudden changes in the bending angle, which could result from a lag in the prediction. Moreover, error statistics show that over 65.18% of the prediction errors are under 3°.
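The reported error statistic can be computed as the share of test samples whose absolute angle error falls below 3°, as in the sketch below; the actual and predicted arrays are synthetic placeholders rather than the experimental data.

```python
# A minimal sketch of the error statistic: the share of predictions whose
# absolute angle error is under 3 degrees. Arrays are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
actual = rng.uniform(-30, 30, 2000)                 # placeholder test angles (deg)
predicted = actual + rng.normal(0, 2.0, 2000)       # placeholder model predictions

errors = np.abs(predicted - actual)
share = np.mean(errors < 3.0) * 100
print(f"{share:.2f}% of prediction errors are under 3 degrees")
```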


Fig. 9. Comparison between the actual and predicted values of the sensor after modeling by time series neural networks: (a) bending angles of the waveguide sensor with a red-dyed block; (b), (c) bending angles of the waveguide sensor with three dyed blocks and its enlarged view; (d) photo of the waveguide sensor.


The multi-joint waveguide sensor is integrated in the holder, and the servos drive the three joints to bend randomly at the same time. The red, green and blue dyed blocks are deployed at joints 1 to 3, respectively. During the test, 30,000 data points are recorded while the servos operate randomly between ±20° instead of ±30° because of the higher complexity. For better prediction accuracy, the interrogation model is trained with Bayesian Regularization, even though it trains more slowly. Here, 85% of the data are used for training and 15% for testing. The resulting correlation coefficient is 0.9175 for training and 0.9073 for the test. Similarly, an application test of 2,000 data points is carried out with the same sensor, and the results are shown in Fig. 9(b). Zooming in, the errors between the predicted and actual values in Fig. 9(c) are not significant, and the correlation coefficient of the application result is 0.9016. When the correlation coefficient reaches 0.9, the sensor model can be applied in engineering [42]. Figure 9(d) shows the actual state of the waveguide sensor at the 62nd second.

5. Summary and conclusions

In this paper, a soft bending sensor based on a PDMS waveguide is manufactured. A time series neural network algorithm quantitatively models and demodulates the multi-joint waveguide bending from the RGB color signals. In the experiments, the correlation coefficient between the predicted and actual values in an additional test reaches 0.9743 for a single-joint bend and 0.9016 for a multi-joint bend. Recent research on soft bending sensors based on optical waveguides is summarized in Table 1. In comparison, we optimize the distributed waveguide sensor and demodulate the signal quantitatively with a time series neural network to detect bending angles, directions and positions at the same time.


Table 1. Comparison of optics-based stretchable bending sensors

However, unlike traditional low-loss optical fibers, the transmission loss of light in flexible waveguides directly limits the length of the sensor. Reducing the transmission loss of optical waveguides requires multidisciplinary research and cooperation: on one hand, improvements in waveguide fabrication are needed; on the other hand, flexible materials for optics must be further developed. In addition, a variable-wavelength light source and detectors with more wavelength channels could be added to collect more information, which would benefit data-based demodulation. In future research, we will optimize the sensing model and data collection method, combining neural networks and time series anomaly detection algorithms to improve the application range and the robustness of the model.

Funding

Guangdong Provincial Key R&D Program of 2021 Ocean Six Industrial Project (2021-45); Scientific Research Funding Project of Westlake University (2021WUFP017); the Startup funding of New-joined PI of Westlake University (041030150118).

Acknowledgments

The authors thank the Intelligent Micro/Nano Manufacturing Lab, the i4-FSI Lab and the PAINT Lab of Westlake University for their help.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. H. Zhao, K. O’Brien, S. Li, and R. F. Shepherd, “Optoelectronically innervated soft prosthetic hand via stretchable optical waveguides,” Sci. Robot. 1(1), eaai7529 (2016). [CrossRef]  

2. P. Polygerinos, N. Correll, S. A. Morin, B. Mosadegh, C. D. Onal, K. Petersen, M. Cianchetti, M. T. Tolley, and R. F. Shepherd, “Soft robotics: Review of fluid-driven intrinsically soft devices; manufacturing, sensing, control, and applications in human-robot interaction,” Adv. Eng. Mater. 19(12), 1700016 (2017). [CrossRef]  

3. F. E. Fish, C. M. Schreiber, K. W. Moored, G. Liu, H. Dong, and H. Bart-Smith, “Hydrodynamic performance of aquatic flapping: efficiency of underwater flight in the manta,” Aerospace 3(3), 20 (2016). [CrossRef]  

4. S. Z. Homayounfar and T. L. Andrew, “Wearable Sensors for Monitoring Human Motion: A Review on Mechanisms, Materials, and Challenges,” SLAS Technol. 25(1), 9–24 (2020). [CrossRef]  

5. M. Xie, K. Hisano, M. Zhu, T. Toyoshi, M. Pan, S. Okada, O. Tsutsumi, S. Kawamura, and C. Bowen, “Flexible Multifunctional Sensors for Wearable and Robotic Applications,” Adv. Mater. Technol. 4(3), 1800626 (2019). [CrossRef]  

6. H. Xing-Yu and G. Chuan-Fei, “Sensing mechanisms and applications of flexible pressure sensors,” Acta Phys. Sin. 69 (2020).

7. F. Pena, L. Richards, A. R. Parker Jr, A. Piazza, P. Chan, and P. Hamory, “Fiber Optic Sensing System (FOSS) technology-A new sensor paradigm for comprehensive structural monitoring and model validation throughout the vehicle life-cycle,” (2015).

8. M. Miller, K. Murphy, A. Vengsarkar, and R. Claus, “Fiber optic shape sensing for flexible structures,” in Fiber Optic Smart Structures and Skins II, (SPIE, 1990), 399–404.

9. M. Ramakrishnan, G. Rajan, Y. Semenova, and G. Farrell, “Overview of Fiber Optic Sensor Technologies for Strain/Temperature Sensing Applications in Composite Materials,” Sensors 16(1), 99 (2016). [CrossRef]  

10. I. Floris, J. M. Adam, P. A. Calderón, and S. Sales, “Fiber optic shape sensors: A comprehensive review,” Optics and Lasers in Engineering 139, 106508 (2021). [CrossRef]  

11. M. Amanzadeh, S. M. Aminossadati, M. S. Kizil, and A. D. Rakić, “Recent developments in fibre optic shape sensing,” Measurement 128, 119–137 (2018). [CrossRef]  

12. P. Lu, N. Lalam, M. Badar, B. Liu, B. T. Chorpening, M. P. Buric, and P. R. Ohodnicki, “Distributed optical fiber sensing: Review and perspective,” Appl. Phys. Rev. 6(4), 041302 (2019). [CrossRef]  

13. H. Bai, S. Li, J. Barreiros, Y. Tu, C. R. Pollock, and R. F. Shepherd, “Stretchable distributed fiber-optic sensors,” Science 370(6518), 848–852 (2020). [CrossRef]  

14. K. Sun, Z. Ding, and Z. Zhang, “Fiber directional position sensor based on multimode interference imaging and machine learning,” Appl. Opt. 59(19), 5745–5751 (2020). [CrossRef]  

15. C. B. Teeple, K. P. Becker, and R. J. Wood, “Soft curvature and contact force sensors for deep-sea grasping via soft optical waveguides,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), (IEEE, 2018), 1621–1627.

16. Z. Yang, T. Albrow-Owen, W. Cai, and T. Hasan, “Miniaturization of optical spectrometers,” Science 371(6528), eabe0722 (2021). [CrossRef]  

17. B. Gao, Z. Shi, and R. W. Boyd, “Design of flat-band superprism structures for on-chip spectroscopy,” Opt. Express 23(5), 6491–6496 (2015). [CrossRef]  

18. S. Li, M. Deng, J. Lee, A. Sinha, and G. Barbastathis, “Imaging through glass diffusers using densely connected convolutional networks,” Optica 5(7), 803–813 (2018). [CrossRef]  

19. B. Rahmani, D. Loterie, G. Konstantinou, D. Psaltis, and C. Moser, “Multimode optical fiber transmission with a deep learning network,” Light: Sci. Appl. 7(1), 69 (2018). [CrossRef]  

20. N. Borhani, E. Kakkava, C. Moser, and D. Psaltis, “Learning to see through multimode fibers,” Optica 5(8), 960–966 (2018). [CrossRef]  

21. Y. Rivenson, Z. Göröcs, H. Günaydin, Y. Zhang, H. Wang, and A. Ozcan, “Deep learning microscopy,” Optica 4(11), 1437–1443 (2017). [CrossRef]  

22. E. Nehme, L. E. Weiss, T. Michaeli, and Y. Shechtman, “Deep-STORM: super-resolution single-molecule microscopy by deep learning,” Optica 5(4), 458–464 (2018). [CrossRef]  

23. S. Arridge, P. Maass, O. Öktem, and C.-B. Schönlieb, “Solving inverse problems using data-driven models,” Acta Numerica 28, 1–174 (2019). [CrossRef]  

24. S.-W. Wang, C. Xia, X. Chen, W. Lu, M. Li, H. Wang, W. Zheng, and T. Zhang, “Concept of a high-resolution miniature spectrometer using an integrated filter array,” Opt. Lett. 32(6), 632–634 (2007). [CrossRef]  

25. G. Giusfredi, Physical Optics: Concepts, Optical Elements, and Techniques (Springer Nature, 2019).

26. T. G. Mayerhofer, S. Pahlow, and J. Popp, “The Bouguer-Beer-Lambert Law: Shining Light on the Obscure,” Chemphyschem 21(18), 2029–2046 (2020). [CrossRef]  

27. J. Guo, X. Liu, N. Jiang, A. K. Yetisen, H. Yuk, C. Yang, A. Khademhosseini, X. Zhao, and S. H. Yun, “Highly stretchable, strain sensing hydrogel optical fibers,” Adv. Mater. 28(46), 10244–10249 (2016). [CrossRef]  

28. J. Li, “Recent advances in end-to-end automatic speech recognition,” APSIPA Transactions on Signal and Information Processing 11 (2022).

29. T. Iqbal and S. Qureshi, “The survey: Text generation models in deep learning,” Journal of King Saud University – Computer and Information Sciences 34(6), 2515–2528 (2022). [CrossRef]  

30. G. Pang, C. Shen, L. Cao, and A. V. D. Hengel, “Deep learning for anomaly detection: A review,” ACM Comput. Surv. 54(2), 1–38 (2022). [CrossRef]  

31. A. Blázquez-García, A. Conde, U. Mori, and J. A. Lozano, “A review on outlier/anomaly detection in time series data,” ACM Comput. Surv. 54(3), 1–33 (2022). [CrossRef]  

32. B. W. K. Ang and C.-H. Yeow, “A Learning-Based Approach to Sensorize Soft Robots,” Soft Robotics (2022).

33. J. F. Torres, D. Hadjout, A. Sebaa, F. Martínez-Álvarez, and A. Troncoso, “Deep learning for time series forecasting: a survey,” Big Data 9(1), 3–21 (2021). [CrossRef]  

34. “Neural Net Time Series” (MathWorks, 2022), retrieved 22nd, November, 2022, https://ww2.mathworks.cn/help/deeplearning/ref/neuralnettimeseries-app.html.

35. M. I. Lourakis, “A brief description of the Levenberg-Marquardt algorithm implemented by levmar,” Foundation of Research and Technology 4, 1–6 (2005).

36. J. Feng, Y. Zheng, Q. Jiang, M. K. Włodarczyk-Biegun, S. Pearson, and A. del Campo, “Elastomeric Optical Waveguides by Extrusion Printing,” Adv. Mater. Technol. 7(10), 2101539 (2022). [CrossRef]  

37. Z. Wang, B. Zhang, W. Cui, and N. Zhou, “Freeform Fabrication of Pneumatic Soft Robots via Multi-Material Jointed Direct Ink Writing,” Macromol. Mater. Eng. 307(4), 2270015 (2022). [CrossRef]  

38. D. K. Cai, A. Neyer, R. Kuckuk, and H. M. Heise, “Optical absorption in transparent PDMS materials applied for multimode waveguides fabrication,” Opt. Mater. (Amsterdam, Neth.) 30(7), 1157–1161 (2008). [CrossRef]  

39. “SYLGARD™ 184 Silicone Elastomer” (The Dow Chemical Company, 2017), retrieved https://www.dow.com/content/dam/dcc/documents/en-us/productdatasheet/11/11-31/11-3184-sylgard-184-elastomer.pdf.

40. V. Prajzler, M. Neruda, and M. Květoň, “Flexible multimode optical elastomer waveguides,” J Mater Sci: Mater Electron 30(18), 16983–16990 (2019). [CrossRef]  

41. H. Krauss and K. Takemura, “Stretchable Optical Waveguide Sensor Capable of Two-Degree-of-Freedom Strain Sensing Mediated by a Semidivided Optical Core,” IEEE/ASME Trans. Mechatron. 27(4), 2151–2157 (2022). [CrossRef]  

42. H. Chen, W. Li, W. Cui, P. Yang, and L. Chen, “Multi-Objective Multidisciplinary Design Optimization of a Robotic Fish System,” Journal of marine science and engineering 9(5), 478 (2021). [CrossRef]  

43. W. Chen, C. Xiong, C. Liu, P. Li, and Y. Chen, “Fabrication and dynamic modeling of bidirectional bending soft actuator integrated with optical waveguide curvature sensor,” Soft robotics 6(4), 495–506 (2019). [CrossRef]  

44. A. Leber, B. Cholst, J. Sandt, N. Vogel, and M. Kolle, “Stretchable thermoplastic elastomer optical fibers for sensing of extreme deformations,” Adv. Funct. Mater. 29(5), 1802629 (2019). [CrossRef]  

45. H. Liang, Y. He, M. Chen, L. Jiang, Z. Zhang, X. Heng, L. Yang, Y. Hao, X. Wei, and J. Gan, “Self-powered stretchable mechanoluminescent optical fiber strain sensor,” Advanced Intelligent Systems 3(9), 2100035 (2021). [CrossRef]  

46. Z. Shen, Y. Zhao, H. Zhong, K. Tang, Y. Chen, Y. Xiao, J. Yi, S. Liu, and Z. Wang, “Soft origami optical-sensing actuator for underwater manipulation,” Front. Robot. AI 7, 616128 (2021). [CrossRef]  

47. T. Kim, S. Lee, T. Hong, G. Shin, T. Kim, and Y.-L. Park, “Heterogeneous sensing in a multifunctional soft sensor for human-robot interfaces,” Sci. Robot. 5(49), eabc6878 (2020). [CrossRef]  
