Restoration of longitudinal laser tomography target image from inhomogeneous medium degradation under common conditions

Open Access

Abstract

To overcome the shortcomings of the self-calibration-based target image restoration method for longitudinal laser tomography, a more general restoration method that uses backscattering medium images together with prior parameters is developed for common conditions. The system parameters are extracted by pre-calibration, and the LIDAR ratio is estimated according to the medium type. Assisted by these prior parameters, the degradation caused by inhomogeneous turbid media can be established from the backscattering medium images and then used to remove the interference of the turbid media. Simulations and experiments demonstrate that the proposed restoration method can effectively eliminate the inhomogeneous interference of turbid media and accurately recover the reflectivity distribution of targets behind inhomogeneous turbid media. Furthermore, the method works beyond the limitation of the previous method, which performs well only under localized turbid attenuations and for targets with fairly uniform reflectivity distributions.

© 2017 Optical Society of America

1. Introduction

Range-gated laser imaging [1] is a prominent technique for underwater detection [2–4], target recognition [5,6], and three-dimensional imaging [7–9]; it can effectively improve the signal-to-noise ratio (SNR) of target images by removing backscattering noise. Unfortunately, the layer-cut target images still suffer from inhomogeneous degradation when the targets are obscured by inhomogeneous medium layers, and this inhomogeneous influence is difficult to remove completely with blind image restoration algorithms such as filtering algorithms [10–14] and total variation methods [15–19].

We developed a longitudinal laser tomography (LLT) system [20,21] that captures both the target images and the backscattering images of turbid media by scanning longitudinally along the laser transmission path to form image sequences. Because the photons backscattered from turbid medium clusters usually carry abundant physical information about the media, the degraded target images can be restored from the backscattering images through an image restoration method [21]. However, our previous work extracts the prior information with the help of the target image itself, so it works well only under localized turbid distributions and for targets with fairly uniform reflectivity.

Herein, we develop a new image restoration method for the LLT system, which estimates the degradation caused by inhomogeneous turbid media from prior parameters and backscattering medium images. Because the new method extracts the prior information from pre-calibration rather than from the target image, it realizes an unsupervised restoration process and overcomes the limitation of the previous method: even if the turbid medium spreads over the whole field of view and the target reflectivity is highly variable spatially, the restoration still works properly.

2. Principles of image restoration

The schematic diagram of an LLT system is shown in Fig. 1. The transceiver system consists of a nanosecond pulsed laser and an intensified charge-coupled device (ICCD) camera. By finely tuning the gate delay of the ICCD camera, echo signals from various ranges are captured to generate image sequences that contain both the target image and the backscattering image of the turbid medium. Based on the backscattering images together with some prior parameters, the degradation of the target image is estimated for subsequent image restoration.

Fig. 1. Schematic diagram of the LLT system.

The degradation model of the degraded target image can be simplified as

$$U_d = V U + N. \qquad (1)$$
Here, $U_d$ represents the degraded target image, $U$ represents the ideal image without degradation, $V$ is the degradation matrix determined by the turbid media along the laser transmission path, and $N$ is the additive noise of the system [20]. The key point of the target image restoration is the estimation of the degradation matrix $V$.
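
As a quick illustration of Eq. (1), the sketch below simulates the element-wise degradation of an ideal image by a spatially varying attenuation matrix and the naive inversion obtained by dividing out the estimated degradation. The array names and sizes are illustrative assumptions, and the paper itself solves Eq. (1) with a variational (total variation) model rather than by direct division.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 64x64 "ideal" reflectivity image U (values in [0, 1]).
U = np.clip(rng.normal(0.5, 0.1, (64, 64)), 0.0, 1.0)

# Spatially varying degradation matrix V (two-way attenuation, 0 < V <= 1),
# here a smooth blob standing in for an inhomogeneous turbid layer.
y, x = np.mgrid[0:64, 0:64]
V = np.exp(-1.5 * np.exp(-((x - 32)**2 + (y - 32)**2) / (2 * 12**2)))

# Forward model of Eq. (1): element-wise attenuation plus additive noise.
N = rng.normal(0.0, 0.01, U.shape)
U_d = V * U + N

# Naive restoration by dividing out an estimate of V (here the true V);
# the paper instead feeds V into a TV-regularized variational model [20].
U_hat = U_d / np.maximum(V, 1e-3)

print("RMS error before restoration:", np.sqrt(np.mean((U_d - U)**2)))
print("RMS error after naive division:", np.sqrt(np.mean((U_hat - U)**2)))
```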

According to Ref. [21], when the gate range for the turbid medium is set to cover the entire layer of the turbid medium as shown in Fig. 1, the element $(i,j)$ of the degradation matrix $V$ corresponding to the target image pixel $I_{\mathrm{tar}}(i,j)$ can be simplified as

$$V(i,j)=\exp\!\left[-2S\,\rho_S(x,y)\right]. \qquad (2)$$
Here, $\rho_S$ is the reflectivity of the entire turbid medium layer, and $S$ is the Light Detection and Ranging (LIDAR) ratio [22] of the turbid medium, i.e., the extinction-to-backscattering ratio $S = K_e/\beta$, where $K_e$ and $\beta$ are the extinction coefficient and the backscattering coefficient of the turbid medium, respectively. The LIDAR ratio $S$ depends on the complex refractive index and the size distribution of the medium particles rather than on the total number density of the particles. In other words, for a given medium, the LIDAR ratio remains constant even when the medium particles disperse or gather.

Herein, the reflectivity $\rho_S$ of the turbid medium can be calibrated against a reference target with a pre-measured uniform reflectivity $\overline{\rho_0}$. Because the gray value of the medium image is proportional to the reflectivity of the turbid medium, the medium reflectivity $\rho_S$ can be obtained by pre-calibrating the transceiver system as follows: an LLT measurement in the absence of turbid media is performed with a reference target plate of uniform reflectivity $\overline{\rho_0}$ as the imaging target, where the perpendicular-incidence reflectivity $\overline{\rho_0}$ of this plate has been measured in advance. The average gray value $\overline{I_0}$ of the reference target image $I_0$ can be represented as

$$\overline{I_0} = C\,E_0\,\overline{\rho_0}/Z_0^{2}, \qquad (3)$$
where $E_0$ is the single pulse energy for the pre-calibration, $C$ is a constant determined by the system parameters, and $Z_0$ is the distance between the reference target and the camera. In the LLT system, the target distance is given by $Z_0 = c\tau_0/2$, where $c$ denotes the speed of light and $\tau_0$ denotes the gate delay of the target signal.
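
The pre-calibration priors are simple to compute; the minimal sketch below converts a gate delay into the target range via $Z_0 = c\tau_0/2$ and averages a reference image to get $\overline{I_0}$. The function names and the example delay are assumptions chosen only to illustrate the arithmetic.

```python
import numpy as np

C_LIGHT = 3.0e8  # speed of light (m/s)

def target_range_from_gate_delay(tau_0):
    """Target range Z0 = c * tau_0 / 2 for a round-trip gate delay tau_0 (s)."""
    return C_LIGHT * tau_0 / 2.0

def mean_gray(reference_image):
    """Average gray value I0_bar of the reference target image I0."""
    return float(np.mean(reference_image))

# Illustrative number: a 667 ns gate delay corresponds to roughly 100 m.
print(target_range_from_gate_delay(667e-9))  # ~100.05 m
```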

In the LLT measurements, as the gate range for the backscattering image of the turbid medium covers the entire medium layer as shown in Fig. 1, the intensity of a certain pixel (i,j) in the backscattering image can be represented as:

$$I_S(i,j) = C\,E_S\,\rho_S(x,y)/Z_S^{2}, \qquad (4)$$
where $E_S$ is the single pulse energy and $Z_S$ is the distance of the medium cluster. As shown in Fig. 1, the medium point positions $(x, y, Z_S)$ and the pixel coordinates $(i, j)$ are related by the projective transformation $i\,d_{\mathrm{pix}}/f = x/Z_S$ and $j\,d_{\mathrm{pix}}/f = y/Z_S$, where $d_{\mathrm{pix}}$ is the pixel size and $f$ is the focal length of the optical receiving system. The medium reflectivity $\rho_S$ can then be obtained from

$$\rho_S(x,y)=\frac{E_0\,Z_S^{2}\,\overline{\rho_0}}{E_S\,Z_0^{2}\,\overline{I_0}}\,I_S(i,j). \qquad (5)$$

Moreover, in Eq. (2) there is still one unknown value, the LIDAR ratio $S$, which can be obtained by pre-measurement or pre-estimated from prior information about the turbid medium. Substituting $S$ and the calibrated $\rho_S$ into Eq. (2) yields the estimated degradation matrix

$$\tilde{V}(i,j)=\exp\!\left[-2S\,\frac{E_0\,Z_S^{2}\,\overline{\rho_0}}{E_S\,Z_0^{2}\,\overline{I_0}}\,I_S(i,j)\right]. \qquad (6)$$
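
The following sketch turns Eqs. (5) and (6) into a small routine that maps a backscattering image and the pre-calibration priors into an estimated degradation matrix. All variable names and the example numbers (pulse energies, ranges, S = 19 sr) are illustrative assumptions, chosen to be consistent with the values used later in the paper.

```python
import numpy as np

def estimate_degradation_matrix(I_S, E_0, Z_0, rho_0_bar, I_0_bar,
                                E_S, Z_S, S):
    """Estimate the degradation matrix V~ from a backscattering image.

    I_S       : backscattering image of the turbid medium layer (2-D array)
    E_0, Z_0  : pulse energy and target range used in the pre-calibration
    rho_0_bar : pre-measured reflectivity of the reference target (sr^-1)
    I_0_bar   : average gray value of the reference target image
    E_S, Z_S  : pulse energy and medium-layer range of the measurement
    S         : estimated LIDAR (extinction-to-backscatter) ratio (sr)
    """
    # Eq. (5): medium reflectivity map recovered from the backscattering image.
    rho_S = (E_0 * Z_S**2 * rho_0_bar) / (E_S * Z_0**2 * I_0_bar) * I_S
    # Eq. (6): two-way attenuation caused by the whole medium layer.
    return np.exp(-2.0 * S * rho_S)

# Illustrative usage with assumed numbers.
I_S = np.full((64, 64), 20.0)          # synthetic backscattering image
V_tilde = estimate_degradation_matrix(I_S, E_0=1e-3, Z_0=100.0,
                                      rho_0_bar=0.5, I_0_bar=120.0,
                                      E_S=1e-3, Z_S=80.0, S=19.0)
print(V_tilde.min(), V_tilde.max())
```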

The entire restoration process is shown in Fig. 2.

 figure: Fig. 2

Fig. 2 Flowchart of the proposed restoration method.

Download Full Size | PDF

3. Simulations

A three-dimensional distribution of the turbid medium for the simulation is designed as shown in Fig. 3, in which a turbid medium layer is located between the target and the transceiver system; this medium layer causes the inhomogeneous degradation and also provides the backscattering images. The gated medium layer and the target layer are marked by dotted rectangles in Fig. 3.

Fig. 3. Turbid medium distribution in the simulation.

The particle size distribution of the medium particles obeys the following Deirmendjian distribution [23]:

$$n(r)\,dr = a\,r^{\mu}\exp(-b\,r^{\nu})\,dr. \qquad (7)$$
Here, $r$ is the particle radius, and $a$, $b$, $\mu$, and $\nu$ are the four tunable Deirmendjian parameters [24]. In the simulation, the four parameters are set to $a = 1.085\times10^{-2}$, $b = 1/24$, $\mu = 8$, and $\nu = 3$, and the total particle number density is $N = \int_0^{\infty} n(r)\,dr = 100\ \mathrm{cm^{-3}}$. The complex refractive index of the turbid medium is set to $m = 1.334 - 1.32\times10^{-9}i$ (the value for water droplets at a wavelength of 532 nm) [25]. With these parameters, the extinction coefficient $K_e$ and the backscattering coefficient $\beta$ are calculated as $1.128\times10^{-2}\ \mathrm{m^{-1}}$ and $5.896\times10^{-4}\ \mathrm{m^{-1}\,sr^{-1}}$, respectively.
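
As a consistency check of these numbers, the short sketch below integrates the modified gamma distribution with the stated parameters to recover the total number density, locates the mode radius, and forms the LIDAR ratio from the quoted extinction and backscattering coefficients. It assumes radii expressed in micrometers and number densities in cm^-3, which is the usual convention for these parameters.

```python
import numpy as np
from scipy.integrate import quad

# Deirmendjian (modified gamma) parameters used in the simulation.
a, b, mu, nu = 1.085e-2, 1.0 / 24.0, 8, 3

def n(r):
    """Particle size distribution n(r), with r in micrometers."""
    return a * r**mu * np.exp(-b * r**nu)

# Total number density N = integral of n(r) dr (expected ~100 cm^-3).
N, _ = quad(n, 0.0, np.inf)
print("N   =", round(N, 1), "cm^-3")

# Mode radius r_m: dn/dr = 0  =>  r_m = (mu / (b * nu))**(1/nu).
r_m = (mu / (b * nu)) ** (1.0 / nu)
print("r_m =", r_m, "um")                 # 4.0 um for these parameters

# LIDAR ratio S = K_e / beta from the quoted coefficients.
K_e, beta = 1.128e-2, 5.896e-4            # m^-1 and m^-1 sr^-1
print("S   =", round(K_e / beta, 1), "sr")  # ~19 sr
```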

3.1 Pre-estimation of LIDAR ratio and pre-calibration of system parameters

For water clouds, the LIDAR ratio has a nearly constant value close to 19 sr at 532 nm [26–29]. More specifically, five groups of Deirmendjian parameters for water clouds, including those used in the LLT simulation experiment (D2), are listed in the first four columns of Table 1, and the corresponding distributions are plotted in Fig. 4; $r_m$ denotes the mode radius, i.e., the maximum point of the distribution. Since the medium particles are water droplets, i.e., the complex refractive index is $m = 1.334 - 1.32\times10^{-9}i$, the LIDAR ratios $S$ for the five groups of Deirmendjian parameters are calculated as listed in the last column of Table 1. Despite the different mode radii $r_m$, the LIDAR ratios remain close to 19 sr. Owing to the good robustness of the exponential factors in the degradation matrix equations [21], $\tilde{S} = 19\ \mathrm{sr}$ is chosen as the estimated LIDAR ratio for establishing the degradation matrix. In general, for common turbid media, the LIDAR ratio can be estimated from the medium type and the approximate range of the median particle size.

Table 1. Five groups of Deirmendjian parameters and the corresponding LIDAR ratios S.

Fig. 4. Deirmendjian distributions for the five groups of parameters in Table 1.

In the pre-calibration in the absence of turbid media, a reference target plate with a perpendicular-incidence reflectivity of $\overline{\rho_0} = 0.5\ \mathrm{sr^{-1}}$ was considered, the single pulse energy was $E_0 = 1\ \mathrm{mJ}$, and the distance between the reference target and the transceiver system was $Z_0 = 100\ \mathrm{m}$. The reference target image $I_0$ is shown in Fig. 5, and its average gray value is denoted by $\overline{I_0}$. The pre-calibration parameters $E_0$, $Z_0$, $\overline{\rho_0}$, and $\overline{I_0}$, together with the LIDAR ratio $\tilde{S} = 19\ \mathrm{sr}$, all served as the prior parameters for the subsequent image restoration.

Fig. 5. Recovery process of the simulation images. $\tilde{V}$ represents the estimated degradation matrix; $E_0$, $Z_0$, and $\overline{\rho_0}$ represent the single pulse energy, the target distance, and the reference target reflectivity in the pre-calibration, respectively; $\overline{I_0}$ denotes the average gray value of the reference target image.

3.2 Simulation verification

The simulated images and the recovery process of the target image are depicted in Fig. 5. The degradation matrix $\tilde{V}$ is estimated from the backscattering image $I_S$ together with the prior parameters $E_0$, $Z_0$, $\overline{\rho_0}$, $\overline{I_0}$, and $\tilde{S}$, and then the recovered target image $U$ in Eq. (1) is solved through a proper variational model [20].
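
Reference [20] formulates the recovery as a variational problem; its exact model is not reproduced here, so the sketch below only shows one plausible form, a total-variation-regularized least-squares fit solved by plain gradient descent. The cost function, step size, and regularization weight are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def tv_restore(U_d, V, lam=0.05, step=0.2, iters=300, eps=1e-6):
    """Minimize ||V*U - U_d||^2 + lam * TV_eps(U) by gradient descent.

    U_d : degraded target image; V : estimated degradation matrix;
    lam : TV regularization weight; eps : smoothing of the TV norm.
    """
    U = U_d / np.maximum(V, 1e-3)          # start from the naive inversion
    for _ in range(iters):
        # Gradient of the data-fidelity term ||V*U - U_d||^2.
        grad = 2.0 * V * (V * U - U_d)
        # Gradient of the smoothed total-variation term: -div(grad U / |grad U|).
        gx = np.diff(U, axis=1, append=U[:, -1:])
        gy = np.diff(U, axis=0, append=U[-1:, :])
        mag = np.sqrt(gx**2 + gy**2 + eps)
        px, py = gx / mag, gy / mag
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        grad -= lam * div
        U -= step * grad
    return U
```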

The captured turbid-medium-free target image in Fig. 5 serves as the approximate ideal target image $I_{\mathrm{ideal}}$, and the structural similarity (SSIM) index, a real number between 0 and 1, is used for image quality assessment [30]. The proposed method improves the SSIM indexes remarkably, which indicates that it is effective in eliminating the influence of the turbid medium.
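
For reference, the SSIM index of Ref. [30] is available in common image-processing libraries; a minimal example using scikit-image is shown below. The image names are placeholders, and data_range must match the gray-level range of the images being compared.

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim

# Placeholder images standing in for I_ideal and a restored image U.
rng = np.random.default_rng(1)
I_ideal = rng.random((128, 128))
U_restored = np.clip(I_ideal + rng.normal(0, 0.05, I_ideal.shape), 0, 1)

score = ssim(I_ideal, U_restored, data_range=1.0)
print("SSIM =", round(score, 3))
```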

3.3 Homomorphic filtering recovery for comparison

As is well known, homomorphic filtering [12] is a common technique for extracting high-frequency information through a logarithmic transformation followed by frequency-domain filtering. Here, we compare the proposed method with blind digital image restoration based on homomorphic filtering. The Butterworth filter functions [31] used in the homomorphic filtering algorithm and the corresponding results for the single degraded target image (Fig. 7(b)) are shown in Figs. 6 and 7, respectively. The results indicate that homomorphic filtering can visually weaken the influence of the turbid medium to some extent. However, it also changes the relative gray values, i.e., the relative reflectivity, of the target image, which leads to lower SSIM indexes. Taking Fig. 7 as an example, homomorphic filtering eliminates the shadows (yellow circle in Fig. 7(d)) caused by the turbid medium; however, it pushes the gray values of the sub-windows (yellow, green, and red circles) to almost the same level, which is far from the true reflectivity of the targets and consequently results in lower SSIM indexes. In contrast, as shown in Fig. 7(c), the proposed method recovers the degraded area of the target image while avoiding any change of the relative reflectivity of the other parts. The simulation results indicate that the proposed method performs much better than homomorphic filtering for restoring inhomogeneously degraded target images.
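
For comparison purposes, a minimal homomorphic filtering routine of the kind described above can be written as follows; the Butterworth high-frequency-emphasis filter and its cutoff, order, and gain parameters are illustrative assumptions rather than the exact filters of Fig. 6.

```python
import numpy as np

def homomorphic_filter(img, d0=30.0, order=2, gamma_l=0.5, gamma_h=1.5):
    """Homomorphic filtering with a Butterworth high-frequency-emphasis filter.

    img : 2-D array with positive gray values.
    d0, order : Butterworth cutoff (in frequency samples) and order.
    gamma_l, gamma_h : gains applied to low and high frequencies.
    """
    rows, cols = img.shape
    log_img = np.log1p(img.astype(np.float64))       # log transform
    F = np.fft.fftshift(np.fft.fft2(log_img))        # centered spectrum

    # Distance of each frequency sample from the spectrum center.
    u = np.arange(rows) - rows / 2.0
    v = np.arange(cols) - cols / 2.0
    D = np.sqrt(u[:, None]**2 + v[None, :]**2)

    # Butterworth high-pass, rescaled to emphasize high frequencies.
    H_hp = D**(2 * order) / (D**(2 * order) + d0**(2 * order))
    H = gamma_l + (gamma_h - gamma_l) * H_hp

    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(H * F)))
    return np.expm1(filtered)                        # undo the log transform
```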

Fig. 6. Butterworth filter functions of homomorphic filtering.

Fig. 7. Recovery results of homomorphic filtering. (a) Ideal target image; (b) degraded target image; (c) recovery image by the proposed method; (d)–(f) homomorphic filtering results for the various Butterworth filter functions shown in Fig. 6.

4. Image restoration experiments and analysis

4.1 Pre-calibration measurements

The LLT experimental setup and the signal timing were similar to those in Ref. [21]. A reference target plate with uniform reflectivity $\overline{\rho_0}$ was captured by the LLT system in the absence of turbid media for pre-calibration. The perpendicular-incidence reflectivity $\overline{\rho_0}$ of the reference target plate was pre-measured with an optical power meter at various ranges according to Eq. (8):

$$P_r = \overline{\rho_0}\,\frac{A_r}{Z_r^{2}}\,P_i. \qquad (8)$$
Here, $P_i$ is the incident power, $P_r$ is the received power, and both the incident direction and the receiving direction are normal to the surface of the reference target; $A_r$ is the receiving area of the optical power meter, and $Z_r$ is the distance between the reference target and the optical power meter. The average reflectivity $\overline{\rho_0} = 0.4643\ \mathrm{sr^{-1}}$ was obtained by linearly fitting $P_r/P_i$ versus $A_r/Z_r^{2}$, as shown in Fig. 8.
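
The slope of that linear fit directly gives the reflectivity; a minimal least-squares version is sketched below with made-up measurement values, assuming the fit is forced through the origin as Eq. (8) implies.

```python
import numpy as np

# Hypothetical calibration data: A_r/Z_r^2 (sr) and the measured P_r/P_i.
x = np.array([2.0e-4, 4.0e-4, 6.0e-4, 8.0e-4, 1.0e-3])
y = 0.4643 * x + np.random.default_rng(2).normal(0, 2e-6, x.size)

# Least-squares slope through the origin: rho_0_bar = sum(x*y) / sum(x*x).
rho_0_bar = np.sum(x * y) / np.sum(x * x)
print("rho_0_bar =", round(rho_0_bar, 4), "sr^-1")
```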

Fig. 8. Linear fitting of $P_r/P_i$ versus $A_r/Z_r^{2}$.

The reference target image $I_0$ captured in the turbid-medium-free LLT experiment is shown in Fig. 9, and its average gray value is denoted by $\overline{I_0}$. The single pulse energy $E_0$ and the target distance $Z_0$ of this experiment, along with $\overline{\rho_0}$ and $\overline{I_0}$, served as the prior parameters for the subsequent image restoration.

Fig. 9. Recovery process of the first experimental image group.

4.2 Restoration results

The LLT experiments with turbid media were performed, and the captured images along with the establishment of the degradation matrix are depicted in Fig. 9.

In the LLT experiments, the turbid medium particles were water droplets sprayed by an ultrasonic nebulizer that has a vibration frequency of 1.7 MHz and converts, on average, 7 mL of water per minute into droplets by ultrasonic atomization [32]. According to Lang's formula [32], the median diameter of the water droplets is calculated as 2.92 μm; combined with the analysis in Sec. 3.1, $\tilde{S} = 19\ \mathrm{sr}$ can be chosen as the estimated LIDAR ratio. As shown in Fig. 9, the degradation matrix $\tilde{V}$ is estimated from the backscattering image $I_S$ together with the prior parameters $E_0$, $Z_0$, $\overline{\rho_0}$, $\overline{I_0}$, and $\tilde{S}$; through the proposed variational model, the retrieved target image $U$ is then solved from the estimated degradation matrix $\tilde{V}$ and the degraded target image $U_d$.
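
Lang's formula relates the median droplet diameter to the capillary wavelength of the liquid surface; the check below reproduces the 2.92 μm figure under the usual assumptions of room-temperature water properties (surface tension ≈ 0.0728 N/m, density ≈ 1000 kg/m³).

```python
import numpy as np

def lang_median_diameter(freq_hz, sigma=0.0728, rho=1000.0):
    """Lang's formula: d = 0.34 * (8*pi*sigma / (rho * f^2))**(1/3), in meters."""
    return 0.34 * (8.0 * np.pi * sigma / (rho * freq_hz**2)) ** (1.0 / 3.0)

d = lang_median_diameter(1.7e6)
print(round(d * 1e6, 2), "um")   # ~2.92 um for a 1.7 MHz nebulizer
```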

Taking the captured turbid-medium-free target images in Fig. 9 as the approximate ideal target image $I_{\mathrm{ideal}}$, the SSIM indexes are listed in the corresponding images. The proposed method improves the SSIM indexes remarkably, which indicates that it is effective in eliminating the inhomogeneous degradation caused by the turbid medium. The same procedure was applied to restore another two groups of experimental images, as shown in Figs. 10 and 11, which lead to the same conclusions.

Fig. 10. Recovery results of the second experimental image group. (a) Turbid-medium-free target image; (b) degraded target image; (c) backscattering image; (d) denoised backscattering image; (e) degradation matrix visualization; (f) retrieved target image.

Fig. 11. Recovery results of the third experimental image group. (a) Turbid-medium-free target image; (b) degraded target image; (c) backscattering image; (d) denoised backscattering image; (e) degradation matrix visualization; (f) retrieved target image.

5. Conclusion

Based on LLT and assisted by the backscattering images and prior parameters from pre-calibration, a novel and feasible image restoration method has been developed that estimates the degradation caused by inhomogeneous turbid media and then eliminates this inhomogeneous degradation through a proper total variation model. The proposed method works under common conditions and overcomes the limitation of the previous method, which performs well only under localized turbid attenuations and for fairly uniform targets. The recovery process is unsupervised, which improves its real-time capability.

The proposed restoration method is based on the physical relationship between the signals of the target layer and the turbid medium layer, rather than on a blind image restoration algorithm. Both simulations and experiments were conducted to verify its validity and feasibility. The results demonstrate that the proposed method can effectively eliminate the interference of inhomogeneous turbid media and recover the real target images.

As the proposed method can reveal the true target behind inhomogeneous turbid media, it can reduce the false recognition rate in target recognition and identification. It is potentially applicable to target acquisition for observations in air and under water, military reconnaissance, and fire rescue.

Funding

National Natural Science Foundation of China (NSFC) (Nos. 613192, 61070040, 61108089, 61205087, and 61107005).

Acknowledgment

The authors acknowledge Yanyun Ma for lending the measurement equipment for the experiments.

References and links

1. L. F. Gillespie, "Apparent illumination as a function of range in gated, laser night-viewing systems," J. Opt. Soc. Am. 56(7), 883–887 (1966).

2. G. R. Fournier, D. Bonnier, J. L. Forand, and P. W. Pace, "Range-gated underwater laser imaging system," Opt. Eng. 32(9), 2185–2190 (1993).

3. C. S. Tan, G. Seet, A. Sluzek, and D. M. He, "A novel application of range-gated underwater laser imaging system (ULIS) in near-target turbid medium," Opt. Lasers Eng. 43(9), 995–1009 (2005).

4. J. Busck, "Underwater 3D optical imaging with a gated viewing laser radar," Opt. Eng. 44(11), 116001 (2005).

5. R. G. Driggers, R. H. Vollmerhausen, N. Devitt, C. Halfort, and K. J. Barnard, "Impact of speckle on laser range-gated shortwave infrared imaging system target identification performance," Opt. Eng. 42(3), 738–746 (2003).

6. R. L. Espinola, E. L. Jacobs, C. E. Halford, R. Vollmerhausen, and D. H. Tofsted, "Modeling the target acquisition performance of active imaging systems," Opt. Express 15(7), 3816–3832 (2007).

7. P. Andersson, "Long-range three dimensional imaging using range-gated laser radar images," Opt. Eng. 45(3), 034301 (2006).

8. M. Laurenzis, F. Christnacher, and D. Monnin, "Long-range three-dimensional active imaging with superresolution depth mapping," Opt. Lett. 32(21), 3146–3148 (2007).

9. M. Laurenzis, F. Christnacher, N. Metzger, E. Bacher, and I. Zielenski, "3D range-gated imaging at infrared wavelengths with super-resolution depth mapping," Proc. SPIE 7298, 729833 (2009).

10. R. C. Gonzalez and R. E. Woods, Digital Image Processing, 3rd ed. (Prentice Hall, 2007).

11. C. Tomasi and R. Manduchi, "Bilateral filtering for gray and color images," in Proceedings of IEEE International Conference on Computer Vision (IEEE, 1998), pp. 839–846.

12. A. V. Oppenheim, R. W. Schafer, and T. G. Stockham, "Nonlinear filtering of multiplied and convolved signals," Proc. IEEE 56(8), 1264–1291 (1968).

13. F. M. Martin, M. E. Munoz, and L. C. Alberola, "A speckle removal filter based on anisotropic Wiener filtering and the Rice distribution," in Proceedings of IEEE Ultrasonics Symposium (IEEE, 2006), pp. 1694–1697.

14. D. G. Lainiotis, P. Papaparaskeva, and K. Plataniotis, "Nonlinear filtering for LIDAR signal processing," Math. Probl. Eng. 2(5), 367–392 (1996).

15. L. Rudin, S. Osher, and E. Fatemi, "Nonlinear total variation based noise removal algorithms," Physica D 60(1–4), 259–268 (1992).

16. A. Chambolle, "An algorithm for total variation minimization and applications," J. Math. Imaging Vis. 20(1/2), 89–97 (2004).

17. T. F. Chan and S. Esedoglu, "Aspects of total variation regularized L1 function approximation," SIAM J. Appl. Math. 65(5), 1817–1837 (2005).

18. J. F. Yang, Y. Zhang, and W. T. Yin, "An efficient TVL1 algorithm for deblurring multichannel images corrupted by impulsive noise," SIAM J. Sci. Comput. 31(4), 2842–2865 (2009).

19. Z. M. Jin and X. P. Yang, "A variational model to remove the multiplicative noise in ultrasound images," J. Math. Imaging Vis. 39(1), 62–74 (2011).

20. W. J. Yi, W. Hu, P. Wang, and X. J. Li, "Image restoration method for longitudinal laser tomography based on degradation matrix estimation," Appl. Opt. 55(20), 5432–5438 (2016).

21. W. Yi, H. Liu, P. Wang, M. Fu, J. Tan, and X. Li, "Reconstruction of target image from inhomogeneous degradations through backscattering medium images using self-calibration," Opt. Express 25(7), 7392–7401 (2017).

22. F. G. Fernald, "Analysis of atmospheric lidar observations: some comments," Appl. Opt. 23(5), 652–653 (1984).

23. D. Deirmendjian, Electromagnetic Scattering on Spherical Polydispersions (American Elsevier, 1969).

24. P. X. Sheng, Atmospheric Physics, 2nd ed. (Peking University Press, 2013).

25. G. M. Hale and M. R. Querry, "Optical constants of water in the 200-nm to 200-μm wavelength region," Appl. Opt. 12(3), 555–563 (1973).

26. R. G. Pinnick, S. G. Jennings, P. Chylek, C. Ham, and W. T. Grandy, Jr., "Backscatter and extinction in water clouds," J. Geophys. Res. 88(C11), 6787–6796 (1983).

27. E. J. O'Connor, A. J. Illingworth, and R. J. Hogan, "A technique for autocalibration of cloud lidar," J. Atmos. Ocean. Technol. 21(5), 777–786 (2004).

28. Y. X. Hu, "Depolarization ratio-effective lidar ratio relation: Theoretical basis for space lidar cloud phase discrimination," Geophys. Res. Lett. 34(11), L11812 (2007).

29. J. H. Churnside, J. M. Sullivan, and M. S. Twardowski, "Lidar extinction-to-backscatter ratio of the ocean," Opt. Express 22(15), 18698–18706 (2014).

30. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, "Image quality assessment: from error visibility to structural similarity," IEEE Trans. Image Process. 13(4), 600–612 (2004).

31. H. G. Adelmann, "Butterworth equations for homomorphic filtering of images," Comput. Biol. Med. 28(2), 169–181 (1998).

32. R. J. Lang, "Ultrasonic atomization of liquids," J. Acoust. Soc. Am. 34(1), 6–8 (1962).
