Optica Publishing Group

Polarization-based exploration for clear underwater vision in natural illumination

Open Access

Abstract

Underwater imaging provides images friendly to the human visual system; however, it often suffers from severe image degradation. This work develops an underwater polarization imaging model that accounts for both the scattering and the absorption of light in water. It exploits the polarization difference of the target scene: the backscattered light is partially polarized while the target light is unpolarized. The backscattered light is first estimated and removed. The distance information of the target scene is then derived from the polarization information and used to build a distance-based Lambertian model, which enables estimation of the intensity loss caused by water absorption and accurate recovery of the target radiance. Real-world experiments show that the developed model handles underwater image degradation well. In particular, it enables effective correction of the color cast caused by water absorption, which traditional imaging methods tend to ignore.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Underwater imaging is of crucial importance to both scientific research and marine rescue, benefitting from its ability to directly provide visual feedback on underwater targets [1,2]. However, due to strong absorption and scattering during light propagation in water, underwater images always suffer from poor visual quality and low contrast [3–5]. Absorption removes light from the path permanently and causes light energy loss. Scattering redirects the direction of light propagation and brings undesired light into the optical path, leading to “veiled” images [6,7]. Many researchers tackle the problem of light energy loss by employing artificial illumination, with good results [8,9]. But in waters where natural light dominates the illumination, artificial illumination complicates the imaging conditions; active imaging is then no longer the optimal choice, and a passive imaging strategy is preferred. Y. Y. Schechner developed a passive method for imaging in neritic areas and obtained positive results. Figure 1(a) displays one of his raw imaging results in the Mediterranean [10]. It is an underwater image characterized by an overall blue appearance and reduced contrast, which is undesirable in any underwater imaging application. This issue mainly results from the wavelength-dependent absorption of light in water together with light scattering. In nature, waters can be greenish or of other colors due to different kinds of suspended particles, but in this study we take the bluish color cast as a representative example. In the visible-light range, light at longer wavelengths is strongly absorbed, while blue light remains and is strongly scattered, as demonstrated in Fig. 1(b). Previous studies handle the color cast problem mainly with traditional image processing techniques such as high-pass filtering and histogram equalization [11,12]; for example, white balancing is employed by Y. Y. Schechner in [13]. These methods consider only the appearance of the images and apply the same processing to all of them without considering the imaging mechanism, even though underwater image degradation results from water scattering and absorption, which image processing alone does not model. Hence, in most cases unsatisfactory results are achieved.


Fig. 1 (a) An underwater image of the Mediterranean [10]; (b) Measured absorption and scattering coefficients of the clearest natural water [14].


In this study, we develop an underwater imaging model for clear underwater vision in natural illumination. It addresses the low contrast and color cast caused by water scattering and absorption, and especially focuses on recovering the true color of underwater images without additional image processing. It takes full advantage of two polarization images of the target scene acquired in two orthogonal polarization states. The target light and the backscattered light differ in polarization behavior: the backscattered light is partially polarized, while the reflected target light is unpolarized due to depolarization at the target surface. The backscattered light is estimated based upon this polarization difference, and the energy loss caused by absorption is compensated by exploiting the distance information of the target scene derived from the polarization images. With the backscattered light removed and the absorption effects compensated, clear underwater vision is achievable. To demonstrate the developed approach, experiments were completed with a polarization imaging system, and significant improvements in contrast and color were obtained.

In the remainder of this paper, we first analyze how underwater image degradation arises in Section 2, then introduce how the developed model handles these problems in Section 3. Finally, experiments and results are given in Section 4, with detailed analyses of the advantages and shortcomings of the developed model.

2. Image degradation in water

The schematic diagram in Fig. 2 shows how images degrade in passive underwater imaging, where natural light dominates the illumination [10]. Sunlight impinges on the water surface and propagates in the water; part of it reaches objects and is reflected. The light reflected by objects acts as the target light, while the rest, redirected toward the detector, appears as backscattered light. These two components compose a captured underwater image, as shown by Eq. (1),

$$L(x,y)=\int_w L(x,y,\lambda)\,\mathrm{d}\lambda=\int_w\left[L_T(x,y,\lambda)+L_B(x,y,\lambda)\right]\mathrm{d}\lambda,\tag{1}$$
where $L(x,y,\lambda)$ indicates the light intensity at wavelength $\lambda$, $w$ indicates the response waveband of the detector, over which a response function $s(\lambda)$ gives the detector's response to light at each wavelength, and $L_T$ and $L_B$ are the target light and the backscattered light, respectively.


Fig. 2 Degradation in underwater imaging.


Before being detected, both the target light and the backscattered light undergo attenuation [15,16]. This inevitable attenuation turns the detected target light $L_T$ into the form in Eq. (2),

$$L_T(x,y,\lambda)=L_{object}(x,y,\lambda)\exp[-c(\lambda)z(x,y)],\tag{2}$$
where $c(\lambda)$ is the attenuation coefficient, the sum of the absorption and scattering coefficients detailed later [17,18], $z$ is the distance information, $(x, y)$ indicates the position of a pixel in the captured image, and $L_{object}$ denotes the target radiance before any attenuation, which is the only desired signal for imaging. However, the target light is always accompanied by backscattered light, as shown in Eq. (3). The backscattered light $L_B$ mainly stems from ambient illumination scattered into the line of sight (LOS) by suspended particles [19],
$$L_B(x,y,\lambda)=L_B^{\infty}(\lambda)\left\{1-\exp[-b(\lambda)z(x,y)]\right\},\tag{3}$$
where $L_B^{\infty}$ is the backscattered light in the LOS extending to infinity, $b(\lambda)$ is the scattering coefficient, defined as the fraction of the incident flux that is scattered, divided by the thickness of the layer [17], and $\exp[-b(\lambda)z(x,y)]$ captures the effect of the scattering particles on the backscattered light. Therefore, the radiance finally reaching the detector, $L(x,y)$, can be expressed by Eq. (4),

$$L(x,y)=\int_w\Big(L_{object}(x,y,\lambda)\exp[-c(\lambda)z(x,y)]+L_B^{\infty}(\lambda)\left\{1-\exp[-b(\lambda)z(x,y)]\right\}\Big)\,\mathrm{d}\lambda.\tag{4}$$
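The composition of Eq. (4) can be illustrated with a short numerical sketch; the scene, the distance map, and the coefficient values below are illustrative assumptions, not measured water properties.

```python
import numpy as np

# A minimal numerical sketch of the forward degradation model in Eqs. (2)-(4).
# The scene, distance map, and coefficient values are illustrative assumptions.
rng = np.random.default_rng(0)

H, W = 64, 64
L_object = rng.uniform(0.2, 1.0, size=(H, W))   # radiance before attenuation
z = np.tile(np.linspace(1.0, 10.0, W), (H, 1))  # distance map (m)

a, b = 0.05, 0.12        # assumed absorption and scattering coefficients (1/m)
c = a + b                # attenuation coefficient c(lambda) = a(lambda) + b(lambda)
L_B_inf = 0.6            # backscattered light at infinite distance

L_T = L_object * np.exp(-c * z)              # Eq. (2): attenuated target light
L_B = L_B_inf * (1.0 - np.exp(-b * z))       # Eq. (3): backscattered light
L = L_T + L_B                                # Eq. (4): radiance at the detector
```

In this sketch, distant columns are dominated by backscatter and nearby columns by the target, which is exactly the veiling behavior described above.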

3. Model and method

Typically, most underwater imaging methods handle the degradation problem by removing the backscattered light produced by scattering [9,20,21], attributing the attenuation to scattering by water and suspended particles. Descattering methods can provide contrast-enhanced images, since scattering redirects light and brings undesired light into the optical path, leading to “veiled,” i.e., contrast-reduced, images. In underwater scenes, however, water absorption removes light from the path permanently, causing light energy loss. The severe color cast illustrated in Fig. 1 mainly stems from the wavelength-selective absorption and scattering of water. In the visible-light range, long-wavelength light such as red experiences far more absorption than short-wavelength light such as blue [22,23]. This explains why underwater images present a blue appearance in which red light is almost totally attenuated. In Fig. 2, for example, the backscattered light results from attenuated sunlight, and the target light comes from sunlight reflected by targets; before arriving at the imaging system, both are attenuated into blue-dominated components. Ideally, a target image should contain only the light reflected by targets without any attenuation, as shown in Fig. 2, i.e., the object radiance. In this study, to better handle the image degradation in water, we develop a model in which the effects of scattering and absorption are both addressed, see Eq. (5),

$$\begin{aligned}L(x,y,\lambda)&=L_T(x,y,\lambda)+L_B(x,y,\lambda)\\&=L_{object}(x,y,\lambda)\exp\{-[b(\lambda)+a(\lambda)]z(x,y)\}+L_B^{\infty}(\lambda)\{1-\exp[-b(\lambda)z(x,y)]\}\\&=L_{object}(x,y,\lambda)\exp[-a(\lambda)z(x,y)]\exp[-b(\lambda)z(x,y)]+L_B^{\infty}(\lambda)\{1-\exp[-b(\lambda)z(x,y)]\}\\&=L'_{object}(x,y,\lambda)\exp[-b(\lambda)z(x,y)]+L_B^{\infty}(\lambda)\{1-\exp[-b(\lambda)z(x,y)]\},\end{aligned}\tag{5}$$
where
$$L'_{object}(x,y,\lambda)=L_{object}(x,y,\lambda)\exp[-a(\lambda)z(x,y)],\tag{6}$$
$L_T(x,y,\lambda)$ indicates the detected target radiance, $L_B(x,y,\lambda)$ denotes the backscattered light, $a(\lambda)$ is the absorption coefficient, defined as the fraction of the incident flux that is absorbed, divided by the thickness of the layer [17], and $L'_{object}(x,y,\lambda)$ indicates the target radiance degraded by water absorption.

Regarding the degradation resulting from scattering, the polarization characteristics of light provide helpful support. Natural illumination originates from the sun and sky above the water and is usually unpolarized [24,25], but scattering in water is highly anisotropic and renders the natural veiling significantly partially polarized. We use a polarizer while imaging to select different polarization components. Since the backscattered light is partially polarized, the detected intensity depends on the orientation of the polarizer. There exist two orthogonal orientations of the polarizer at which its transmittance of the backscattered light reaches the extremum values $L_B^{\max}(x,y,\lambda)$ and $L_B^{\min}(x,y,\lambda)$, whose sum gives the total backscattered light, Eq. (7),

$$L_B(x,y,\lambda)=L_B^{\max}(x,y,\lambda)+L_B^{\min}(x,y,\lambda).\tag{7}$$

The target light can be taken as unpolarized owing to depolarization at the target surface [8]. Two images, $L^{\max}$ and $L^{\min}$, are acquired with the polarizer set at the two orthogonal orientations, as shown in Eq. (8),

$$\begin{aligned}L^{\max}(x,y,\lambda)&=L_T(x,y,\lambda)/2+L_B^{\max}(x,y,\lambda),\\L^{\min}(x,y,\lambda)&=L_T(x,y,\lambda)/2+L_B^{\min}(x,y,\lambda).\end{aligned}\tag{8}$$

The degree of polarization (DOP) of backscattered light is a crucial parameter in the developed model and can be calculated according to its definition shown in Eq. (9),

$$p=\frac{L_B^{\max}-L_B^{\min}}{L_B^{\max}+L_B^{\min}}.\tag{9}$$
In our case, to obtain the DOP of the backscattered light, we select an area containing no object, denoted $X_{void}$, where all the irradiance is contributed by the backscattered light. The DOP of the backscattered light and the backscattered light at infinite distance can then be obtained from the area $X_{void}$, as expressed by Eqs. (10) and (11), respectively,
$$p(\lambda)=\frac{L_B^{\max}(\lambda)\big|_{X_{void}}-L_B^{\min}(\lambda)\big|_{X_{void}}}{L_B^{\max}(\lambda)\big|_{X_{void}}+L_B^{\min}(\lambda)\big|_{X_{void}}},\tag{10}$$
$$L_B^{\infty}(\lambda)=L_B^{\max}(\lambda)\big|_{X_{void}}+L_B^{\min}(\lambda)\big|_{X_{void}}.\tag{11}$$
The backscattered light in the target scene can then be estimated by Eq. (12),
$$L_B(x,y,\lambda)=\frac{L^{\max}(x,y,\lambda)-L^{\min}(x,y,\lambda)}{p(\lambda)}.\tag{12}$$
With Eqs. (5)-(12), the problem caused by water scattering can be handled as shown by Eq. (13),

$$L'_{object}(x,y,\lambda)=L_{object}(x,y,\lambda)\exp[-a(\lambda)z(x,y)]=\frac{L(x,y,\lambda)-L_B(x,y,\lambda)}{1-\left[L_B(x,y,\lambda)/L_B^{\infty}(\lambda)\right]}.\tag{13}$$
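The estimation chain of Eqs. (9)-(13) can be sketched on a synthetic scene as follows; the array names, the object-free top band standing in for $X_{void}$, and all numerical values are assumptions for illustration only.

```python
import numpy as np

# Synthetic sketch of Eqs. (9)-(13): estimate the backscatter DOP and L_B_inf
# from an object-free band standing in for X_void, then descatter.
H, W = 48, 48
L_T = np.zeros((H, W))
L_T[16:32, 16:32] = 0.8                     # unpolarized target patch (detected)

p_true, L_B_inf_true, b = 0.4, 0.5, 0.1     # assumed DOP, L_B_inf, and b
z = np.full((H, W), 5.0)
z[:8, :] = 80.0                             # far, object-free top band
L_Bt = L_B_inf_true * (1 - np.exp(-b * z))  # true backscatter field, Eq. (3)

# Two orthogonal polarizer orientations, Eq. (8)
L_max = L_T / 2 + 0.5 * L_Bt * (1 + p_true)
L_min = L_T / 2 + 0.5 * L_Bt * (1 - p_true)

# X_void statistics give the DOP and backscatter at infinity, Eqs. (10)-(11)
void = (slice(0, 8), slice(None))
p_est = (L_max[void].mean() - L_min[void].mean()) / (
    L_max[void].mean() + L_min[void].mean())
L_B_inf_est = L_max[void].mean() + L_min[void].mean()

# Backscatter estimate, Eq. (12), and descattered image, Eq. (13)
L_B_est = (L_max - L_min) / p_est
L = L_max + L_min
denom = 1.0 - L_B_est / L_B_inf_est         # equals exp(-b z) in the model
L_obj_prime = np.where(denom > 1e-6,
                       (L - L_B_est) / np.maximum(denom, 1e-6), 0.0)
```

In the target region this recovers $L_T/\exp(-bz)$, i.e., the target light before scattering attenuation (still attenuated by absorption), which is exactly the $L'_{object}$ of Eq. (13).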

However, unlike scattering, water absorption leads to a completely different kind of image degradation. In underwater imaging, the polarization characteristic stems from the scattering process; it works well for removing backscattered light but cannot compensate for the intensity loss caused by absorption. The problems caused by water absorption are therefore handled by establishing a distance-based Lambertian model, which takes the light intensity of a captured image as the combined result of the light source, the object surface reflectance, and the response function of the detector, as shown in Eq. (14),

$$f(x,y)=\int_w l(\lambda)\,L'_{object}(x,y,\lambda)\,s(\lambda)\,\mathrm{d}\lambda,\tag{14}$$
where $f(x, y)$ is the finally detected image and $l(\lambda)$ denotes the power emitted by the illuminant at wavelength $\lambda$ in the plane $(x, y)$. Equation (14) is the Lambertian model that works in air, where light is only slightly attenuated [26]. In underwater conditions, however, the absorption process leads to severe light attenuation, and the model evolves into the form in Eq. (15),
$$f(x,y)=\int_w l(\lambda)L_{object}(x,y,\lambda)\exp[-a(\lambda)z(x,y)]s(\lambda)\,\mathrm{d}\lambda=\int_w l(\lambda)L_{object}(x,y,\lambda)d(x,y,\lambda)s(\lambda)\,\mathrm{d}\lambda,\tag{15}$$
where d(x, y, λ) quantifies the energy loss caused by water absorption and highly depends on the distance information z. The desired signal is the target radiance before attenuation Lobject(x, y, λ).

Considering the case at a single wavelength, denoted $\lambda_0$, Eq. (15) reduces to Eq. (16),

$$f(x,y,\lambda)\big|_{\lambda=\lambda_0}=l(\lambda_0)L_{object}(x,y,\lambda_0)\exp[-a(\lambda_0)z(x,y)]s(\lambda_0)=l(\lambda_0)L_{object}(x,y,\lambda_0)d(x,y,\lambda_0)s(\lambda_0).\tag{16}$$

To recover the clear scene image $L_{object}(x, y, \lambda)$, the spatial distribution of the light source, the response function of the detector, and the absorption effect $d(x, y, \lambda)$ must be evaluated. According to the study of Zhao et al. [27], the backscattered light at infinity is approximately directly proportional to the scattering coefficient and inversely proportional to the absorption coefficient,

$$L_B^{\infty}(\lambda)\propto\frac{b(\lambda)}{a(\lambda)}.\tag{17}$$
From the relation in Eq. (17), the following expression is derived,
$$\frac{a(\lambda)}{a(\lambda')}=\frac{b(\lambda)L_B^{\infty}(\lambda')}{b(\lambda')L_B^{\infty}(\lambda)}=\gamma,\tag{18}$$
where $\lambda'$ is a reference wavelength and $\gamma$ is a constant. Through the model in Eq. (13) the scattering coefficient $b(\lambda)$ is achievable, and with Eq. (11) $L_B^{\infty}(\lambda)$ can be obtained [13]. The absorption coefficient at the reference wavelength is approximated by Eq. (19),
$$a(\lambda')=-\ln\!\left(1-\frac{L_B(x,y,\lambda')}{L_B^{\infty}(\lambda')}\right)\Big/z.\tag{19}$$
Combining Eqs. (18) and (19) we can quantify the energy loss caused by water absorption by Eq. (20),

$$d(x,y,\lambda)=\exp[-\gamma a(\lambda')z]=\exp[-a(\lambda)z].\tag{20}$$
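As a sketch of how Eqs. (17)-(20) transfer the absorption coefficient from a reference wavelength to another channel, the following uses the blue channel as the reference; all coefficient values are illustrative assumptions, not water measurements.

```python
import numpy as np

# Sketch of Eqs. (17)-(20): transfer the absorption coefficient from a
# reference wavelength (blue) to another channel (red) via the constant gamma.
z = 4.0                                    # scene distance (m), assumed known
b_blue, b_red = 0.12, 0.06                 # assumed scattering coefficients
L_B_inf_blue, L_B_inf_red = 0.60, 0.05     # backscatter at infinity, Eq. (11)

# Backscatter actually observed at distance z in the reference channel
L_B_blue = L_B_inf_blue * (1 - np.exp(-b_blue * z))

# Eq. (19): absorption coefficient at the reference wavelength
a_blue = -np.log(1 - L_B_blue / L_B_inf_blue) / z

# Eq. (18): gamma relates a(red) to a(blue) via b and L_B_inf
gamma = (b_red * L_B_inf_blue) / (b_blue * L_B_inf_red)
a_red = gamma * a_blue

# Eq. (20): wavelength-dependent absorption loss factors
d_blue = np.exp(-a_blue * z)
d_red = np.exp(-gamma * a_blue * z)
```

With these assumed values the red channel loses far more energy than the blue one ($d_{red} \ll d_{blue}$), matching the color cast discussed in Section 2.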

According to the Grey World assumption, the color in each sensor channel averages to gray over the entire image (field of view). Here we employ the assumption as a normalization constraint: assuming the scene contains a good distribution of colors, the average reflected color is taken as the color of the light source [28], which makes Eq. (21) valid,

$$\frac{\iint d(x,y,\lambda)\big|_{\lambda=\lambda_0}\,L_{object}(x,y,\lambda)\big|_{\lambda=\lambda_0}\,\mathrm{d}x\,\mathrm{d}y}{\iint d(x,y,\lambda)\big|_{\lambda=\lambda_0}\,\mathrm{d}x\,\mathrm{d}y}=k.\tag{21}$$

Based on Eqs. (16) and (21), the finally captured image and the absorption effect can be related, and the relation in Eq. (22) is then derived,

$$\begin{aligned}\frac{\iint f(x,y,\lambda)\big|_{\lambda=\lambda_0}\,\mathrm{d}x\,\mathrm{d}y}{\iint d(x,y,\lambda)\big|_{\lambda=\lambda_0}\,\mathrm{d}x\,\mathrm{d}y}&=\frac{1}{\iint d(x,y,\lambda_0)\,\mathrm{d}x\,\mathrm{d}y}\iint l(\lambda_0)L_{object}(x,y,\lambda_0)d(x,y,\lambda_0)s(\lambda_0)\,\mathrm{d}x\,\mathrm{d}y\\&=l(\lambda_0)\left(\frac{\iint d(x,y,\lambda_0)L_{object}(x,y,\lambda_0)\,\mathrm{d}x\,\mathrm{d}y}{\iint d(x,y,\lambda_0)\,\mathrm{d}x\,\mathrm{d}y}\right)s(\lambda_0)\\&=l(\lambda_0)s(\lambda_0)k.\end{aligned}\tag{22}$$
From Eq. (22), we can evaluate the combined effect of the light source distribution and the detector response by Eq. (23),
$$l(\lambda_0)s(\lambda_0)=\frac{\iint f(x,y,\lambda_0)\,\mathrm{d}x\,\mathrm{d}y}{\iint d(x,y,\lambda_0)\,\mathrm{d}x\,\mathrm{d}y}\,k^{-1}.\tag{23}$$
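A minimal numerical check of Eqs. (21)-(23): if the scene is constructed so that its $d$-weighted average radiance equals a known constant $k$, the product $l(\lambda_0)s(\lambda_0)$ can be recovered from image statistics alone. All synthetic values below are assumptions.

```python
import numpy as np

# Sketch of Eqs. (21)-(23): the Grey World constraint turns image statistics
# into an estimate of l(lambda_0)*s(lambda_0). Synthetic values throughout.
rng = np.random.default_rng(2)

H, W = 32, 32
k = 0.5                                        # Grey World constant, Eq. (21)
l_s_true = 1.8                                 # unknown l(lambda_0)*s(lambda_0)
d = np.exp(-0.3 * rng.uniform(1, 5, (H, W)))   # absorption loss, Eq. (20)

# Build L_object so its d-weighted average is exactly k, then form the image
L_object = rng.uniform(0, 1, (H, W))
L_object *= k * d.sum() / (d * L_object).sum()
f = l_s_true * L_object * d                    # detected image, Eq. (16)

# Eq. (23): recover l*s from image statistics alone
l_s_est = (f.sum() / d.sum()) / k
```

The estimate equals the true product, because the ratio of sums in Eq. (22) reduces to $l(\lambda_0)s(\lambda_0)k$ by construction.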
At this point, the clear scene information at a single wavelength, $L_{object}(x,y,\lambda)\big|_{\lambda=\lambda_0}$, is achievable by combining Eqs. (5), (13), (16), and (23), as shown in Eq. (24),

$$L_{object}(x,y,\lambda)\big|_{\lambda=\lambda_0}=\frac{L(x,y,\lambda_0)-L_B(x,y,\lambda_0)}{\left[1-\dfrac{L_B(x,y,\lambda_0)}{L_B^{\infty}(\lambda_0)}\right]\left[\dfrac{\iint f(x,y,\lambda_0)\,\mathrm{d}x\,\mathrm{d}y}{\iint d(x,y,\lambda_0)\,\mathrm{d}x\,\mathrm{d}y}\,k^{-1}\right]d(x,y,\lambda_0)}.\tag{24}$$

Considering that a captured image is a combination of information in the whole visible-light wavelength range, we can recover the clear image of the target underwater scene by integrating Eq. (24) in the visible-light wavelength range w,

$$L_{object}(x,y)=\int_w L_{object}(x,y,\lambda)\,\mathrm{d}\lambda.\tag{25}$$
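The per-wavelength recovery of Eq. (24) can be verified end to end on synthetic data: forward-simulate the degradation of Eq. (5) for one channel, then invert it. The scene, the coefficients, and the assumption that the backscatter, distance map, and Grey World constant are known exactly are all simplifications for illustration.

```python
import numpy as np

# End-to-end synthetic check of Eq. (24) for one wavelength channel.
rng = np.random.default_rng(4)

H, W = 32, 32
a0, b0 = 0.20, 0.10                        # absorption/scattering at lambda_0
L_B_inf = 0.5
z = rng.uniform(2.0, 6.0, (H, W))          # distance map, assumed known
L_object = rng.uniform(0.1, 1.0, (H, W))   # ground-truth radiance

# Forward model, Eq. (5), with l(lambda_0)*s(lambda_0) folded into the units
d = np.exp(-a0 * z)                        # absorption loss, Eq. (20)
L_B = L_B_inf * (1 - np.exp(-b0 * z))      # backscatter, Eq. (3)
L = L_object * d * np.exp(-b0 * z) + L_B   # detected radiance

# Descattering, Eq. (13): here f = L_object * d since l*s = 1 in this sketch
f = (L - L_B) / (1 - L_B / L_B_inf)

# Grey World constant, Eq. (21) (uses ground truth in this sketch), then
# l*s from Eq. (23)
k = (d * L_object).sum() / d.sum()
ls_est = (f.sum() / d.sum()) / k

# Eq. (24): recover the unattenuated radiance
L_rec = (L - L_B) / ((1 - L_B / L_B_inf) * ls_est * d)
```

With exact backscatter and distance, the recovered radiance matches the ground truth, confirming that Eq. (24) inverts Eq. (5) term by term.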

4. Results and analyses

4.1 Experiments and results

Figure 3 presents the flowchart of the established underwater imaging model, which takes the water absorption effects into consideration and corrects the color cast caused by the wavelength-dependent absorption of light in water. To demonstrate its feasibility, we conducted real-world underwater imaging experiments. The imaging device is an 8-bit digital CMOS (Complementary Metal Oxide Semiconductor) camera (Canon E77D) with 4000 × 6000 pixels. A polarizer is placed in front of the camera to capture the polarization information of the target. In addition, to mimic the scattering and absorption conditions in seawater, we used a solution of skimmed milk in tap water in a 250 cm × 50 cm × 30 cm water tank. The imaging path was set up outdoors. To ensure that blue light is the major component of the illumination, we used an unpolarized blue light source at 470 nm together with natural light.


Fig. 3 Schematic diagram of the developed imaging model.


In the experiments, a rough-surface plastic Rubik's cube with six bright colors (white, red, blue, green, yellow, and orange) was chosen as the first target. Its colors made it a perfect choice to demonstrate the color cast caused by water absorption and to evaluate the ability of the proposed method to correct it. As introduced in Section 3, two polarization images are required, one maximizing and the other minimizing the backscattered light. Images of the Rubik's cube were therefore taken by rotating the polarizer to the two orthogonal orientations, as shown in Figs. 4(a)-4(b). Both images suffer from severe degradation, but in image $L^{\min}$ the backscattered light is slightly better suppressed. This slight difference between $L^{\max}$ and $L^{\min}$ confirms the polarization signature of the backscattered light, on which the proposed model is mainly based. By estimating the backscattered light with Eq. (12), we can handle the scattering problem in underwater imaging. Figure 4(c) illustrates the estimated backscattered light. It is characterized by a distinct veiling appearance, confirming that the effect of scattering on images is contrast reduction: backscattered light adds a veil in front of the target. Most underwater imaging methods handle the degradation problem by descattering, as we do here with Eq. (13), which removes the backscattered light to obtain more accurate target information. However, the recovered target signal $L'_{object}$ is actually the target signal attenuated by water absorption; the color cast caused by absorption cannot be removed by traditional descattering and leads to color distortion. The developed model pays special attention to this problem by introducing distance information into the Lambertian reflection model, as detailed in Section 3. It quantifies the effect of water absorption on light intensity by calibrating the light source spatial distribution with d(x, y). The original target signal is then achievable through light-source intensity inversion. Figure 4(d) offers an example of the finally detected target image, which has a clear appearance and vivid color, demonstrating the ability of the developed model to recover the target signal before absorption. The improvement implies that, in underwater imaging, the severe color cast mainly results from the strongly wavelength-dependent absorption of light by water. If we focus only on the problems stemming from light scattering, accurate information about underwater targets is hardly obtainable.


Fig. 4 (a) Lmax; (b) Lmin; (c) Estimated backscattered light; (d) Finally detected image.


To make a distinctive demonstration, in Fig. 5 we measured the intensity of each piece of the Rubik's cube in the red (R), green (G), and blue (B) channels of the directly captured underwater image and of the finally detected image, and compared them with those of the real target image. Compared with the real appearance of the target, the directly captured image is concealed by an overall blue veil, which accounts for the weak intensity in the R and G channels shown in Figs. 5(b1)-5(d1). A satisfactory image should provide a balanced intensity distribution across the R, G, and B channels, as found in Figs. 5(b3)-5(d3) for the real target image. As expected, the developed model provides a clear image with a balanced color distribution, as demonstrated in Figs. 5(a2)-5(d2). In addition, we randomly chose four pieces of the Rubik's cube for comparison. The results are shown in Figs. 5(a4)-5(d4), where the intensity of each channel is included. Every piece of the underwater image suffers from an uneven color distribution across the channels, but the final result is well recovered. For example, for the white Piece II, the original image is dominated by blue, while the recovered image is well balanced in every channel. This capability strongly supports applications in underwater imaging and target detection.


Fig. 5 (a1) A directly detected underwater image; (b1)(c1)(d1) Intensity distribution in the R, G and B channels of (a1); (a2) Finally detected image; (b2)(c2)(d2) Intensity distribution in the R, G and B channels of (b2); (a3) Raw image of the target; (b3)(c3)(d3) Intensity distribution in the R, G and B channels of (a3); (a4)(b4)(c4)(d4) Intensity statistic of Block I, II, III and IV from (a1) (a2) and (a3) in R, G and B channels.


For targets with considerable detail, such as the target in Fig. 6(a), it is of great importance to preserve the details while addressing image degradation. Compared with the directly captured image, the finally recovered image presents a quite clear appearance and natural color behavior. To quantify the improved clarity and corrected color, a horizontal line plot for each channel of Figs. 6(a)-6(b) is generated by plotting the intensity value versus the horizontal position at a given vertical position. Choosing vertical position pixel 211, the horizontal line passes through the letters “XiDian” in the first row, as indicated by the white dotted lines, providing a simple background and clear signals. Figures 6(c) and 6(e) evidently illustrate the typical signatures of underwater images: a blue color cast caused by water absorption and low contrast caused by light scattering. The line for the blue channel of Fig. 6(a) presents much stronger intensity than those of the other two channels, which agrees well with the expected behavior, while in the finally recovered image all three channels present well-matched intensity distributions. For example, the second letter “i” in the first row shows a much stronger intensity in the red channel than in the other two channels, which is impossible to observe in Fig. 6(a) because the red light component is strongly absorbed. In addition, Figs. 6(e)-6(f) make a more distinctive demonstration by plotting the pixel distributions of Figs. 6(a)-6(b) in the R, G, and B channels. Whereas in Fig. 6(e) the pixels are densely concentrated, Fig. 6(f) provides a balanced pixel distribution, indicating a well-recovered image.


Fig. 6 (a) Lmax of words on paper; (b) Finally detected image; (c) and (d) Intensity profiles along the white dotted lines in Figs. 6(a)-6(b); (e) and (f) Pixel distributions of Figs. 6(a)-6(b) in R G and B channels.


4.2 Noise analyses

We finally applied the developed model to a natural underwater image, shown in Fig. 7(a). It was taken by Y. Y. Schechner in the Mediterranean [10]. It is a typical neritic image with an overall blue veil covering the seabed; the corals and rocks are severely obscured by the color cast resulting from strong scattering of blue light and strong absorption of red light. Figure 7(b) displays the image recovered by the developed model, where considerable detail is revealed with well-corrected color, as discussed above. However, the recovered image suffers from noise, especially in the distant areas at the top of the image. This phenomenon stems from the noise amplification of the reconstruction process. As introduced in Section 3, the reconstructed result depends strongly on $\exp(-bz)$, where $z$ is the distance of the target scene, and this dependence causes the noise to vary with distance. We can explore this further by analyzing the dependence of noise on distance. First, suppose the two measurements $L^{\max}$ and $L^{\min}$ are statistically independent, with respective noise variances $\sigma_{L^{\max}}^2$ and $\sigma_{L^{\min}}^2$. Let the variable $\upsilon$ be a function of $L^{\max}$ and $L^{\min}$. The noise variance of $\upsilon$ in a first-order approximation can be calculated by Eq. (26) [8,29],


Fig. 7 (a) A raw underwater image of the Mediterranean; (b) Finally detected result.


$$\sigma_{\upsilon}^{2}=\left(\frac{\partial\upsilon}{\partial L^{\min}}\right)^{2}\sigma_{L^{\min}}^{2}+\left(\frac{\partial\upsilon}{\partial L^{\max}}\right)^{2}\sigma_{L^{\max}}^{2}.\tag{26}$$

According to the model in Eq. (24), the clear target scene can be expressed by Eq. (27),

$$L_{object}=\frac{(1-1/p)\,L^{\max}+(1+1/p)\,L^{\min}}{\exp(-bz)\,\eta},\tag{27}$$
where $\eta=\left[\dfrac{\iint f(x,y,\lambda_0)\,\mathrm{d}x\,\mathrm{d}y}{\iint d(x,y)\,\mathrm{d}x\,\mathrm{d}y}\,k^{-1}\right]d(x,y)$.

Following Eqs. (26) and (27), the noise variance in Lobject is expressed by,

$$\sigma_{L_{object}}^{2}=\left[\frac{1-1/p}{\exp(-bz)\,\eta}\right]^{2}\sigma_{L^{\max}}^{2}+\left[\frac{1+1/p}{\exp(-bz)\,\eta}\right]^{2}\sigma_{L^{\min}}^{2}.\tag{28}$$

It is obvious that if $z\to\infty$, then $\exp(-bz)\to 0$, and the reconstruction becomes unstable. If the noise is signal-independent, then $\sigma_{L^{\max}}^2=\sigma_{L^{\min}}^2=\sigma^2$ and we obtain the expression in Eq. (29),

$$\sigma_{L_{object}}^{2}=\frac{2\sigma^{2}\left(1+1/p^{2}\right)}{\left[\exp(-bz)\,\eta\right]^{2}}.\tag{29}$$

Figure 8 depicts $\sigma_{L_{object}}^2$ as a function of the distance $z$, as derived in Eq. (29). It clearly shows how distance affects image noise: as $\exp(-bz)\to 0$, corresponding to distant areas, the noise amplification is unbounded.
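The growth of Eq. (29) with distance can be sketched directly; the coefficient values below are illustrative assumptions, and $\eta$ is treated as a constant for simplicity.

```python
import numpy as np

# Sketch of Eq. (29): the noise variance of the recovered image scales as
# exp(2*b*z), so distant areas are amplified without bound. All values assumed.
b, p, eta, sigma2 = 0.1, 0.4, 1.0, 1e-4

def recovered_noise_var(z):
    """sigma_Lobject^2 = 2*sigma^2*(1 + 1/p^2) / (exp(-b*z)*eta)^2, Eq. (29)."""
    return 2 * sigma2 * (1 + 1 / p**2) / (np.exp(-b * z) * eta) ** 2

z = np.array([1.0, 10.0, 30.0])
var = recovered_noise_var(z)   # grows monotonically with distance
```

Doubling the optical depth $bz$ multiplies the variance by $e^{2\Delta bz}$, which is why the top (distant) band of Fig. 7(b) is the noisiest.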


Fig. 8 Noise variance in the finally detected image as a function of target distance z and σ2.


As discussed above, image noise depends strongly on distance, but this dependence also provides useful information for suppressing the noise. Here we give an example. The distance map of the underwater image is normalized as shown in Fig. 9(a), where the gray value is directly proportional to distance. The distance information is then taken as a weight function to determine the filter size, and the de-noising process is completed by

$$I_{dn}=F_{W}\,I_{n},\tag{30}$$
where $I_n$ is the image with spatially varying noise, $I_{dn}$ is the de-noised image, and $F_W$ represents the filtering templates; the subscript $W$ is the weight factor based upon which the optimal filter template is selected. For example, we use a minimum filter template of 3 × 3 for the closest targets and a 5 × 5 template for the farthest areas at the top of the image. Figure 9(d) gives an example of the finally de-noised underwater image, where the noise at the top of the image is well suppressed. Meanwhile, the details in the bottom areas are well retained, benefitting from a favorable selection of the filter. Further de-noising methods could also be developed based on the distance information of the target scene.
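A minimal sketch of this distance-weighted de-noising, assuming a simple mean filter and a two-band split of the normalized range map (both simplifications of the scheme described above):

```python
import numpy as np

# Sketch of the distance-weighted de-noising: a small mean filter for near
# pixels, a larger one for far pixels. The two-band split, the filter sizes,
# and the synthetic noise are illustrative assumptions.
def box_filter(img, size):
    """Mean filter of the given odd size, with edge padding."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

rng = np.random.default_rng(3)
H, W = 40, 40
img = 0.5 + rng.normal(0.0, 0.05, (H, W))                # noisy input I_n
z_norm = np.tile(np.linspace(0, 1, H)[:, None], (1, W))  # normalized range map

near = box_filter(img, 3)                     # 3x3 template for close targets
far = box_filter(img, 5)                      # 5x5 template for distant areas
denoised = np.where(z_norm > 0.5, far, near)  # I_dn: filter chosen by distance
```

In a full implementation the filter size would vary continuously with the weight $W$ rather than switching at a single threshold.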


Fig. 9 (a) Range map of Fig. 7(a); (b) De-noised image based upon the distance information; (c) and (d) Enlarged view of the marked out part in Fig. 8(b) and Fig. 9(b).


5. Conclusion

In this study, we developed an underwater imaging model that takes both scattering and absorption effects into consideration, based upon the polarization characteristic of light. Beyond the scattering effects, it especially focuses on the problems produced by the absorption of light in water. Since this absorption is wavelength-dependent, it is the main cause of the severe color cast in underwater images. The developed model enables color cast correction by modeling the absorption process with the distance information of the target scene derived from the polarization images. Real-world experiments demonstrate that the model is effective for imaging in underwater conditions. In particular, it not only handles the scattering problem that most underwater imaging methods focus on, but also addresses the problems caused by absorption, so that clear images without color cast are achievable with the polarization-based model. It is of great value for research that requires clear target appearance, such as archaeology and marine species monitoring.

Note that the final image suffers from severe noise, which might cause target detection to fail in some cases. Future work will focus on handling this accompanying issue. Meanwhile, developing real-time polarization imaging will also considerably broaden its applications.

Funding

Ministry of Education of the People's Republic of China (111 Project, B17035); National Natural Science Foundation of China (61705175); Natural Science Foundation of Shaanxi Province (2018JQ6010).

References

1. C. Tan, G. Seet, A. Sluzek, X. Wang, C. T. Yuen, C. Y. Fam, and H. Y. Wong, “Scattering noise estimation of range-gated imaging system in turbid condition,” Opt. Express 18(20), 21147–21154 (2010). [CrossRef]   [PubMed]  

2. D. M. Kocak, F. R. Dalgleish, F. M. Caimi, and Y. Y. Schechner, “A focus on recent developments and trends in underwater imaging,” Mar. Technol. Soc. J. 42(1), 52–67 (2008). [CrossRef]  

3. N. K. Soni, R. V. Vinu, and R. K. Singh, “Polarization modulation for imaging behind the scattering medium,” Opt. Lett. 41(5), 906–909 (2016). [CrossRef]   [PubMed]  

4. J. Y. Chiang and Y. C. Chen, “Underwater image enhancement by wavelength compensation and dehazing,” IEEE Trans. Image Process. 21(4), 1756–1769 (2012). [CrossRef]   [PubMed]  

5. P. Han, F. Liu, K. Yang, J. Ma, J. Li, and X. Shao, “Active underwater descattering and image recovery,” Appl. Opt. 56(23), 6631–6638 (2017). [CrossRef]   [PubMed]  

6. J. Park, J.-H. Park, H. Yu, and Y. Park, “Focusing through turbid media by polarization modulation,” Opt. Lett. 40(8), 1667–1670 (2015). [CrossRef]   [PubMed]  

7. X. Ni and R. R. Alfano, "Time-resolved backscattering of circularly and linearly polarized light in a turbid medium," Opt. Lett. 29(23), 2773–2775 (2004).

8. T. Treibitz and Y. Y. Schechner, "Active Polarization Descattering," IEEE Trans. Pattern Anal. Mach. Intell. 31(3), 385–399 (2009).

9. M. Dubreuil, P. Delrot, I. Leonard, A. Alfalou, C. Brosseau, and A. Dogariu, "Exploring underwater target detection by imaging polarimetry and correlation techniques," Appl. Opt. 52(5), 997–1005 (2013).

10. Y. Y. Schechner and N. Karpel, "Recovery of underwater visibility and structure by polarization analysis," IEEE J. Oceanic Eng. 30(3), 570–587 (2005).

11. S. G. Narasimhan and S. K. Nayar, "Contrast restoration of weather degraded images," IEEE Trans. Pattern Anal. Mach. Intell. 25(6), 713–724 (2003).

12. X. Li, H. Hu, L. Zhao, H. Wang, Y. Yu, L. Wu, and T. Liu, "Polarimetric image recovery method combining histogram stretching for underwater imaging," Sci. Rep. 8(1), 12430 (2018).

13. Y. Y. Schechner and N. Karpel, "Clear underwater vision," in Computer Vision and Pattern Recognition (IEEE, 2004), pp. 536–543.

14. R. C. Smith and K. S. Baker, "Optical properties of the clearest natural waters (200-800 nm)," Appl. Opt. 20(2), 177–184 (1981).

15. B. Huang, T. Liu, H. Hu, J. Han, and M. Yu, "Underwater image recovery considering polarization effects of objects," Opt. Express 24(9), 9826–9838 (2016).

16. C. D. Mobley, Light and Water: Radiative Transfer in Natural Waters (Academic Press, 1994).

17. C. D. Mobley and D. Stramski, "Influences of microbial particles on oceanic optics," in Ocean Optics XII (International Society for Optics and Photonics, 1994), pp. 184–194.

18. J. Hedley, "A three-dimensional radiative transfer model for shallow water environments," Opt. Express 16(26), 21887–21902 (2008).

19. S. A. Kartazayeva, X. Ni, and R. R. Alfano, "Backscattering target detection in a turbid medium by use of circularly and linearly polarized light," Opt. Lett. 30(10), 1168–1170 (2005).

20. G. D. Gilbert and J. C. Pernicka, "Improvement of Underwater Visibility by Reduction of Backscatter with a Circular Polarization Technique," Appl. Opt. 6(4), 741–746 (1967).

21. S. Fang, X. S. Xia, X. Huo, and C. W. Chen, "Image dehazing using polarization effects of objects and airlight," Opt. Express 22(16), 19523–19537 (2014).

22. M. Babin, D. Stramski, G. M. Ferrari, H. Claustre, A. Bricaud, G. Obolensky, and N. Hoepffner, "Variations in the light absorption coefficients of phytoplankton, nonalgal particles, and dissolved organic matter in coastal waters around Europe," J. Geophys. Res. Oceans 108(C7), 3211–3230 (2003).

23. R. M. Pope and E. S. Fry, "Absorption spectrum (380-700 nm) of pure water. II. Integrating cavity measurements," Appl. Opt. 36(33), 8710–8723 (1997).

24. Y. Y. Schechner, S. G. Narasimhan, and S. K. Nayar, "Polarization-based vision through haze," Appl. Opt. 42(3), 511–525 (2003).

25. M. Bass, C. DeCusatis, J. Enoch, V. Lakshminarayanan, G. Li, C. Macdonald, V. Mahajan, and E. V. Stryland, Handbook of Optics (McGraw-Hill Education, 2009).

26. R. Basri and D. W. Jacobs, "Lambertian reflectance and linear subspaces," IEEE Trans. Pattern Anal. Mach. Intell. 25(2), 218–233 (2003).

27. X. Zhao, T. Jin, and S. Qu, "Deriving inherent optical properties from background color and underwater image enhancement," Ocean Eng. 94, 163–172 (2015).

28. J. van de Weijer, T. Gevers, and A. Gijsenij, "Edge-based color constancy," IEEE Trans. Image Process. 16(9), 2207–2214 (2007).

29. R. Durrett, Probability: Theory and Examples (Cambridge University Press, 2010).

Figures (9)

Fig. 1 (a) An underwater image of the Mediterranean [10]; (b) Measured absorption and scattering coefficients of the clearest natural water [14].
Fig. 2 Degradation in underwater imaging.
Fig. 3 Schematic diagram of the developed imaging model.
Fig. 4 (a) Lmax; (b) Lmin; (c) Estimated backscattered light; (d) Finally detected image.
Fig. 5 (a1) A directly detected underwater image; (b1)(c1)(d1) Intensity distributions in the R, G and B channels of (a1); (a2) Finally detected image; (b2)(c2)(d2) Intensity distributions in the R, G and B channels of (a2); (a3) Raw image of the target; (b3)(c3)(d3) Intensity distributions in the R, G and B channels of (a3); (a4)(b4)(c4)(d4) Intensity statistics of Blocks I, II, III and IV from (a1), (a2) and (a3) in the R, G and B channels.
Fig. 6 (a) Lmax of words on paper; (b) Finally detected image; (c) and (d) Intensity profiles along the white dotted lines in Figs. 6(a)-6(b); (e) and (f) Pixel distributions of Figs. 6(a)-6(b) in the R, G and B channels.
Fig. 7 (a) A raw underwater image of the Mediterranean; (b) Finally detected result.
Fig. 8 Noise variance in the finally detected image as a function of target distance z and σ².
Fig. 9 (a) Range map of Fig. 7(a); (b) De-noised image based upon the distance information; (c) and (d) Enlarged views of the marked-out parts in Fig. 7(b) and Fig. 9(b).

Equations (31)


$$L(x,y)=\int_{w}L(x,y,\lambda)\,d\lambda=\int_{w}\left[L_{T}(x,y,\lambda)+L_{B}(x,y,\lambda)\right]d\lambda,$$
$$L_{T}(x,y,\lambda)=L_{object}(x,y,\lambda)\exp\left[-c(\lambda)z(x,y)\right],$$
$$L_{B}(x,y,\lambda)=L_{B}(\lambda)\left\{1-\exp\left[-b(\lambda)z(x,y)\right]\right\},$$
$$L(x,y)=\int_{w}\Big(L_{object}(x,y,\lambda)\exp\left[-c(\lambda)z(x,y)\right]+L_{B}(\lambda)\left\{1-\exp\left[-b(\lambda)z(x,y)\right]\right\}\Big)d\lambda.$$
$$\begin{aligned}L(x,y,\lambda)&=L_{T}(x,y,\lambda)+L_{B}(x,y,\lambda)\\&=L_{object}(x,y,\lambda)\exp\left\{-\left[b(\lambda)+a(\lambda)\right]z(x,y)\right\}+L_{B}(\lambda)\left\{1-\exp\left[-b(\lambda)z(x,y)\right]\right\}\\&=L_{object}(x,y,\lambda)\exp\left[-a(\lambda)z(x,y)\right]\exp\left[-b(\lambda)z(x,y)\right]+L_{B}(\lambda)\left\{1-\exp\left[-b(\lambda)z(x,y)\right]\right\}\\&=L_{object}'(x,y,\lambda)\exp\left[-b(\lambda)z(x,y)\right]+L_{B}(\lambda)\left\{1-\exp\left[-b(\lambda)z(x,y)\right]\right\},\end{aligned}$$
$$L_{object}'(x,y,\lambda)=L_{object}(x,y,\lambda)\exp\left[-a(\lambda)z(x,y)\right],$$
$$L_{B}(x,y,\lambda)=L_{B}^{\max}(x,y,\lambda)+L_{B}^{\min}(x,y,\lambda).$$
$$\begin{aligned}L^{\max}(x,y,\lambda)&=L_{T}(x,y,\lambda)/2+L_{B}^{\max}(x,y,\lambda),\\L^{\min}(x,y,\lambda)&=L_{T}(x,y,\lambda)/2+L_{B}^{\min}(x,y,\lambda).\end{aligned}$$
$$p=\frac{L_{B}^{\max}-L_{B}^{\min}}{L_{B}^{\max}+L_{B}^{\min}}.$$
$$p(\lambda)=\frac{\left.L_{B}^{\max}(\lambda)\right|_{X\in\text{void}}-\left.L_{B}^{\min}(\lambda)\right|_{X\in\text{void}}}{\left.L_{B}^{\max}(\lambda)\right|_{X\in\text{void}}+\left.L_{B}^{\min}(\lambda)\right|_{X\in\text{void}}},$$
$$L_{B}(\lambda)=\left.L_{B}^{\max}(\lambda)\right|_{X\in\text{void}}+\left.L_{B}^{\min}(\lambda)\right|_{X\in\text{void}}.$$
$$L_{B}(x,y,\lambda)=\frac{L^{\max}(x,y,\lambda)-L^{\min}(x,y,\lambda)}{p(\lambda)}.$$
$$L_{object}'(x,y,\lambda)=L_{object}(x,y,\lambda)\exp\left[-a(\lambda)z(x,y)\right]=\frac{L(x,y,\lambda)-L_{B}(x,y,\lambda)}{1-\left[L_{B}(x,y,\lambda)/L_{B}(\lambda)\right]}.$$
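The backscatter estimation and removal described by the equations above (two orthogonal-polarizer images, a degree of polarization p and a saturation value measured in a target-free "void" region) can be sketched numerically as follows; the function name, array inputs, and the small clipping constant `eps` are illustrative, not from the paper:

```python
import numpy as np

def remove_backscatter(L_max, L_min, p, L_B_inf, eps=1e-6):
    """Sketch of polarization-based backscatter removal.

    L_max, L_min : images at the polarizer orientations that maximize /
                   minimize the backscatter intensity.
    p            : degree of polarization of the backscatter, measured
                   in a void (target-free) region.
    L_B_inf      : backscatter saturation value L_B(lambda), also
                   measured in the void region.
    Returns L'_object = L_object * exp[-a z].
    """
    L = L_max + L_min                 # total detected radiance L(x, y)
    L_B = (L_max - L_min) / p         # per-pixel backscatter estimate
    T = 1.0 - L_B / L_B_inf           # equals exp[-b z] in the model
    # L'_object = (L - L_B) / (1 - L_B / L_B_inf); clip to avoid
    # division by ~0 where backscatter saturates (distant pixels)
    return (L - L_B) / np.clip(T, eps, None)
```

Since the target light is taken as unpolarized, L_T splits evenly between the two polarizer images and cancels in the difference, which is why the backscatter estimate depends only on L_max − L_min and p.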
$$f(x,y)=\int_{w}l(\lambda)L_{object}'(x,y,\lambda)s(\lambda)\,d\lambda,$$
$$f(x,y)=\int_{w}l(\lambda)L_{object}(x,y,\lambda)\exp\left[-a(\lambda)z(x,y)\right]s(\lambda)\,d\lambda=\int_{w}l(\lambda)L_{object}(x,y,\lambda)d(x,y,\lambda)s(\lambda)\,d\lambda,$$
$$\left.f(x,y,\lambda)\right|_{\lambda=\lambda_{0}}=l(\lambda_{0})L_{object}(x,y,\lambda_{0})\exp\left[-a(\lambda_{0})z(x,y)\right]s(\lambda_{0})=l(\lambda_{0})L_{object}(x,y,\lambda_{0})d(x,y,\lambda_{0})s(\lambda_{0}).$$
$$L_{B}(\lambda)\propto\frac{b(\lambda)}{a(\lambda)}.$$
$$\frac{a(\lambda)}{a(\lambda')}=\frac{b(\lambda)L_{B}(\lambda')}{b(\lambda')L_{B}(\lambda)}=\gamma,$$
$$a(\lambda')=-\ln\left(1-\frac{L_{B}(x,y,\lambda')}{L_{B}(\lambda')}\right)\Big/z.$$
$$d(x,y,\lambda)=\exp\left[-\gamma a(\lambda')z\right]=\exp\left[-a(\lambda)z\right].$$
$$\frac{\iint\left.d(x,y,\lambda)\right|_{\lambda=\lambda_{0}}\left.L_{object}(x,y,\lambda)\right|_{\lambda=\lambda_{0}}dxdy}{\iint\left.d(x,y,\lambda)\right|_{\lambda=\lambda_{0}}dxdy}=k.$$
$$\begin{aligned}\frac{\iint\left.f(x,y,\lambda)\right|_{\lambda=\lambda_{0}}dxdy}{\iint\left.d(x,y,\lambda)\right|_{\lambda=\lambda_{0}}dxdy}&=\frac{1}{\iint d(x,y,\lambda_{0})dxdy}\iint l(\lambda_{0})L_{object}(x,y,\lambda_{0})d(x,y,\lambda_{0})s(\lambda_{0})dxdy\\&=l(\lambda_{0})\left(\frac{\iint d(x,y,\lambda_{0})L_{object}(x,y,\lambda_{0})dxdy}{\iint d(x,y,\lambda_{0})dxdy}\right)s(\lambda_{0})\\&=l(\lambda_{0})s(\lambda_{0})k.\end{aligned}$$
$$l(\lambda_{0})s(\lambda_{0})=\frac{\iint f(x,y,\lambda_{0})dxdy}{\iint d(x,y,\lambda_{0})dxdy}k^{-1}.$$
$$\left.L_{object}(x,y,\lambda)\right|_{\lambda=\lambda_{0}}=\frac{L(x,y,\lambda_{0})-L_{B}(x,y,\lambda_{0})}{\left[1-\dfrac{L_{B}(x,y,\lambda_{0})}{L_{B}(\lambda_{0})}\right]\left[\dfrac{\iint f(x,y,\lambda_{0})dxdy}{\iint d(x,y,\lambda_{0})dxdy}k^{-1}\right]d(x,y,\lambda_{0})}.$$
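The absorption-correction step above can also be sketched in code. Following the equations for γ, a(λ'), and d, the per-pixel attenuation in channel λ reduces to d = (1 − L_B(x,y,λ')/L_B(λ'))^γ, after which l(λ0)s(λ0) is estimated from image sums with the constant k. A minimal sketch, assuming k is known and using illustrative names throughout:

```python
import numpy as np

def recover_radiance(f, L_B_ref, L_B_inf_ref, gamma, k):
    """Sketch of the distance-based absorption correction.

    f            : detected channel image f(x, y, lambda_0),
                   backscatter already removed.
    L_B_ref      : estimated backscatter in a reference channel lambda'.
    L_B_inf_ref  : its saturation value L_B(lambda'), from a void region.
    gamma        : ratio a(lambda) / a(lambda').
    k            : the weighted-average constant defined in the text.
    """
    # a(lambda') z = -ln(1 - L_B/L_B_inf), hence
    # d = exp[-a(lambda) z] = exp[-gamma a(lambda') z]
    d = (1.0 - L_B_ref / L_B_inf_ref) ** gamma
    # l(lambda_0) s(lambda_0) estimated from image-wide sums
    ls = f.sum() / d.sum() / k
    # undo illumination/sensitivity scaling and per-pixel attenuation
    return f / (ls * d)
```

Note that no explicit range map is needed here: the ratio L_B/L_B(λ') already encodes the distance term z(x, y), which is the core idea of the distance-based Lambertian correction.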
$$L_{object}(x,y)=\int_{w}L_{object}(x,y,\lambda)\,d\lambda.$$
$$\sigma_{\upsilon}^{2}=\left(\frac{\partial\upsilon}{\partial L^{\min}}\right)^{2}\sigma_{L^{\min}}^{2}+\left(\frac{\partial\upsilon}{\partial L^{\max}}\right)^{2}\sigma_{L^{\max}}^{2}.$$
$$L_{object}=\frac{\left(1-\frac{1}{p}\right)L^{\max}+\left(1+\frac{1}{p}\right)L^{\min}}{\left[\exp(-bz)\right]\eta},$$
$$\eta=\left[\frac{\iint f(x,y,\lambda_{0})dxdy}{\iint d(x,y)dxdy}k^{-1}\right]d(x,y),$$
$$\sigma_{L_{object}}^{2}=\left\{\frac{1-\frac{1}{p}}{\left[\exp(-bz)\right]\eta}\right\}^{2}\sigma_{L^{\max}}^{2}+\left\{\frac{1+\frac{1}{p}}{\left[\exp(-bz)\right]\eta}\right\}^{2}\sigma_{L^{\min}}^{2}.$$
$$\sigma_{L_{object}}^{2}=\frac{2\sigma^{2}\left(1+\frac{1}{p^{2}}\right)}{\left\{\left[\exp(-bz)\right]\eta\right\}^{2}}.$$
$$I_{dn}=F_{W}(I_{n}).$$
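The error-propagation result above, σ²_Lobject = 2σ²(1 + 1/p²)/{[exp(−bz)]η}², can be checked by Monte Carlo simulation, which also makes the Fig. 8 trend visible: variance grows as exp(−bz) shrinks with distance z. The parameter values below are illustrative only:

```python
import numpy as np

# Monte Carlo check of the noise-variance formula, assuming
# independent Gaussian noise of variance sigma^2 on L_max and L_min.
rng = np.random.default_rng(1)
p, b, z, eta, sigma = 0.4, 0.1, 3.0, 0.9, 0.01
L_max_true, L_min_true = 0.6, 0.3
n = 200_000

L_max = L_max_true + sigma * rng.standard_normal(n)
L_min = L_min_true + sigma * rng.standard_normal(n)

# estimator of L_object from the recovery equation
L_obj = ((1 - 1/p) * L_max + (1 + 1/p) * L_min) / (np.exp(-b * z) * eta)

var_mc = L_obj.var()
var_pred = 2 * sigma**2 * (1 + 1/p**2) / (np.exp(-b * z) * eta)**2
# var_mc and var_pred agree to within Monte Carlo sampling error
```

Since the coefficients (1 − 1/p) and (1 + 1/p) both blow up as p → 0, the formula also shows why noise amplification is worst where the backscatter is weakly polarized, motivating the distance-guided de-noising step I_dn = F_W(I_n).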