
Skylight polarization patterns under urban obscurations and a navigation method adapted to urban environments

Open Access

Abstract

Owing to its strong resistance to interference and to cumulative errors, polarized skylight navigation technology has been developed. However, in urban environments, where demand is extensive, the sky is usually partially obscured by buildings and trees. The polarization patterns of such urban landscape obscurations have not been sufficiently studied, yet they can greatly influence navigation accuracy. In this paper, we study the polarization patterns generated by obscurations and summarize the impacts of obscured urban sky scenes on navigation results. We also propose a full-sky polarization imaging navigation method adapted to urban environments. A compact full-sky polarimeter is established, and a pattern inpainting algorithm based on the convolution operation is proposed to amend the navigation errors caused by obscurations. Among 174 sets of comparative experiments, 90.2% of the extraction results are improved after inpainting, which verifies the effectiveness and robustness of the method. We also discuss the optimization of the algorithm parameters and provide recommended values. This work offers a novel approach to overcoming the impacts of obscurations in urban polarization navigation.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Navigation and positioning technology plays an important role in all walks of life, and demand is especially large in urban environments, from mobile robots to autonomous driving [1,2]. The modern urban environment also presents challenges to conventional navigation methods: high-rise buildings and trees can block satellite signals and disable the global navigation satellite system; complex electromagnetic environments can disturb the geomagnetic navigation system [3]; and the error of the inertial navigation system accumulates over time [4]. Thus, the bionic polarization navigation system, which is less susceptible to cumulative errors and electromagnetic interference, has drawn much attention as an aided method [5].

Bionic polarization navigation is inspired by the phenomenon that insects navigate depending on polarized skylight [6–9]. According to Rayleigh scattering theory, particles in the atmosphere scatter unpolarized sunlight and change its polarization state [10–12]. An ideal model of the resulting skylight polarization pattern, characterized by the angle of polarization (AOP) and the degree of polarization (DOP), is shown in Fig. 1. The pattern is determined by the position of the sun in the sky. Since the pattern is stable and regular, the solar meridian, which crosses the zenith and the sun in the AOP distribution, can be used as a datum direction for navigation. Therefore, as shown in Fig. 1(d), the solar meridian can be extracted from the AOP distribution observed by the carrier, and the azimuth of the carrier can be determined from the angle ${\theta}$ between its heading direction and the solar meridian [13]. Compared with other navigation methods, the bionic polarization method obtains an instantaneous absolute azimuth, so it is excellent at resisting cumulative errors. However, in urban environments, the sky scene is usually partially obscured by high-rise buildings, trees, and clouds. Whatever the obscuration is, the polarization pattern in the obscured region is no longer generated by atmospheric Rayleigh scattering. Moreover, the location, shape, and size of obscurations are unpredictable. As a consequence, obscurations can randomly destroy the regular pattern of polarized skylight and prevent the correct solar meridian from being extracted. The obtained azimuth of the carrier is then wrong, and the accuracy of navigation is fundamentally reduced.


Fig. 1. Ideal model of the skylight polarization pattern in the sky. (a) The coordinate system, where O represents the observation position of the carrier, (b) 3D AOP distribution, (c) 3D DOP distribution, (d) AOP distribution observed by the carrier, and (e) DOP distribution observed by the carrier.


To detect the skylight polarization pattern, previously published studies generally fall into two categories: point-source measurement and imaging measurement [14]. The former can only measure the polarization of one point or a few points in the sky, which makes it extremely vulnerable to obscurations [15–17]. Imaging measurement can capture the skylight polarization pattern of part of, or even the entire, hemispherical sky region. Given that the pattern over a large region is difficult to completely change or obscure, various types of imaging polarimeters based on multiple cameras, a rotary polarizer, or a liquid crystal variable retarder (LCVR) have been proposed.

The multi-camera polarimeter comprises at least three cameras [14,18,19]. Tang et al. [18] extracted the solar meridian through straight-line fitting using a multi-camera polarimeter. They also introduced a pulse-coupled neural network for denoising, which helped resist interference from obscurations and improve extraction accuracy. A rotary polarizer equipped with a camera is an alternative structure [20–23]. Lu et al. [20] proposed a simple and reliable algorithm to extract the solar meridian based on the Hough transform and applied it to a rotary polarizer device. Guan et al. [23] also utilized this type of polarimeter to study skylight polarization patterns over the sea and explored the possibility of marine polarization navigation. However, the polarizer must be adjusted within a single measurement, which can cause synchronization problems among images during mobile navigation. The polarimeter based on an LCVR can realize rapid detection [24–26]. Zhao et al. [26] studied the effect of aerosol optical thickness and clouds on the skylight polarization pattern using an LCVR polarimeter. They found that the symmetry of the AOP distribution remains steady under most sky conditions, and thereby proposed a navigation algorithm to find the solar meridian. Moreover, many researchers have studied the influences of different weather conditions on the skylight polarization pattern, and various navigation algorithms have been proposed to address these impacts [27,28]. However, existing publications pay little attention to typical urban landscape obscurations such as buildings and trees. Owing to complex optical processes, including scattering and reflection, it is difficult to simulate the polarization patterns of these obscurations. The obscured polarization patterns remain unknown, and the corresponding impacts on navigation results need further research. Given that obscurations can damage the skylight polarization pattern, the reliability of existing methods is hard to guarantee, and a specific navigation method is required.

In this paper, we study the effects of urban obscurations on polarization navigation and propose a full-sky polarization imaging navigation method suitable for applications under urban conditions. We introduce a compact polarimeter to detect the skylight polarization pattern with a hemispheric FOV and high speed. Using this instrument, we study the polarization patterns generated by typical obscurations in urban environments, and analyze the navigation accuracy of sky scenes with different extent of obscurations. We also propose a pattern inpainting algorithm based on the convolution operation adapted to partially obscured sky scenes. Through the proposed algorithm, the incorrect polarization patterns generated by obscurations are removed and the correct skylight polarization patterns in the obscured regions are restored. To verify the effectiveness and robustness of the method, we perform comparative experiments and provide some discussions to explore the optimization of parameters in the algorithm. A conclusion is drawn in the last section.

2. Full-sky polarization imaging navigation method

The flowchart of the proposed full-sky polarization imaging navigation method is shown in Fig. 2. Initially, a polarization image of the sky scene is collected with the full-sky imaging polarimeter. Subsequently, the collected image is processed by the navigation algorithm. The navigation algorithm is designed to extract the correct solar meridian even when the sky is partially obscured, and thus to obtain the correct azimuth of the carrier. In the algorithm, the polarization pattern is calculated from the image on the basis of the Stokes vector. Then, the pattern inpainting is the process of removing the incorrect polarization patterns generated by obscurations and restoring the correct skylight polarization patterns in the obscured regions. The obscured regions are screened out by a generated mask, and the skylight polarization pattern is restored using a convolution operation. Lastly, the solar meridian is extracted from the inpainted skylight polarization pattern via the Hough transform.


Fig. 2. Flowchart of the proposed full-sky polarization imaging navigation method.


2.1 Full-sky polarization imaging polarimeter

To detect the skylight polarization pattern, a compact full-sky polarization imaging polarimeter is set up. The typical internal structure of the polarimeter is shown in Fig. 3(a); it is composed of a fisheye objective and a polarization complementary metal oxide semiconductor (CMOS) image detector. A fisheye objective with a hemispheric FOV is placed in front to capture images of the full sky. Each pixel unit in the CMOS detector is covered by a nanowire-grid polarizer with a fixed direction, and all the polarizers form an array. Four adjacent pixel units in the array are regarded as a superpixel, and the polarization angles of the polarizers within the superpixel are 90°, 45°, 135°, and 0°. When the incident light reaches the polarizer, the portion polarized perpendicular to the wire-grid axis passes and is received by the photodiode in the CMOS, whereas the parallel portion is reflected or absorbed. Therefore, four polarization images with different polarization angles can be obtained in a single shot using the pixel segmentation algorithm shown in Fig. 3(b).
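As a sketch of this pixel segmentation, the mosaic can be split with simple strided slicing. The function name and the exact 2×2 layout (90° top-left, 45° top-right, 135° bottom-left, 0° bottom-right) are illustrative assumptions; the real layout depends on the detector.

```python
import numpy as np

def split_superpixels(raw):
    """Split a polarization mosaic into four single-angle images.

    Assumes the 2x2 superpixel layout sketched in Fig. 3:
    90° top-left, 45° top-right, 135° bottom-left, 0° bottom-right.
    `raw` is an (H, W) intensity array with even H and W.
    """
    return {
        90:  raw[0::2, 0::2],   # I_out(90°)
        45:  raw[0::2, 1::2],   # I_out(45°)
        135: raw[1::2, 0::2],   # I_out(135°)
        0:   raw[1::2, 1::2],   # I_out(0°)
    }
```

Each returned image has half the resolution of the mosaic, which matches the superpixel arrangement described above.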


Fig. 3. (a) Typical diagram of the internal structure of the full-sky polarization imaging polarimeter. The pixels with polarization angles of 90°, 45°, 135°, and 0° are represented by red, yellow, green, and blue, respectively. (b) Schematic of the pixel segmentation algorithm. According to the structure of the polarizer array, a superpixel is divided into four pixel units, which are represented by ${I_{out}}({90^ \circ })$, ${I_{out}}({45^ \circ })$, ${I_{out}}({135^ \circ })$, ${I_{out}}({0^ \circ })$, respectively. The pixel units with the same polarization angle are combined to form four images.


The proposed polarimeter can detect the skylight polarization pattern with a hemispheric FOV and good real-time performance. Given the unpredictable locations, shapes, and sizes of obscurations, the hemispheric FOV helps reduce the proportion of the obscured area in the image and thus improves the ratio of available skylight polarization pattern. Because the proposed polarimeter detects the skylight polarization pattern in a single shot, it also eliminates the nonsynchronous errors caused by a rotary polarizer and is more suitable for mobile navigation. In addition, compared with other polarimeters, the proposed structure is more compact and easier to operate.

2.2 Navigation algorithm

Although the polarization image is acquired with a large FOV, the effects of obscurations still exist. If orientation information is extracted from the unprocessed polarization pattern, the accuracy and reliability are difficult to guarantee. Therefore, we propose a navigation algorithm to inpaint the skylight polarization pattern and amend the navigation errors caused by urban landscape obscurations.

The flowchart of the navigation algorithm for partially obscured urban sky scenes is shown in Fig. 4. The detailed steps of this algorithm are provided as follows.


Fig. 4. Flowchart of the navigation algorithm for a partially obscured sky scene. (a) AOP distribution, (b) DOP distribution, (c) MASK diagram, (d) valid AOP distribution, (e) inpainted AOP distribution, (f) transform results in H space, and (g) extraction result of the solar meridian as a white solid line.


Step 1: The skylight polarization pattern is calculated on the basis of the Stokes vector from the segmented polarization images [29,30]. The AOP and DOP values of each superpixel are calculated using Eq. (1) [23]:

$$\left\{ {\begin{array}{*{20}{l}} {AOP = \frac{1}{2}\arctan \left( {\frac{U}{Q}} \right)}\\ {DOP = \frac{{\sqrt {{Q^2} + {U^2}} }}{I}} \end{array}} \right., $$
where I, Q, and U represent the first three components of the Stokes vector of the scattered light and can be calculated using Eq. (2) [31]. Since there is almost no circularly polarized component in the skylight [10], the last component of the Stokes vector is negligible.
$$\left\{ {\begin{array}{*{20}{l}} {I = {I_{out}}({0^ \circ }) + {I_{out}}({{90}^ \circ })}\\ {Q = {I_{out}}({0^ \circ }) - {I_{out}}({{90}^ \circ })}\\ {U = 2 \times {I_{out}}({{45}^ \circ }) - {I_{out}}({0^ \circ }) - {I_{out}}({{90}^ \circ })} \end{array}} \right.. $$

The result of each superpixel is arranged in arrays, and then AOP distribution and DOP distribution are formed as shown in Figs. 4(a) and (b).
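Equations (1) and (2) translate directly into array operations; a minimal sketch (function name ours) follows. We use `np.arctan2` rather than a plain `arctan(U/Q)` so that Q = 0 is handled and the AOP covers the full ±90° range — an implementation choice, not something specified in the paper.

```python
import numpy as np

def stokes_aop_dop(i0, i45, i90):
    """Compute AOP (radians) and DOP per Eqs. (1)-(2) from three
    single-angle intensity images I_out(0°), I_out(45°), I_out(90°)."""
    I = i0 + i90                      # total intensity
    Q = i0 - i90                      # Stokes Q
    U = 2.0 * i45 - i0 - i90          # Stokes U
    aop = 0.5 * np.arctan2(U, Q)      # in (-pi/2, pi/2]
    dop = np.sqrt(Q**2 + U**2) / I    # assumes I > 0 (no dark pixels)
    return aop, dop
```

Applied per superpixel, this yields the AOP and DOP arrays shown in Figs. 4(a) and (b).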

Step 2: A MASK is generated to distinguish invalid areas from the sky areas by setting thresholds for DOP and the DOP gradient.

Under- and overexposed areas manifest as excessively low values in the DOP distribution [32,33]. Moreover, almost no natural light scattered by the atmosphere is completely polarized [10]. Thus, lower and upper limits for the DOP should be set to exclude these cases. The DOP gradient usually remains low across the full sky; however, when the sky is blocked by obscurations or clouds, disruptions can occur in the DOP distribution, leading to a large gradient [19,34,35]. Thus, an upper limit for the DOP gradient should be set to exclude obscured areas.

For the pixel at the coordinate $(i,j)$, the validity of the pixel can be determined following the criteria described in Eq. (3):

$$MASK(i,j) = \left\{ {\begin{array}{*{20}{l}} {1,}&{DO{P_{\min }} < DOP(i,j) < DO{P_{\max }}}\\ {}&{\textrm{and } gradient(DOP(i,j)) < {G_{\max }}}\\ {0,}&{\textrm{otherwise}} \end{array}} \right., $$
where $DOP(i,j)$ and $gradient(DOP(i,j))$ represent the DOP and the DOP gradient at $(i,j)$, respectively. $DO{P_{\min }}$ and $DO{P_{\max }}$ represent the lower and upper limits of the DOP, and ${G_{\max }}$ represents the upper limit of the DOP gradient. If the DOP and the DOP gradient meet these criteria, the pixel is regarded as valid and reserved, and $MASK(i,j)$ is set to 1. Otherwise, we regard the pixel as invalid and set $MASK(i,j)$ to 0; the original $AOP(i,j)$ is discarded and amended through the following steps. This is a straightforward algorithm and does not require iteration.

Thus, a MASK matrix covering only valid sky regions and excluding obscurations is generated pixel by pixel. The three thresholds $DO{P_{\min }}$, $DO{P_{\max }}$ and ${G_{\max }}$ together determine the removal effect of obscurations, and the detailed discussions are in Sec. 4. For instance, the generated MASK matrix is shown in Fig. 4(c) when $DO{P_{\min }}$, $DO{P_{\max }}$, and ${G_{\max }}$ are 0.02, 0.9, and 0.02, respectively. Herein, $MASK(i,j)$ of the black area equals 0, whereas that of the white area equals 1. The valid skylight polarization pattern screened out by the MASK matrix is shown in Fig. 4(d).
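The thresholding of Eq. (3) can be sketched as follows. `np.gradient` serves as one possible discrete gradient operator (the paper does not specify one), and the default thresholds follow the example values in the text (0.02, 0.9, 0.02), which are instrument-specific.

```python
import numpy as np

def build_mask(dop, dop_min=0.02, dop_max=0.9, g_max=0.02):
    """Validity MASK of Eq. (3): 1 for valid sky pixels, 0 otherwise.

    The DOP gradient magnitude is approximated with np.gradient;
    the threshold defaults follow the example in the text.
    """
    gy, gx = np.gradient(dop)
    grad = np.hypot(gx, gy)
    valid = (dop > dop_min) & (dop < dop_max) & (grad < g_max)
    return valid.astype(np.uint8)
```

Pixels with abnormal exposure fail the DOP limits, while obscuration boundaries fail the gradient limit, so both are zeroed out in one pass.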

Step 3: For the invalid areas excluded by the MASK, the skylight polarization pattern is inpainted.

According to Rayleigh scattering theory, the skylight polarization pattern is determined by the relative position of the scattering particles and the sun, which means that a strong correlation exists between the AOP value of a single pixel and its adjacent pixels. In fact, the AOP of the ideal skylight polarization pattern is continuous [36]. This property inspired us to inpaint the obscured region in AOP distribution on the basis of its neighborhood. The image convolution operation [37], which changes the value of a single pixel according to its neighborhood, is well suited for this purpose. Considering that $MASK(i,j)$ only equals 1 in valid areas and equals 0 in invalid areas, using a neighbor matrix of $MASK(i,j)$ as a convolution kernel can eliminate the influences of the invalid pixels naturally.

For each invalid pixel at the coordinate $(i,j)$, where $MASK(i,j)$ equals 0 and the original $AOP(i,j)$ is discarded, $AO{P_{inpainting}}(i,j)$ is calculated using Eq. (4) and filled into the AOP distribution. A diagram of this process is shown in Fig. 5.

$$AO{P_{inpainting}}(i,j) = \frac{{AO{P_{pad(i,j)}} \otimes MAS{K_{pad(i,j)}}}}{{sum(MAS{K_{pad(i,j)}})}}. $$


Fig. 5. Principle of inpainting AOP distribution through convolution. To make the principle easier to understand, the r in this diagram is 1.


As described in Eqs. (5) and (6), $AO{P_{pad(i,j)}}$ is the neighbor matrix around the coordinate $(i,j)$ in the AOP distribution, and $MAS{K_{pad(i,j)}}$ is the corresponding neighbor matrix in MASK. $sum(MAS{K_{pad(i,j)}})$ represents the sum of all elements of the latter matrix, and r determines the size of the neighbor matrix.

$$AO{P_{pad(i,j)}} = \left[ {\begin{array}{*{20}{c}} {AOP(i - r,j - r)}& \cdots &{AOP(i - r,j + r)}\\ \vdots & \ddots & \vdots \\ {AOP(i + r,j - r)}& \cdots &{AOP(i + r,j + r)} \end{array}} \right]. $$
$$MAS{K_{pad(i,j)}} = \left[ {\begin{array}{*{20}{c}} {MASK(i - r,j - r)}& \cdots &{MASK(i - r,j + r)}\\ \vdots & \ddots & \vdots \\ {MASK(i + r,j - r)}& \cdots &{MASK(i + r,j + r)} \end{array}} \right]. $$

In addition, the result of the convolution operation is calculated as Eq. (7):

$$AO{P_{pad(i,j)}} \otimes MAS{K_{pad(i,j)}} = \sum\limits_{m = 1}^{2r + 1} {\sum\limits_{n = 1}^{2r + 1} {AO{P_{pad(i,j)}}(m,n)} } MAS{K_{pad(i,j)}}(m,n). $$

The parameter r, which determines the reference scope of the convolution operation, greatly affects the inpainting effect. The detailed discussions are in Sec. 4. For instance, the inpainted AOP distribution with r equal to 10 is shown in Fig. 4(e).
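Equations (4)–(7) amount to a masked local mean; a direct (unoptimized) sketch is given below, with zero padding at the borders so that out-of-image neighbors count as invalid. The function name is ours. Note that naively averaging angles is only safe away from the ±90° wrap-around, a caveat the equations share.

```python
import numpy as np

def inpaint_aop(aop, mask, r=10):
    """Inpaint invalid AOP pixels per Eqs. (4)-(7): each pixel with
    mask == 0 becomes the mean of the valid pixels in its
    (2r+1) x (2r+1) neighborhood."""
    pad_aop = np.pad(aop * mask, r)            # invalid pixels contribute 0
    pad_msk = np.pad(mask.astype(float), r)
    out = aop.copy()
    for i, j in zip(*np.where(mask == 0)):
        win_a = pad_aop[i:i + 2 * r + 1, j:j + 2 * r + 1]
        win_m = pad_msk[i:i + 2 * r + 1, j:j + 2 * r + 1]
        n = win_m.sum()
        if n > 0:                              # no valid neighbor: leave as is
            out[i, j] = (win_a * win_m).sum() / n
    return out
```

In production one would vectorize the two sums with a single convolution over the whole image (e.g. a box filter applied to `aop * mask` and to `mask`), which is mathematically identical to the per-pixel loop above.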

Step 4: The solar meridian is extracted from the inpainted AOP distribution. Because the AOP values along the solar meridian are close to 90° or −90°, feature points can be selected and the Hough transform [20,38] is conducted to figure out the direction of the solar meridian.

A threshold method is used to find the feature points with AOP values close to 90° or −90°, and the tolerance of the threshold is set to 0.2° as Eq. (8):

$$- {90^ \circ } \le AO{P_{inpainting}}(i,j) \le - {89.8^ \circ }\textrm{ or } {89.8^ \circ } \le AO{P_{inpainting}}(i,j) \le {90^ \circ }. $$

Then, the Hough transform is applied to those selected points, and the result in H space is shown in Fig. 4(f). The abscissa of the highest peak, 83°, represents the direction angle of the solar meridian. The extraction result of the solar meridian is shown with the white solid line in Fig. 4(g).
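Since the solar meridian passes through the zenith, which maps to the image centre, Step 4 can be sketched as an angular histogram of the feature points of Eq. (8) about the centre. This is a simplification of the full Hough transform of Refs. [20,38], which does not assume the line passes through a known point; the function name and binning are ours.

```python
import numpy as np

def solar_meridian_angle(aop_deg, tol=0.2, n_bins=180):
    """Estimate the solar meridian direction (degrees, in [0, 180)) from an
    inpainted AOP map given in degrees. Feature points are pixels whose
    AOP lies within `tol` of +/-90° (Eq. (8))."""
    h, w = aop_deg.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0      # zenith assumed at image centre
    ys, xs = np.where(np.abs(aop_deg) >= 90.0 - tol)
    theta = np.degrees(np.arctan2(ys - cy, xs - cx)) % 180.0
    hist, edges = np.histogram(theta, bins=n_bins, range=(0.0, 180.0))
    k = int(hist.argmax())
    return 0.5 * (edges[k] + edges[k + 1])     # centre of the peak bin
```

The histogram plays the role of the H-space accumulator: noise points spread over many bins, while the feature points along the meridian pile into one dominant peak.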

3. Experiments and results

To study the polarization patterns generated by obscurations and verify the robustness of the proposed navigation method, we conducted outdoor experiments under 174 different obscured conditions between January 11 and January 23, 2021. The experimental site was at Beijing Institute of Technology (geographical coordinates: 116°19'15"E, 39°57'55"N), and the weather was sunny with good air quality.

3.1 Experimental system

The experimental system is shown in Fig. 6, in which the full-sky polarization imaging polarimeter was used for data acquisition. The computer was used for data processing and storage. The Power over Ethernet (PoE) module supplied power to the polarimeter and transmitted data between the polarimeter and the computer. The 220 V power module supplied the computer and the PoE module. Because of the cold weather, we placed the power module in a foam box for safety.


Fig. 6. Experimental system.


The overall size of the full-sky polarization imaging polarimeter was 30 mm×30 mm×85 mm (length×width×height); it consisted of a FUJIFILM FE185C057HA-1 fisheye lens and a SONY IMX250MYR CMOS polarization image detector. The focal length of the fisheye lens was 1.8 mm, with an effective FOV of 185°×185° (H×V). The CMOS chip was composed of 2448×2048 pixel units, and the size of each pixel unit was 3.45 µm×3.45 µm. The polarizers covering the CMOS surface had the same pixel size as the CMOS chip. The optical axis of the polarimeter was adjusted to point vertically toward the sky using a gradienter.

3.2 Urban scenes explorations

In this section, we study the AOP and DOP distribution generated by typical urban obscurations: buildings and trees. On this basis, the obscured urban sky scenes are classified and explored, and the accuracy of the proposed method is verified.

The typical polarization pattern of buildings is illustrated in Fig. 7, where buildings are divided into walls and glass for discussion. In the AOP distribution shown in Fig. 7(b), buildings mainly manifest as high-frequency noise points where contours are located or the reflection conditions change, especially at the edges of buildings and the boundaries between walls and glass. Given that the solar meridian is characterized by an AOP close to 90° or −90°, these noises can greatly influence the extraction result. In contrast, the AOP of walls is usually close to zero without messy noise, as in zone 1. Owing to the reflection of sunlight, the AOP of glass varies greatly in zone 2, and fewer noise points exist in the glass region. As for the DOP distribution in Fig. 7(c), walls tend to exhibit relatively uniform and low DOP values, as in zone 3. Exceptions appear in the wall near the sun azimuth in zone 4, where DOP values increase. The DOP of glass also varies greatly owing to reflections, as shown in zone 5.


Fig. 7. Influence of dense building obscurations on the skylight polarization pattern. (a) One of the original polarization images, (b) AOP distribution, (c) DOP distribution.


The typical polarization pattern of trees is shown in Fig. 8. The interference caused by trees is determined by their sizes to some extent. From zone 1 to zone 3 in Fig. 8(b), the sizes of the branches increase, and the noise in the AOP distribution increases as well. The thinnest branches in zone 1 generate only textured noise, which does not change the skylight polarization pattern. The thicker trunks, shown in zones 2 and 3, produce denser noise points and disrupt the regularity of the pattern. In the DOP distribution in Fig. 8(c), trees generally show a low DOP value, but the DOP at their junction with the sky is higher, highlighting the outline of the branches.


Fig. 8. Influence of dense tree obscurations on the skylight polarization pattern. (a) One of the original polarization images, (b) AOP distribution, (c) DOP distribution.


Based on the typical polarization patterns of urban obscurations analyzed above, we classify the urban sky scenes into three categories: mild obscuration, moderate obscuration, and severe obscuration. Extraction results of the solar meridian under different obscured conditions are shown in Fig. 9, and the impacts caused by obscurations are summarized. We also give the results after inpainting here, which illustrate the effectiveness of the proposed method.


Fig. 9. Comparisons of AOP results before and after inpainting. From left to right, the figures show the extraction result before inpainting, H space before inpainting, the extraction result after inpainting, and H space after inpainting. The white solid lines are the extracted solar meridians. (a) Under mild obscuration, (b)–(d) under moderate obscuration, and (e)–(g) under severe obscuration.


Under mild obscuration, as shown in Fig. 9(a), the proportion of obscurations in the sky scene is lower than 40%. The experimental site is open, and few obscurations exist at the outer edge of the sky scene. The large valid sky region manifests as a clean AOP distribution. Although high-frequency noises caused by the outer obscurations form some interference peaks in H space, the solar meridian is extracted correctly from the original AOP distribution. For this kind of obscured condition, the proposed algorithm can remove the noises and restore a clean AOP distribution. Moreover, side peaks in H space are removed completely, and the correct peak is preserved.

Moderately obscured sky scenes, with 40%–55% of the sky obscured, are shown in Figs. 9(b)–(d). In Fig. 9(b), the obscurations are far from the experimental site and not too high, so they do not occupy a large sky region. In the conditions shown in Figs. 9(c) and (d), although a large region is obscured, the obscurations are strip-shaped objects, such as railings and gridding, in the gaps of which small sky pieces are exposed. The obscurations generate a large number of noise points in the AOP distribution, and some side peaks are even higher than the correct peak in H space. For this kind of obscured condition, the proposed algorithm can still achieve satisfactory results. The majority of noises can be removed, and a relatively pure and smooth AOP distribution can be restored. In H space, invalid side peaks are cut down, so the correct peak is well retained and highlighted, and the solar meridian is extracted accurately.

Under the severely obscured conditions presented in Figs. 9(e)–(g), the experimental sites are between high-rise buildings or dense trees. Obscurations take up more than 55% of the sky scene, and the extraction accuracy is greatly impacted. In this case, owing to the insufficiency of the original information, the proposed method cannot restore a pure and smooth AOP distribution, and the interference caused by glass reflection is not resolved. Nevertheless, it can still reduce high-frequency noises and break up the contours of the obscurations. The heights of the interference peaks are cut down in H space, which increases the probability that the correct solar meridian is finally extracted.

In summary, the results demonstrate that the proposed method can improve the accuracy of direction extraction under various extents of obscuration. Compared with the extraction results without inpainting, high-frequency noises are markedly reduced and the invalid side peaks in H space are cut down. Moreover, the proposed algorithm is highly sensitive to the contours of obscurations and well adapted to the characteristics of urban obscurations.

3.3 Statistical results

To verify the robustness and environmental adaptability of the proposed method, we compare the extraction results of 174 sets of collected data before and after inpainting according to the following rules. If the solar meridian is extracted correctly after inpainting but incorrectly before inpainting, the extraction result is considered improved. Otherwise, H space is examined: if the ratio of the height of the correct peak to that of the interference peak increases after inpainting, the extraction result is also considered improved. In all other cases, the extraction result after inpainting is regarded as the same as or worse than that before inpainting.

The statistical results are shown in Table 1. Among the 174 sets of data, 157 sets show that the extraction results after inpainting are better than those before inpainting, accounting for 90.2%; 15 sets are the same before and after inpainting, accounting for 8.6%; and only 2 sets perform worse after inpainting, accounting for 1.2%. The statistical results show that the proposed inpainting algorithm works well in the majority of urban environments and possesses good robustness and environmental adaptability.


Table 1. Statistical Results of the Comparative Experiments

4. Discussions

To improve the performance of this method under urban obscurations, we conduct a set of experiments to determine the optimal parameter values. The navigation algorithm comprises four adjustable parameters: ${G_{\max }}$, $DO{P_{\max }}$, $DO{P_{\min }}$, and r.

4.1 Optimization of ${G_{\max }}$

${G_{\max }}$, as a key parameter of the threshold criteria in Eq. (3), determines the removal effect of the algorithm on obscurations. Taking the polarization image shown in Fig. 10(a) as an example, we vary ${G_{\max }}$ and collect the error rates of the MASKs in the obscuration and sky areas. Here, the error rate equals the ratio of the number of incorrectly screened points to the total number of points. As the statistical results in Fig. 10(e) show, the opposite trends of the two lines indicate that the optimization of ${G_{\max }}$ is a balance between removing obscurations and preserving sky areas. The different slopes of the two lines are also worth noting: as ${G_{\max }}$ increases, the error rate of the sky region (red line) decreases rapidly at first and then levels off toward 0, whereas the error rate of the obscured region (blue line) keeps increasing at an almost unchanged rate. Thus, the threshold value 0.017 in Fig. 10(e) is appropriate for ${G_{\max }}$. Since the calculation of ${G_{\max }}$ is related to the resolution and FOV of the imaging polarimeter, this value is only applicable to this system.
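The text defines the error rate as the ratio of screening-error points to total points; one plausible formalization against a hand-labeled ground-truth sky mask (our assumption, including the names) is:

```python
import numpy as np

def mask_error_rates(mask, sky_truth):
    """Error rates used to tune the thresholds in Sec. 4 (assumed
    formalization): fraction of true-sky pixels wrongly removed, and
    fraction of true-obscuration pixels wrongly kept."""
    sky = sky_truth.astype(bool)
    err_sky = float((mask[sky] == 0).mean())    # sky wrongly removed
    err_obs = float((mask[~sky] == 1).mean())   # obscuration wrongly kept
    return err_sky, err_obs
```

Sweeping a threshold and plotting these two rates reproduces the kind of trade-off curves discussed for Figs. 10(e) and 11(e).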


Fig. 10. MASKs generated with different ${G_{\textrm{max} }}$ and the error rates of the MASKs. (a) One of the original polarization images; (b)–(d) MASKs generated when ${G_{\textrm{max} }}$ equals to 0.005, 0.017, and 0.025, respectively; (e) error rates of the MASKs depending on ${G_{\textrm{max} }}$.


The results are supported by Figs. 10(b)–(d), which exhibit the MASKs when ${G_{\max }}$ equals 0.005, 0.017, and 0.025, respectively. The MASK in Fig. 10(b) filters the obscurations best, but it excessively removes valid pixels; these misclassified points are interspersed across the full sky region, exerting a negative impact on navigation. On the contrary, the MASK in Fig. 10(d) preserves the sky well but fails to remove the noises from the obscurations effectively. Only the MASK shown in Fig. 10(c), with ${G_{\max }}$ equal to 0.017, takes both aspects into account, whereby high-frequency noises on obscurations are removed and the sky is preserved.

4.2 Optimization of $DO{P_{\min }}$

$DO{P_{\max }}$ and $DO{P_{\min }}$ are the upper and lower limits of the valid DOP in Eq. (3). As discussed in Sec. 2.2, $DO{P_{\max }}$ is set to remove completely polarized regions from the polarization pattern, so it can generally take a value of 0.9. $DO{P_{\min }}$ is set to remove the under- and overexposed areas. Similar to the analysis in Sec. 4.1, MASKs generated with different $DO{P_{\min }}$ and the error rates of the MASKs are displayed in Fig. 11. As Fig. 11(e) suggests, a small $DO{P_{\min }}$ is a loose threshold criterion: the sky area is well reserved, but the under- and overexposed areas cannot be effectively removed. An exceedingly large $DO{P_{\min }}$ leads to the opposite. The absolute values of the slopes of the two lines are similar, so a $DO{P_{\min }}$ around their point of intersection is suitable. Accordingly, 0.016 is a suitable value specific to this scene.


Fig. 11. MASKs generated with different $DO{P_{\min }}$ and the error rates of the MASKs. The original polarization image is the same as that in Fig. 10. (a)–(d) MASKs generated when $DO{P_{\min }}$ equals 0.01, 0.016, 0.02, and 0.025, respectively; (e) error rates of the MASKs depending on $DO{P_{\min }}$.


The removal of under- and overexposed areas shown in Figs. 11(a)–(d) is consistent with the above discussion. Figures 11(a)–(d) visualize the MASKs generated when $DO{P_{\min }}$ equals 0.01, 0.016, 0.02, and 0.025, respectively. Among them, the MASK in Fig. 11(a) cannot effectively remove pixels with abnormal exposure, whereas the MASKs in Figs. 11(c) and 11(d) remove excessive sky area. Only the MASK in Fig. 11(b), with $DO{P_{\min }}$ equal to 0.016, removes as many noise points as possible while preserving the sky area.
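The DOP band-pass described above can be written as a one-line mask. This is a minimal sketch under our own naming (the paper defines the criterion in Eq. (3)): a pixel is valid only if its degree of polarization lies between $DO{P_{\min }}$ and $DO{P_{\max }}$.

```python
import numpy as np

def dop_mask(dop, dop_min=0.016, dop_max=0.9):
    """Keep pixels whose degree of polarization lies in [dop_min, dop_max].

    Values below dop_min typically correspond to under-/overexposed
    areas; values above dop_max correspond to (nearly) completely
    polarized regions such as specular reflections. Sketch of the
    thresholding discussed for Eq. (3); default values follow Sec. 4.2.
    """
    return (dop >= dop_min) & (dop <= dop_max)
```

In practice this mask would be combined (logical AND) with the gradient criterion controlled by ${G_{\max }}$ to form the final MASK.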

4.3 Optimization of $r$

$r$ is a key parameter in Eqs. (5) and (6) that mainly determines the effect of pattern inpainting. Taking the streetlamps shown in Fig. 12(a) as an example, we explore the optimization of r. This is a 45×45-pixel area containing three streetlamps, each approximately 10×10 pixels in size. The distribution of AOP is shown in Fig. 12(b), and the MASK generated when $DO{P_{\min }}$, $DO{P_{\max }}$, and ${G_{\max }}$ equal 0.02, 0.9, and 0.02, respectively, is shown in Fig. 12(c).


Fig. 12. Streetlamps as a typical obscuration. (a) Polarization image, (b) AOP distribution, and (c) MASK.


We inpaint the invalid area screened out by the MASK with different r values, and the results are shown in Fig. 13. In Figs. 13(a) and (b), when r is much smaller than the obscuration size, the proposed algorithm fails to inpaint all the invalid pixels owing to the lack of reference. In Figs. 13(c) and (d), the reference range extends with increasing r, so the results become smooth and close to the true value. When r exceeds the obscuration size in Figs. 13(e) and (f), a few pixels are calculated erroneously and the pattern is over-inpainted, because points far from the target pixel are introduced into the reference and cause additional errors. Accordingly, the inpainting algorithm works best when r matches the size of the obscurations.
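The mechanism behind this trade-off can be sketched with a convolution-based fill, in the spirit of Eqs. (5) and (6): each invalid pixel is replaced by the mean of the valid pixels within radius r. This is our own simplified illustration (it averages AOP values directly, ignoring the circular wrap-around of angles, and is not the paper's exact formulation):

```python
import numpy as np
from scipy.signal import convolve2d

def inpaint(aop, valid, r):
    """Fill invalid pixels with the mean of valid neighbours within radius r.

    A (2r+1) x (2r+1) box kernel sums the valid AOP values (num) and
    counts the valid pixels (den) around each position; their ratio is
    the local mean used to replace masked pixels. Pixels with no valid
    neighbour in range are left unchanged, which is why a too-small r
    fails to inpaint the centre of a large obscuration.
    """
    kernel = np.ones((2 * r + 1, 2 * r + 1))
    num = convolve2d(aop * valid, kernel, mode="same")
    den = convolve2d(valid.astype(float), kernel, mode="same")
    out = aop.copy()
    fill = (~valid) & (den > 0)          # invalid pixels with references
    out[fill] = num[fill] / den[fill]
    return out
```

With r too small, `den` is zero at the centre of the obscuration and nothing is filled; with r much larger than the obscuration, distant pixels enter the average and smear the pattern, matching the behavior in Figs. 13(e) and (f).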


Fig. 13. Inpainting results with different r values. (a) $r = 1$, (b) $r = 2$, (c) $r = 3$, (d) $r = 5$, (e) $r = 10$, and (f) $r = 15$.


Assuming that the size of the area to be inpainted is m×n pixels, r should satisfy the requirement given in Eq. (9). If the size of the obscuration in the image changes, r can be adapted accordingly to achieve the best effect.

$$\frac{1}{2}\min \{{m,n} \} < r < \max \{{m,n} \}. $$
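The constraint of Eq. (9) is simple enough to check programmatically before inpainting; the function name below is our own illustrative choice:

```python
def r_in_valid_range(r, m, n):
    """Check Eq. (9): min(m, n)/2 < r < max(m, n) for an m x n obscuration.

    r must be large enough that every invalid pixel has at least one
    valid reference in range, yet smaller than the obscuration extent
    so distant pixels do not corrupt the average.
    """
    return 0.5 * min(m, n) < r < max(m, n)
```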

5. Conclusions

In this paper, we have studied the polarization patterns of urban obscurations and proposed a full-sky polarization imaging navigation method adapted to urban environments, which comprises a compact polarimeter and a pattern inpainting algorithm. Typical urban obscurations, such as buildings and trees, were studied with the polarimeter, and the characteristics of the generated polarization patterns were analyzed. We found that obscured sky scenes can be classified into three categories according to the extent of obscuration. We conducted comparative experiments under different obscured conditions, and 90.2% of the navigation results were improved after inpainting, which verifies the robustness and environmental adaptability of the proposed method. Moreover, the optimization of the parameters in the algorithm was discussed and reference values were provided.

Although our work provides an approach to improving the accuracy of polarization navigation in urban environments, some limitations remain. In subsequent research, extending our algorithm to resist glass reflections from buildings may be a focus. Furthermore, building on the idea of inpainting the skylight polarization pattern proposed in this paper, more image-processing methods may be employed in the future.

Funding

National Natural Science Foundation of China (51735002, 61905014).

Acknowledgments

The authors appreciate the valuable help from Dr. Ying Zhang.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. G. N. DeSouza and A. C. Kak, “Vision for mobile robot navigation: A survey,” IEEE Trans. Pattern Anal. Mach. Intell. 24(2), 237–267 (2002). [CrossRef]  

2. I. Skog and P. Händel, “In-car positioning and navigation technologies – a survey,” IEEE Trans. Intell. Transp. Syst. 10(1), 4–21 (2009). [CrossRef]  

3. Y. Wu and S. Luo, “On misalignment between magnetometer and inertial sensors,” IEEE Sens. J. 16(16), 6288–6297 (2016). [CrossRef]  

4. K. Tin Leung, J. F. Whidborne, D. Purdy, and A. Dunoyer, “A review of ground vehicle dynamic state estimations utilising GPS/INS,” Veh. Syst. Dyn. 49(1-2), 29–58 (2011). [CrossRef]  

5. D. Lambrinos, H. Kobayashi, R. Pfeifer, M. Maris, T. Labhart, and R. Wehner, “An autonomous agent navigating with a polarized light compass,” Adapt. Behav. 6(1), 131–161 (1997). [CrossRef]  

6. M. Muller and R. Wehner, “Path integration in desert ants, Cataglyphis fortis,” Proc. Natl. Acad. Sci. 85(14), 5287–5290 (1988). [CrossRef]  

7. S. Rossel and R. Wehner, “How bees analyse the polarization patterns in the sky - Experiments and model,” J. Comp. Physiol. A 154(5), 607–615 (1984). [CrossRef]  

8. S. M. Reppert, H. Zhu, and R. H. White, “Polarized Light Helps Monarch Butterflies Navigate,” Curr. Biol. 14(2), 155–158 (2004). [CrossRef]  

9. M. Dacke, E. Baird, M. Byrne, C. H. Scholtz, and E. J. Warrant, “Dung beetles use the milky way for orientation,” Curr. Biol. 23(4), 298–300 (2013). [CrossRef]  

10. K. Coulson, Polarization and Intensity of Light in the Atmosphere (A. Deepak Pub., 1988).

11. A. J. Brown, “Spectral bluing induced by small particles under the Mie and Rayleigh regimes,” Icarus 239, 85–95 (2014). [CrossRef]  

12. A. J. Brown, T. I. Michaels, S. Byrne, W. Sun, T. N. Titus, A. Colaprete, M. J. Wolff, G. Videen, and C. J. Grund, “The case for a modern multiwavelength, polarization-sensitive LIDAR in orbit around Mars,” J. Quant. Spectrosc. Radiat. Transf. 153, 131–143 (2015). [CrossRef]  

13. B. Suhai and G. Horváth, “How well does the Rayleigh model describe the E-vector distribution of skylight in clear and cloudy conditions? A full-sky polarimetric study,” J. Opt. Soc. Am. A 21(9), 1669–1676 (2004). [CrossRef]  

14. D. Wang, H. Liang, H. Zhu, and S. Zhang, “A bionic camera-based polarization navigation sensor,” Sensors 14(7), 13006–13023 (2014). [CrossRef]  

15. D. Lambrinos, R. Möller, T. Labhart, R. Pfeifer, and R. Wehner, “Mobile robot employing insect strategies for navigation,” Rob. Auton. Syst. 30(1-2), 39–64 (2000). [CrossRef]  

16. C. Jinkui, Z. Kaichun, Z. Qiang, and W. Tichang, “Construction and performance test of a novel polarization sensor for navigation,” Sensors Actuators A Phys. 148(1), 75–82 (2008). [CrossRef]  

17. J. Chahl and A. Mizutani, “Biomimetic attitude and orientation sensors,” IEEE Sens. J. 12(2), 289–297 (2012). [CrossRef]  

18. J. Tang, N. Zhang, D. Li, F. Wang, B. Zhang, C. Wang, C. Shen, J. Ren, C. Xue, and J. Liu, “Novel robust skylight compass method based on full-sky polarization imaging under harsh conditions,” Opt. Express 24(14), 15834–15844 (2016). [CrossRef]  

19. C. Fan, X. Hu, X. He, L. Zhang, and J. Lian, “Integrated Polarized Skylight Sensor and MIMU with a Metric Map for Urban Ground Navigation,” IEEE Sens. J. 18(4), 1714–1722 (2018). [CrossRef]  

20. H. Lu, K. Zhao, Z. You, and K. Huang, “Angle algorithm based on Hough transform for imaging polarization navigation sensor,” Opt. Express 23(6), 7248–7262 (2015). [CrossRef]  

21. H. Lu, K. Zhao, Z. You, and K. Huang, “Real-time polarization imaging algorithm for camera-based polarization navigation sensors,” Appl. Opt. 56(11), 3199–3205 (2017). [CrossRef]  

22. Y. Wang, X. Hu, J. Lian, L. Zhang, Z. Xian, and T. Ma, “Design of a device for sky light polarization measurements,” Sensors 14(8), 14916–14931 (2014). [CrossRef]  

23. L. Guan, S. Li, L. Zhai, S. Liu, H. Liu, W. Lin, Y. Cui, J. Chu, and H. Xie, “Study on skylight polarization patterns over the ocean for polarized light navigation application,” Appl. Opt. 57(21), 6243–6251 (2018). [CrossRef]  

24. Y. Zhang, H. Zhao, P. Song, S. Shi, W. Xu, and X. Liang, “Ground-based full-sky imaging polarimeter based on liquid crystal variable retarders,” Opt. Express 22(7), 8749–8764 (2014). [CrossRef]  

25. Y. Zhang, H. Zhao, and N. Li, “Polarization calibration with large apertures in full field of view for a full Stokes imaging polarimeter based on liquid-crystal variable retarders,” Appl. Opt. 52(6), 1284–1292 (2013). [CrossRef]  

26. H. Zhao, W. Xu, Y. Zhang, X. Li, H. Zhang, J. Xuan, and B. Jia, “Polarization patterns under different sky conditions and a navigation method based on the symmetry of the AOP map of skylight,” Opt. Express 26(22), 28589–28603 (2018). [CrossRef]  

27. M. Hamaoui, “Polarized skylight navigation,” Appl. Opt. 56(3), B37–B46 (2017). [CrossRef]  

28. H. Liang, H. Bai, N. Liu, and X. Sui, “Polarized skylight compass based on a soft-margin support vector machine working in cloudy conditions,” Appl. Opt. 59(5), 1271–1279 (2020). [CrossRef]  

29. W. McMaster, “Polarization and the Stokes Parameters,” Am. J. Phys. 22(6), 351–362 (1954). [CrossRef]  

30. Q. Li, Y. Hu, S. Zhang, J. Cao, and Q. Hao, “Calibration and image processing method for polarized skylight sensor,” Proc. SPIE 11550, 115500F (2020). [CrossRef]  

31. A. J. Brown, “Equivalence relations and symmetries for laboratory, LIDAR, and planetary Müeller matrix scattering geometries,” J. Opt. Soc. Am. A 31(12), 2789–2794 (2014). [CrossRef]  

32. G. Horváth, A. Barta, J. Gál, B. Suhai, and O. Haiman, “Ground-based full-sky imaging polarimetry of rapidly changing skies and its use for polarimetric cloud detection,” Appl. Opt. 41(3), 543–559 (2002). [CrossRef]  

33. N. J. Pust and J. A. Shaw, “Digital all-sky polarization imaging of partly cloudy skies,” Appl. Opt. 47(34), H190–H198 (2008). [CrossRef]  

34. K. N. Liou and C. Bohren, “An Introduction to Atmospheric Radiation,” Phys. Today 34(7), 66–67 (1981). [CrossRef]  

35. E. McCartney and F. Hall, “Optics of the Atmosphere: Scattering by Molecules and Particles,” Phys. Today 30(5), 76–77 (1977). [CrossRef]  

36. S. Sun, J. Gao, D. Wang, T. Yang, and X. Wang, “Improved models of imaging of skylight polarization through a fisheye lens,” Sensors 19(22), 4844 (2019). [CrossRef]  

37. S. Park and R. Schowengerdt, “Image reconstruction by parametric cubic convolution,” Comput. Vision Graph. Image Process. 23(3), 258–272 (1983). [CrossRef]  

38. R. Duda and P. Hart, “Use of the Hough Transformation To Detect Lines and Curves in Pictures,” Commun. ACM 15(1), 11–15 (1972). [CrossRef]  


Titus, T. N.

A. J. Brown, T. I. Michaels, S. Byrne, W. Sun, T. N. Titus, A. Colaprete, M. J. Wolff, G. Videen, and C. J. Grund, “The case for a modern multiwavelength, polarization-sensitive LIDAR in orbit around Mars,” J. Quant. Spectrosc. Radiat. Transf. 153, 131–143 (2015).
[Crossref]

Videen, G.

A. J. Brown, T. I. Michaels, S. Byrne, W. Sun, T. N. Titus, A. Colaprete, M. J. Wolff, G. Videen, and C. J. Grund, “The case for a modern multiwavelength, polarization-sensitive LIDAR in orbit around Mars,” J. Quant. Spectrosc. Radiat. Transf. 153, 131–143 (2015).
[Crossref]

Wang, C.

Wang, D.

S. Sun, J. Gao, D. Wang, T. Yang, and X. Wang, “Improved models of imaging of skylight polarization through a fisheye lens,” Sensors 19(22), 4844 (2019).
[Crossref]

D. Wang, H. Liang, H. Zhu, and S. Zhang, “A bionic camera-based polarization navigation sensor,” Sensors 14(7), 13006–13023 (2014).
[Crossref]

Wang, F.

Wang, X.

S. Sun, J. Gao, D. Wang, T. Yang, and X. Wang, “Improved models of imaging of skylight polarization through a fisheye lens,” Sensors 19(22), 4844 (2019).
[Crossref]

Wang, Y.

Y. Wang, X. Hu, J. Lian, L. Zhang, Z. Xian, and T. Ma, “Design of a device for sky light polarization measurements,” Sensors 14(8), 14916–14931 (2014).
[Crossref]

Warrant, E. J.

M. Dacke, E. Baird, M. Byrne, C. H. Scholtz, and E. J. Warrant, “Dung beetles use the milky way for orientation,” Curr. Biol. 23(4), 298–300 (2013).
[Crossref]

Wehner, R.

D. Lambrinos, R. Möller, T. Labhart, R. Pfeifer, and R. Wehner, “Mobile robot employing insect strategies for navigation,” Rob. Auton. Syst. 30(1-2), 39–64 (2000).
[Crossref]

D. Lambrinos, H. Kobayashi, R. Pfeifer, M. Maris, T. Labhart, and R. Wehner, “An autonomous agent navigating with a polarized light compass,” Adapt. Behav. 6(1), 131–161 (1997).
[Crossref]

M. Muller and R. Wehner, “Path integration in desert ants, Cataglyphis fortis,” Proc. Natl. Acad. Sci. 85(14), 5287–5290 (1988).
[Crossref]

S. Rossel and R. Wehner, “How bees analyse the polarization patterns in the sky - Experiments and model,” J. Comp. Physiol. A 154(5), 607–615 (1984).
[Crossref]

Whidborne, J. F.

K. Tin Leung, J. F. Whidborne, D. Purdy, and A. Dunoyer, “A review of ground vehicle dynamic state estimations utilising GPS/INS,” Veh. Syst. Dyn. 49(1-2), 29–58 (2011).
[Crossref]

White, R. H.

S. M. Reppert, H. Zhu, and R. H. White, “Polarized Light Helps Monarch Butterflies Navigate,” Curr. Biol. 14(2), 155–158 (2004).
[Crossref]

Wolff, M. J.

A. J. Brown, T. I. Michaels, S. Byrne, W. Sun, T. N. Titus, A. Colaprete, M. J. Wolff, G. Videen, and C. J. Grund, “The case for a modern multiwavelength, polarization-sensitive LIDAR in orbit around Mars,” J. Quant. Spectrosc. Radiat. Transf. 153, 131–143 (2015).
[Crossref]

Wu, Y.

Y. Wu and S. Luo, “On misalignment between magnetometer and inertial sensors,” IEEE Sens. J. 16(16), 6288–6297 (2016).
[Crossref]

Xian, Z.

Y. Wang, X. Hu, J. Lian, L. Zhang, Z. Xian, and T. Ma, “Design of a device for sky light polarization measurements,” Sensors 14(8), 14916–14931 (2014).
[Crossref]

Xie, H.

Xu, W.

Xuan, J.

Xue, C.

Yang, T.

S. Sun, J. Gao, D. Wang, T. Yang, and X. Wang, “Improved models of imaging of skylight polarization through a fisheye lens,” Sensors 19(22), 4844 (2019).
[Crossref]

You, Z.

Zhai, L.

Zhang, B.

Zhang, H.

Zhang, L.

C. Fan, X. Hu, X. He, L. Zhang, and J. Lian, “Integrated Polarized Skylight Sensor and MIMU with a Metric Map for Urban Ground Navigation,” IEEE Sens. J. 18(4), 1714–1722 (2018).
[Crossref]

Y. Wang, X. Hu, J. Lian, L. Zhang, Z. Xian, and T. Ma, “Design of a device for sky light polarization measurements,” Sensors 14(8), 14916–14931 (2014).
[Crossref]

Zhang, N.

Zhang, S.

Q. Li, Y. Hu, S. Zhang, J. Cao, and Q. Hao, “Calibration and image processing method for polarized skylight sensor,” Proc. SPIE 115500F1 (2020).
[Crossref]

D. Wang, H. Liang, H. Zhu, and S. Zhang, “A bionic camera-based polarization navigation sensor,” Sensors 14(7), 13006–13023 (2014).
[Crossref]

Zhang, Y.

Zhao, H.

Zhao, K.

Zhu, H.

D. Wang, H. Liang, H. Zhu, and S. Zhang, “A bionic camera-based polarization navigation sensor,” Sensors 14(7), 13006–13023 (2014).
[Crossref]

S. M. Reppert, H. Zhu, and R. H. White, “Polarized Light Helps Monarch Butterflies Navigate,” Curr. Biol. 14(2), 155–158 (2004).
[Crossref]

Adapt. Behav. (1)

D. Lambrinos, H. Kobayashi, R. Pfeifer, M. Maris, T. Labhart, and R. Wehner, “An autonomous agent navigating with a polarized light compass,” Adapt. Behav. 6(1), 131–161 (1997).
[Crossref]

Am. J. Phys. (1)

W. McMaster, “Polarization and the Stokes Parameters,” Am. J. Phys. 22(6), 351–362 (1954).
[Crossref]

Appl. Opt. (7)

Commun. ACM (1)

R. Duda and P. Hart, “Use of the Hough Transformation To Detect Lines and Curves in Pictures,” Commun. ACM 15(1), 11–15 (1972).
[Crossref]

Comput. Vision Graph. Image Process. (1)

S. Park and R. Schowengerdt, “IMAGE-RECONSTRUCTION BY PARAMETRIC CUBIC CONVOLUTION,” Comput. Vision Graph. Image Process. 23(3), 258–272 (1983).
[Crossref]

Curr. Biol. (2)

S. M. Reppert, H. Zhu, and R. H. White, “Polarized Light Helps Monarch Butterflies Navigate,” Curr. Biol. 14(2), 155–158 (2004).
[Crossref]

M. Dacke, E. Baird, M. Byrne, C. H. Scholtz, and E. J. Warrant, “Dung beetles use the milky way for orientation,” Curr. Biol. 23(4), 298–300 (2013).
[Crossref]

Icarus (1)

A. J. Brown, “Spectral bluing induced by small particles under the Mie and Rayleigh regimes,” Icarus 239, 85–95 (2014).
[Crossref]

IEEE Sens. J. (3)

J. Chahl and A. Mizutani, “Biomimetic attitude and orientation sensors,” IEEE Sens. J. 12(2), 289–297 (2012).
[Crossref]

Y. Wu and S. Luo, “On misalignment between magnetometer and inertial sensors,” IEEE Sens. J. 16(16), 6288–6297 (2016).
[Crossref]

C. Fan, X. Hu, X. He, L. Zhang, and J. Lian, “Integrated Polarized Skylight Sensor and MIMU with a Metric Map for Urban Ground Navigation,” IEEE Sens. J. 18(4), 1714–1722 (2018).
[Crossref]

IEEE Trans. Intell. Transp. Syst. (1)

I. Skog and P. Händel, “In-car positioning and navigation technologiesa survey,” IEEE Trans. Intell. Transp. Syst. 10(1), 4–21 (2009).
[Crossref]

IEEE Trans. Pattern Anal. Mach. Intell. (1)

G. N. DeSouza and A. C. Kak, “Vision for mobile robot navigation: A survey,” IEEE Trans. Pattern Anal. Mach. Intell. 24(2), 237–267 (2002).
[Crossref]

J. Comp. Physiol. A (1)

S. Rossel and R. Wehner, “How bees analyse the polarization patterns in the sky - Experiments and model,” J. Comp. Physiol. A 154(5), 607–615 (1984).
[Crossref]

J. Opt. Soc. Am. A (2)

J. Quant. Spectrosc. Radiat. Transf. (1)

A. J. Brown, T. I. Michaels, S. Byrne, W. Sun, T. N. Titus, A. Colaprete, M. J. Wolff, G. Videen, and C. J. Grund, “The case for a modern multiwavelength, polarization-sensitive LIDAR in orbit around Mars,” J. Quant. Spectrosc. Radiat. Transf. 153, 131–143 (2015).
[Crossref]

Opt. Express (4)

Phys. Today (2)

K. N. Liou and C. Bohren, “An Introduction to Atmospheric Radiation,” Phys. Today 34(7), 66–67 (1981).
[Crossref]

E. McCartney and F. Hall, “Optics of the Atmosphere: Scattering by Molecules and Particles,” Phys. Today 30(5), 76–77 (1977).
[Crossref]

Proc. Natl. Acad. Sci. (1)

M. Muller and R. Wehner, “Path integration in desert ants, Cataglyphis fortis,” Proc. Natl. Acad. Sci. 85(14), 5287–5290 (1988).
[Crossref]

Proc. SPIE (1)

Q. Li, Y. Hu, S. Zhang, J. Cao, and Q. Hao, “Calibration and image processing method for polarized skylight sensor,” Proc. SPIE 115500F1 (2020).
[Crossref]

Rob. Auton. Syst. (1)

D. Lambrinos, R. Möller, T. Labhart, R. Pfeifer, and R. Wehner, “Mobile robot employing insect strategies for navigation,” Rob. Auton. Syst. 30(1-2), 39–64 (2000).
[Crossref]

Sensors (3)

D. Wang, H. Liang, H. Zhu, and S. Zhang, “A bionic camera-based polarization navigation sensor,” Sensors 14(7), 13006–13023 (2014).
[Crossref]

S. Sun, J. Gao, D. Wang, T. Yang, and X. Wang, “Improved models of imaging of skylight polarization through a fisheye lens,” Sensors 19(22), 4844 (2019).
[Crossref]

Y. Wang, X. Hu, J. Lian, L. Zhang, Z. Xian, and T. Ma, “Design of a device for sky light polarization measurements,” Sensors 14(8), 14916–14931 (2014).
[Crossref]

Sensors Actuators A Phys. (1)

C. Jinkui, Z. Kaichun, Z. Qiang, and W. Tichang, “Construction and performance test of a novel polarization sensor for navigation,” Sensors Actuators A Phys. 148(1), 75–82 (2008).
[Crossref]

Veh. Syst. Dyn. (1)

K. Tin Leung, J. F. Whidborne, D. Purdy, and A. Dunoyer, “A review of ground vehicle dynamic state estimations utilising GPS/INS,” Veh. Syst. Dyn. 49(1-2), 29–58 (2011).
[Crossref]

Other (1)

K. Coulson, Polarization and Intensity of Light in the Atmosphere (A. Deepak Pub., 1988).

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.



Figures (13)

Fig. 1. Ideal model of the skylight polarization pattern in the sky. (a) The coordinate system, where O represents the observation position of the carrier, (b) 3D AOP distribution, (c) 3D DOP distribution, (d) AOP distribution observed by the carrier, and (e) DOP distribution observed by the carrier.

Fig. 2. Flowchart of the proposed full-sky polarization imaging navigation method.

Fig. 3. (a) Typical diagram of the internal structure of the full-sky polarization imaging polarimeter. The pixels with polarization angles of 90°, 45°, 135°, and 0° are represented by red, yellow, green, and blue, respectively. (b) Schematic of the pixel segmentation algorithm. According to the structure of the polarizer array, a superpixel is divided into four pixel units, represented by $I_{out}(90^\circ)$, $I_{out}(45^\circ)$, $I_{out}(135^\circ)$, and $I_{out}(0^\circ)$, respectively. The pixel units with the same polarization angle are combined to form four images.

Fig. 4. Flowchart of the navigation algorithm for a partially obscured sky scene. (a) AOP distribution, (b) DOP distribution, (c) MASK diagram, (d) valid AOP distribution, (e) inpainted AOP distribution, (f) transform results in H space, and (g) extraction result of the solar meridian as a white solid line.

Fig. 5. Principle of inpainting the AOP distribution through convolution. To make the principle easier to understand, $r$ in this diagram is 1.

Fig. 6. Experimental system.

Fig. 7. Influence of dense building obscurations on the skylight polarization pattern. (a) One of the original polarization images, (b) AOP distribution, (c) DOP distribution.

Fig. 8. Influence of dense tree obscurations on the skylight polarization pattern. (a) One of the original polarization images, (b) AOP distribution, (c) DOP distribution.

Fig. 9. Comparisons of AOP results before and after inpainting. From left to right, the figures show the extraction result before inpainting, H space before inpainting, the extraction result after inpainting, and H space after inpainting. The white solid lines are the extracted solar meridians. (a) Under mild obscuration, (b)–(d) under moderate obscuration, and (e)–(g) under severe obscuration.

Fig. 10. MASKs generated with different $G_{\max}$ and the error rates of the MASKs. (a) One of the original polarization images; (b)–(d) MASKs generated when $G_{\max}$ equals 0.005, 0.017, and 0.025, respectively; (e) error rates of the MASKs depending on $G_{\max}$.

Fig. 11. MASKs generated with different $DOP_{\min}$ and the error rates of the MASKs. The original polarization image is the same as that in Fig. 10. (a)–(d) MASKs generated when $DOP_{\min}$ equals 0.01, 0.016, 0.02, and 0.025, respectively; (e) error rates of the MASKs depending on $DOP_{\min}$.

Fig. 12. Streetlamps as a typical obscuration. (a) Polarization image, (b) AOP distribution, and (c) MASK.

Fig. 13. Inpainting results with different $r$ values. (a) $r = 1$, (b) $r = 2$, (c) $r = 3$, (d) $r = 5$, (e) $r = 10$, and (f) $r = 15$.

Tables (1)


Table 1. Statistical Results of the Comparative Experiments

Equations (9)

$$\left\{ \begin{aligned} AOP &= \frac{1}{2}\arctan\!\left(\frac{U}{Q}\right) \\ DOP &= \frac{\sqrt{Q^{2}+U^{2}}}{I} \end{aligned} \right. \tag{1}$$

$$\left\{ \begin{aligned} I &= I_{out}(0^\circ) + I_{out}(90^\circ) \\ Q &= I_{out}(0^\circ) - I_{out}(90^\circ) \\ U &= 2 \times I_{out}(45^\circ) - I_{out}(0^\circ) - I_{out}(90^\circ) \end{aligned} \right. \tag{2}$$

$$MASK(i,j) = \begin{cases} 1, & DOP_{\min} < DOP(i,j) < DOP_{\max}\ \text{and}\ gradient(DOP(i,j)) < G_{\max} \\ 0, & \text{otherwise} \end{cases} \tag{3}$$

$$AOP_{inpainting}(i,j) = \frac{AOP_{pad}(i,j) \otimes MASK_{pad}(i,j)}{sum(MASK_{pad}(i,j))} \tag{4}$$

$$AOP_{pad}(i,j) = \begin{bmatrix} AOP(i-r,j-r) & \cdots & AOP(i-r,j+r) \\ \vdots & \ddots & \vdots \\ AOP(i+r,j-r) & \cdots & AOP(i+r,j+r) \end{bmatrix} \tag{5}$$

$$MASK_{pad}(i,j) = \begin{bmatrix} MASK(i-r,j-r) & \cdots & MASK(i-r,j+r) \\ \vdots & \ddots & \vdots \\ MASK(i+r,j-r) & \cdots & MASK(i+r,j+r) \end{bmatrix} \tag{6}$$

$$AOP_{pad}(i,j) \otimes MASK_{pad}(i,j) = \sum_{m=1}^{2r+1} \sum_{n=1}^{2r+1} AOP_{pad}(i,j)(m,n) \cdot MASK_{pad}(i,j)(m,n) \tag{7}$$

$$-90^\circ \le AOP_{inpainting}(i,j) \le -89.8^\circ \quad \text{or} \quad 89.8^\circ \le AOP_{inpainting}(i,j) \le 90^\circ \tag{8}$$

$$\frac{1}{2}\min\{m,n\} < r < \max\{m,n\} \tag{9}$$
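As a concrete illustration of Eqs. (1)–(7), the sketch below computes AOP and DOP maps from the polarizer-channel images, builds a validity MASK, and inpaints masked-out AOP pixels with the mask-weighted mean of their $(2r+1)\times(2r+1)$ neighbourhood. This is a minimal NumPy sketch, not the authors' implementation: the function names are hypothetical, the finite-difference gradient in `build_mask` is an assumed realization of $gradient(DOP)$, valid pixels are assumed to be left untouched, and the ±90° wrap-around treated by Eq. (8) is ignored here.

```python
import numpy as np

def stokes_from_channels(i0, i45, i90):
    """Eq. (2): Stokes parameters from the 0/45/90 deg channel images."""
    I = i0 + i90
    Q = i0 - i90
    U = 2.0 * i45 - i0 - i90
    return I, Q, U

def aop_dop(I, Q, U):
    """Eq. (1): angle and degree of polarization."""
    aop = 0.5 * np.arctan2(U, Q)                    # radians
    dop = np.sqrt(Q ** 2 + U ** 2) / np.maximum(I, 1e-12)
    return aop, dop

def build_mask(dop, dop_min, dop_max, g_max):
    """Eq. (3): 1 for valid sky pixels, 0 for suspected obscurations."""
    gy, gx = np.gradient(dop)                       # assumed gradient operator
    grad = np.hypot(gy, gx)
    return ((dop > dop_min) & (dop < dop_max) & (grad < g_max)).astype(float)

def inpaint_aop(aop, mask, r=3):
    """Eqs. (4)-(7): fill invalid pixels with the mask-weighted mean of
    their (2r+1) x (2r+1) window; valid pixels are kept (assumption)."""
    H, W = aop.shape
    k = 2 * r + 1
    masked_pad = np.pad(aop * mask, r)              # AOP_pad x MASK_pad, zero-padded
    mask_pad = np.pad(mask, r)
    num = np.zeros_like(aop)
    den = np.zeros_like(aop)
    for dy in range(k):                             # sliding-window sums, Eq. (7)
        for dx in range(k):
            num += masked_pad[dy:dy + H, dx:dx + W]
            den += mask_pad[dy:dy + H, dx:dx + W]
    filled = num / np.maximum(den, 1.0)             # Eq. (4); guard empty windows
    return np.where(mask > 0, aop, filled)
```

With $r = 1$ an invalid pixel is averaged over its valid 8-neighbours; the paper's discussion of $r$ (Eq. (9) and Fig. 13) indicates the window should be chosen relative to the obscuration size $m \times n$.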
