
Application of attitude jitter detection based on short-time asynchronous images and compensation methods for Chinese mapping satellite-1

Open Access

Abstract

Given the recent development of high-resolution (HR) optical satellites, the study of both attitude jitter (AJ) detection and compensation has become increasingly essential to improving the radiometric and geometric quality of HR images. Mapping satellite-1 (MS-1), a group of HR optical stereo mapping satellites in China, has launched two satellites and will launch a third to build a satellite network. The geometric accuracy of the launched MS-1 satellites is worse than 80 m because of the AJ caused by the instability of the platform. AJ detection and compensation are therefore critical issues that must be addressed to improve the accuracy of geo-positioning and mapping before launching a new satellite. The present study employs a jitter detection method based on short-time asynchronous images to detect MS-1 jitter. The adjacent overlapping areas of an original panchromatic image are used as detection images instead of the traditional multispectral images, and a differential recursion optimal estimation filter is proposed for the optimal estimation of the registration data and the elimination of their gross errors, thereby increasing the detection accuracy. The space variant blurring model and the viewing angle correction method are employed for the radiometric and geometric jitter compensation of images, respectively. Radiometric objective evaluation indices and geometric checkpoints are then utilised to evaluate the quality of jitter compensation. Finally, an image of the Dezhou region (Shandong province, China) from MS-1 is used as the experimental data. Results for the AJ of MS-1 are analysed and reported for the first time. The assessment results show that both the radiometric and geometric quality increase greatly after the jitter compensation procedure. Thus, the jitter detection and compensation presented in this study effectively address the jitter of MS-1 HR optical satellites.

© 2015 Optical Society of America

1. Introduction

High-resolution (HR) satellite images obtained from linear array charge-coupled device (CCD) sensors are widely used in various fields, such as surveying and agriculture [1–3]. Good geometric and radiometric performance is crucial in satellite data processing to fully utilise HR images [4–6]. Given the processing requirements, the performance of HR satellites depends on the accuracy of the spacecraft position and attitude [7]. The accuracy and recording frequency of attitude data depend only on the attitude sensors, such as the star tracker and gyro, unlike position data recorded by the on-board global positioning system (GPS), whose accuracy can be improved in multiple ways (e.g., by using real-time service [IGS] data [8]). Accurate attitude estimation is therefore a crucial issue in improving the accuracy of geo-positioning and mapping using HR satellites.

The accuracy of both the geometric and radiometric aspects diminishes when the satellite experiences attitude jitter (AJ), which refers to the instability and distortion of attitude data. Tong et al. show that low-frequency AJ is a vital factor that affects the geometric accuracy of HR satellites [9], while high-frequency AJ causes image blur, which is an important aspect of radiometric deterioration. However, AJ detection and compensation are difficult when the attitude data from the star-tracker and gyro receivers are used directly. The main reason for this difficulty is that the sampling rate of the star tracker and gyro is lower than the frequency of satellite AJ for the majority of remote-sensing satellites currently in use. Given the development of high-accuracy registration methods, a class of AJ detection methods that use the registration of short-time asynchronous satellite images has been proposed. Teshima and Iwasaki used the short-wave infrared sensors of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) to detect and estimate the frequency and amplitude of AJ [6] and found an AJ with a frequency of about 1.5 Hz. Mattson et al. succeeded in acquiring jitter using a pair of narrow-angle cameras onboard the Lunar Reconnaissance Orbiter (LRO) [10]. Amberg et al. similarly utilised the adjacent band images of the multispectral sensors on the PLEIADES-HR satellite to detect jitter without using additional devices [11]. Tong et al. developed a framework of jitter detection and compensation [9] by adopting the registration method and conducting experiments with the multispectral sensors of ZY-3 and the narrow-angle camera (LROC-NAC) of LRO satellite images.

Mapping satellite-1 (MS-1) is a group of HR stereo mapping satellites in China, whose first satellite (code: 01) was launched on August 24, 2010. A sister satellite (code: 02) was launched on May 6, 2012, and a new satellite will be launched in 2015. MS-1 has three-line-array (TLA) sensors, for which the ground sample distance (GSD) of the nadir image is about 5 m, as well as an HR panchromatic sensor (PAN) with a GSD of 2 m and a multispectral sensor (MUX) with a GSD of 10 m that operates in four spectral bands. The detailed parameters of the MS-1 satellite and its sensors are listed in Table 1. MS-1 consists of identically equipped satellites with the same sensors and platform, and multiple satellites constitute a satellite network that enhances the terrain observation capability. Similar to other HR satellites, MS-1 also suffers from AJ. The geometric accuracy of the 01 and 02 satellites of MS-1 is worse than 80 m because of the AJ caused by the instability of the platform. Research on AJ detection and compensation is therefore a crucial issue that must be addressed before launching a new satellite. In summary, high-accuracy AJ detection and compensation are vital in improving the prospects of MS-1.


Table 1. Parameters of MS-1.

The present study fully exploits the registration method for the AJ detection of MS-1. The multi-chip CCD arrays of the MS-1 MUX and PAN sensors are assembled in the sensor in a mechanical stitching mode, which means that a change in the geometric relationship occurs during the CCD stitching process because of the spatial interval between the CCD arrays. It also means that the implicit AJ information would be disturbed by the stitching processing. Thus, AJ detection based on the registration method applied to a CCD-stitched image (e.g., MUX L1A images) cannot guarantee the accuracy of jitter detection. Therefore, the original images of the adjacent-chip CCD arrays from the panchromatic sensor are used directly, following the characteristics of the MS-1 sensor, to prevent the deterioration of the detection accuracy caused by the CCD stitching processing. The PAN sensor consists of a telescope and eight CCD arrays; the theoretical in-track interval between adjacent CCD arrays is 2014 pixels and the cross-track overlap is 96 pixels, as shown in Fig. 1. Therefore, the overlapping images of the eight CCD arrays are used. Subsequently, the space variant blurring model (SVBM) and the viewing angle correction method (VACM) are used to compensate for the radiometric and geometric deterioration caused by AJ, respectively. Finally, radiometric objective evaluation indices and geometric checkpoints are utilised to evaluate the quality of jitter compensation.


Fig. 1 The layout and overlap of CCD arrays in PAN sensor of MS-1.


As a research application on the AJ detection and compensation for HR satellites, this study makes several contributions to the field: (1) A systemic process of jitter detection, compensation and assessment for HR satellites is developed. The results of AJ for MS-1 are analysed and reported for the first time. (2) The accuracy of jitter detection is improved, and a differential recursion optimal estimation filter (DIROEF) is proposed to complete the process of eliminating false and abnormal matching results and optimal value estimation. (3) An initial study is conducted on the jitter compensation for enhancing the radiometric and geometric qualities of images. An investigation is also conducted on the suitable objective evaluation index to evaluate the image quality with or without AJ compensation. We believe this study makes exemplary contributions to the field of jitter detection and compensation of HR satellites.

2. Methods

2.1 Jitter detection [9]

Jitter detection based on short-time asynchronous satellite images utilises the small parallax that forms as a result of jitter [12]. For example, four short-time asynchronous CCD arrays capture images at different times, as shown in Fig. 2. AJ may cause some small changes in the satellite attitude during the interval, thereby causing the formation of parallax in the images. As a result, if an accurate registration algorithm is used to match the ground target of a pair of asynchronous images, the jitter can be reconstructed with the help of the registration results.


Fig. 2 Diagram of the jitter effect on images captured by the short-time asynchronous CCD arrays.


The parallax disparity φ(t) of each pixel in both the along- and cross-track directions can be measured based on the overlapping area between the two images. This relationship is described by Eq. (1).

\[ \varphi(t) = f(t + \Delta t) - f(t). \tag{1} \]

where t is the current imaging time, Δt represents the along-track time interval that produces the parallax, and f(t) is the jitter displacement at time t, which refers to the image distortion that AJ causes.

The detection method based on short-time asynchronous satellite images consists of four essential steps:

  • (1) Co-registration of the overlapping region is performed to remove the entire shift between the two images.
  • (2) Registered image pairs are subjected to dense and accurate matching based on the pixel-level phase correlation algorithm [13] (see the sketch after this list).
  • (3) The false and abnormal matching results are eliminated, and the true value of the parallax disparity is estimated. In this study, DIROEF is proposed to complete this process, which is described in the next section.
  • (4) The jitter displacement is acquired; a spectral analysis is conducted to acquire the frequency of the jitter. The amplitude is then retrieved from the amplitude of the parallax disparity with the help of the phase difference from Δt in Fourier space [11].
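To make step (2) concrete, the following minimal sketch estimates the parallax of one pair of co-registered patches by phase correlation. It is an illustration rather than the authors' implementation: it assumes NumPy, and the sub-pixel refinement uses simple parabolic peak interpolation as a stand-in for the method of [13].

```python
# Minimal sketch of step (2): phase-correlation matching of one pair of
# co-registered patches. Not the authors' code; sub-pixel refinement here is
# parabolic peak interpolation, an assumption rather than the method of [13].
import numpy as np

def phase_correlation_shift(patch_a, patch_b):
    """Estimate the (row, col) displacement of patch_b relative to patch_a."""
    A = np.fft.fft2(patch_a)
    B = np.fft.fft2(patch_b)
    cross_power = A * np.conj(B)
    cross_power /= np.abs(cross_power) + 1e-12             # normalised cross-power spectrum
    corr = np.real(np.fft.ifft2(cross_power))
    peak = np.unravel_index(np.argmax(corr), corr.shape)   # integer-pixel correlation peak

    def parabolic(c, i, n):
        # 1-D parabolic refinement around the peak; indices wrap circularly.
        y0, y1, y2 = c[(i - 1) % n], c[i], c[(i + 1) % n]
        denom = y0 - 2.0 * y1 + y2
        return 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom

    dr = peak[0] + parabolic(corr[:, peak[1]], peak[0], corr.shape[0])
    dc = peak[1] + parabolic(corr[peak[0], :], peak[1], corr.shape[1])
    # convert wrapped positions to signed shifts
    dr = dr - corr.shape[0] if dr > corr.shape[0] / 2 else dr
    dc = dc - corr.shape[1] if dc > corr.shape[1] / 2 else dc
    return dr, dc   # along-track and cross-track parallax in pixels
```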

2.2 DIROEF

2.2.1 Basic equation

DIROEF is a sequential linear estimator that provides a near-optimal solution for the optimal estimation of a given data set with noise, a certain proportion of gross errors and multiple local characteristics. The method can also eliminate the gross errors of the given data set by using a simple strategy. The accuracy of this method depends on the variance of the noise and the variance of the adjacent data differences, which are optimally estimated by classical algorithms. The core equations are shown in Eq. (2) and Eq. (3).

\[ Y_e(x_i) = Y_e(x_{i-1}) + \frac{P(x_{i-1}) + \sigma_R^2}{P(x_{i-1}) + \sigma_n^2 + \sigma_R^2}\,\bigl(y_i - Y_e(x_{i-1})\bigr), \tag{2} \]
\[ P(x_i) = \frac{\sigma_n^2}{P(x_{i-1}) + \sigma_n^2 + \sigma_R^2}\,\bigl(P(x_{i-1}) + \sigma_R^2\bigr). \tag{3} \]

where \(Y_e(x_i)\) and \(P(x_i)\) denote the optimal estimation of the observed data and its single estimator variance, respectively; the subscript i is the index of the data in the given data set and e stands for estimation; \(\sigma_n^2\) and \(\sigma_R^2\) denote the variance of the observed noise and the variance of the adjacent data difference (\(dy_i = y_i - y_{i-1}\)), respectively. The magnitude of \(\sigma_R^2\) is related to the density and continuity of the function sample (or given data set). If the function sample (or given data set) is sufficiently dense and continuous, particularly at the knots, then \(\sigma_R^2\) is small.

Therefore, DIROEF is a sequential linear estimator wherein the point-wise estimate \(Y_e(x_i)\) is calculated by using the progressively updated \(P(x_i)\).
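For illustration, the recursion of Eqs. (2)–(4) can be written compactly as below. This is a minimal sketch assuming NumPy and pre-computed \(\sigma_n^2\) and \(\sigma_R^2\) (Sec. 2.2.2); it is not the authors' code.

```python
# Minimal sketch of the DIROEF recursion in Eqs. (2)-(3); sigma_n2 and
# sigma_r2 are assumed to be pre-computed as described in Sec. 2.2.2.
import numpy as np

def diroef_estimate(y, sigma_n2, sigma_r2):
    """Sequential point-wise estimate Ye(x_i) and its variance P(x_i)."""
    y = np.asarray(y, dtype=float)
    ye = np.empty_like(y)
    p = np.empty_like(y)
    ye[0], p[0] = y[0], 1.0                                  # initialisation, Eq. (4)
    for i in range(1, len(y)):
        gain = (p[i - 1] + sigma_r2) / (p[i - 1] + sigma_n2 + sigma_r2)
        ye[i] = ye[i - 1] + gain * (y[i] - ye[i - 1])        # Eq. (2)
        p[i] = sigma_n2 / (p[i - 1] + sigma_n2 + sigma_r2) \
               * (p[i - 1] + sigma_r2)                       # Eq. (3)
    return ye, p
```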

2.2.2 Parameter calculation

The parameters required by DIROEF prior to the calculation include the initial optimal estimation of the observed data and the sample estimator variance (denoted by \(Y_e(x_1)\) and \(P(x_1)\), respectively), as well as the variance of the observed noise and the variance of the adjacent data difference.

The initial optimal estimation of the observed data and the sample estimator variance are initialised as shown in Eq. (4).

\[ Y_e(x_1) = y_1, \qquad P(x_1) = 1. \tag{4} \]

The variance of the observed noise is calculated by using a classical technique that determines the mode of the probability distribution of the data variance estimated over signal segments. According to Eq. (5), the local variance \(LV_i\) is calculated over segments of N samples.

\[ LV_i = \frac{1}{N}\sum_{j=1}^{N}\Bigl(y_j - \frac{1}{N}\sum_{j=1}^{N} y_j\Bigr)^{2}. \tag{5} \]

After sorting the data set of the local variance in ascending order, we divide it into M groups according to value as follows:

where #{x} represents the number of elements that belong to a certain range. The group with the highest count is called the local variance mode (LVM). The variance of the observed noise is equal to the mean of the LVM group, as shown in Eq. (6) and Eq. (7).

\[ LVM = \max\bigl(\#\{LV\}_j\bigr),\ j \in [1, M], \tag{6} \]
\[ \sigma_n^2 = \mathrm{Mean}(LVM). \tag{7} \]

The variance of the adjacent data difference \(\sigma_R^2\) is calculated as shown in Eqs. (8)–(10).

\[ dy_i = y_i - y_{i-1}, \tag{8} \]
\[ \mathrm{Mean}_{dy} = \frac{1}{N-1}\sum_{i=2}^{N} dy_i, \tag{9} \]
\[ \sigma_R^2 = \frac{1}{N-1}\sum_{i=2}^{N}\bigl(dy_i - \mathrm{Mean}_{dy}\bigr)^{2}. \tag{10} \]
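A minimal sketch of this parameter estimation is given below, assuming NumPy. The segment length N and the number of groups M are free choices here, since the text does not fix their values.

```python
# Sketch of the parameter estimation of Sec. 2.2.2. The segment length and the
# number of histogram groups are assumptions; the paper does not fix them here.
import numpy as np

def estimate_noise_variance(y, seg_len=32, n_groups=10):
    """sigma_n^2: mean of the most populated local-variance group, Eqs. (5)-(7)."""
    y = np.asarray(y, dtype=float)
    n_seg = len(y) // seg_len
    local_var = np.array([np.var(y[k * seg_len:(k + 1) * seg_len])
                          for k in range(n_seg)])                  # Eq. (5)
    edges = np.linspace(local_var.min(), local_var.max() + 1e-12, n_groups + 1)
    counts, _ = np.histogram(local_var, bins=edges)
    mode_idx = np.argmax(counts)                                   # Eq. (6): LVM group
    in_mode = (local_var >= edges[mode_idx]) & (local_var < edges[mode_idx + 1])
    return float(np.mean(local_var[in_mode]))                      # Eq. (7)

def estimate_diff_variance(y):
    """sigma_R^2: variance of adjacent data differences, Eqs. (8)-(10)."""
    dy = np.diff(np.asarray(y, dtype=float))
    return float(np.mean((dy - dy.mean()) ** 2))
```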

2.2.3 Elimination strategy of gross errors

Gross error elimination is conducted through the estimation process of Eq. (2). The DIROEF equations suppress the effect of gross errors, which would otherwise result in large estimation variances. Therefore, the optimal estimations produced by DIROEF stay close to the large group of observed data. This characteristic is used to eliminate gross errors.

As shown in Eq. (11), after calculating the estimation of the observed data by using Eq. (2) and Eq. (3), we further calculate the difference between the estimated and observed data and set three times the mean error as the threshold.

\[ \delta = \begin{cases} 0, & \text{if } \lvert y_i - Y_e(x_i) \rvert \le 3\sqrt{\sigma_n^2 + \sigma_R^2} \\ 1, & \text{if } \lvert y_i - Y_e(x_i) \rvert > 3\sqrt{\sigma_n^2 + \sigma_R^2} \end{cases} \tag{11} \]

where δ is the flag of each observed datum. A point whose flag δ equals one is considered a gross error. Three times the mean error is taken as the threshold for determining gross errors in the data sets.

Therefore, DIROEF is best used to gradually estimate the optimal function or value through the iterative use of Eq. (2) and the elimination of gross errors. The remaining data after the gross errors are eliminated are used to re-calculate an estimation close to the optimal result by iteration. The iteration terminates when all differences between the remaining points and their estimations are less than \(3\sqrt{\sigma_n^2 + \sigma_R^2}\).
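The iterative elimination can be sketched as follows, building on the diroef_estimate() sketch above. The threshold uses \(3\sqrt{\sigma_n^2 + \sigma_R^2}\) as the "three times the mean error" criterion of Eq. (11); this reading is our assumption.

```python
# Sketch of the iterative gross-error elimination of Sec. 2.2.3, reusing
# diroef_estimate() from the sketch above. The 3-sigma threshold is taken as
# 3*sqrt(sigma_n^2 + sigma_R^2), our reading of "three times the mean error".
import numpy as np

def diroef_clean(y, sigma_n2, sigma_r2, max_iter=20):
    y = np.asarray(y, dtype=float)
    keep = np.ones(len(y), dtype=bool)
    thresh = 3.0 * np.sqrt(sigma_n2 + sigma_r2)
    for _ in range(max_iter):
        ye, _ = diroef_estimate(y[keep], sigma_n2, sigma_r2)
        flags = np.abs(y[keep] - ye) > thresh            # Eq. (11): delta = 1 marks a gross error
        if not flags.any():
            break                                        # all residuals within the threshold
        idx = np.flatnonzero(keep)
        keep[idx[flags]] = False                         # eliminate gross errors and re-estimate
    ye, _ = diroef_estimate(y[keep], sigma_n2, sigma_r2)
    return keep, ye                                      # inlier mask and final estimate
```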

2.3 Jitter compensation

2.3.1 SVBM for radiometric compensation

Integral sampling and jitter diminish the radiometric quality of CCD images. We assume that f(m,n) is the degraded image to be compensated and g(x,y) is the original image, as shown in Eq. (12).

\[ f(m,n) = \int_{0}^{Ma}\!\!\int_{0}^{Na} \delta(x - ma)\,\delta(y - na)\,\bigl[g(x,y) \ast PSF_{\mathrm{Camera}} \ast PSF_{\mathrm{vibration}}\bigr]\,dx\,dy. \tag{12} \]

where δ(x − ma)δ(y − na) is the impulse (Dirac delta) sampling function, m and n are the pixel numbers in the along- and cross-track directions, respectively, and PSF_Camera and PSF_vibration are the point spread function (PSF) degradations caused by integral sampling and jitter, respectively.

PSF_Camera is defined in Eq. (13) [14]:

\[ PSF_{\mathrm{Camera}} = \mathrm{rect}\Bigl(\frac{x}{a}, \frac{y}{a}\Bigr) \ast \mathrm{rect}\Bigl(\frac{x}{a}\Bigr). \tag{13} \]

where rect denotes the rectangle function.

Cross-track jitter causes PSF_vibration, which can be expressed as Eq. (14) [15]:

\[ PSF_{\mathrm{vibration}}(x, y) = \begin{cases} 1/L, & x = 0, 1, \ldots, L-1 \\ 0, & \text{otherwise.} \end{cases} \tag{14} \]

where L is the jitter value in the cross-track direction. The PSF_vibration in the along-track direction has a similar form to that in the cross-track direction. Thus, if the jitter in both the along- and cross-track directions is acquired, PSF_vibration can be calculated, and the deterioration that the jitter causes can be compensated using Eq. (12).
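As an illustration, the jitter PSF of Eq. (14) and one possible inversion are sketched below. The paper does not state which deconvolution scheme it applies to Eq. (12); the Wiener filter used here is our assumption, and the constant noise-to-signal ratio nsr is a placeholder parameter.

```python
# Illustrative sketch for Sec. 2.3.1. The jitter PSF of Eq. (14) is a box of
# length L along the blur direction; the inversion uses a Wiener filter, which
# is an assumption -- the paper does not specify its deconvolution scheme.
import numpy as np

def jitter_psf(length, shape, axis=1):
    """PSF_vibration: uniform weight 1/L over L pixels along the given axis (Eq. 14)."""
    psf = np.zeros(shape)
    if axis == 1:
        psf[0, :length] = 1.0 / length      # cross-track blur
    else:
        psf[:length, 0] = 1.0 / length      # along-track blur
    return psf

def wiener_deblur(image, psf, nsr=0.01):
    """Frequency-domain Wiener inversion of a known PSF (illustrative only)."""
    H = np.fft.fft2(psf, s=image.shape)
    G = np.fft.fft2(image.astype(float))
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)  # Wiener filter with constant NSR
    return np.real(np.fft.ifft2(W * G))
```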

2.3.2 VACM for geometric compensation

Schwind et al. [16] show that AJ can be modelled as a sum of cosine functions, as in Eq. (15).

\[ f(t) = \sum_{i=1}^{N} A_{0i}\cos(2\pi \omega_i t + f_{0i}). \tag{15} \]

where A_{0i}, ω_i and f_{0i} are the amplitude, frequency and phase of the ith component of the jitter information, respectively, and t corresponds to the imaging time.

In geometric compensation, the influence of jitter on satellite attitude can be treated through the correction of the CCD viewing angles [9], as shown in Eq. (16) and Eq. (17):

\[ \Delta\psi_x = \frac{f_x(t)}{f_c}, \tag{16} \]
\[ \Delta\psi_y = \frac{f_y(t)}{f_c}. \tag{17} \]

where Δψx and Δψy are the corrections of the viewing angles in the x- and y-directions of the image, fx(t) and fy(t) are the corresponding detected jitter values, and fc is the focal length of the camera. The corrections from jitter geometric compensation are regarded as additional terms to the original viewing angles. The corrected viewing angles can be applied in the following geometric processing [17].
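A small sketch of Eqs. (15)–(17) is given below. The conversion of the detected jitter from pixels to focal-plane length with a pixel_size factor is our assumption for illustration; the component parameters come from an analysis such as that in Sec. 3.3.

```python
# Sketch of the viewing-angle correction of Eqs. (15)-(17). The pixel_size
# conversion from pixels to focal-plane metres is an assumption of this
# illustration, not a statement of the authors' processing chain.
import numpy as np

def jitter_model(t, components):
    """f(t) = sum_i A0_i * cos(2*pi*omega_i*t + f0_i), Eq. (15)."""
    return sum(a0 * np.cos(2.0 * np.pi * w * t + f0) for a0, w, f0 in components)

def viewing_angle_correction(t, components, pixel_size, focal_length):
    """Delta-psi = f(t) / f_c, Eqs. (16)-(17), with f(t) expressed in metres."""
    f_t = jitter_model(t, components) * pixel_size   # pixels -> focal-plane metres
    return f_t / focal_length                        # small-angle correction in radians
```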

2.4 Assessment for jitter compensation

2.4.1 Radiation assessment

The objective evaluation method is used in this study to evaluate the effect of jitter compensation on the radiometric aspect. Some radiometric objective evaluation indices (OEI) are selected as the assessment criteria; these indices include Clarity (CLA), Detail-Energy (DET), Edge-Energy (EDG) and Contrast (CON), as described as follows.

(1) CLA reflects the sharpness of the image, as shown in Eq. (18); an image with a higher CLA value represents objects more clearly than one with a lower CLA value.

\[ CLA = \int_{a}^{b}\Bigl(\frac{df}{dx}\Bigr)^{2} dx \Bigm/ \bigl\lvert f(b) - f(a) \bigr\rvert. \tag{18} \]

where df/dx and f(b) − f(a) are the grey-level gradient and the contrast across the edge in the direction perpendicular to the edge, respectively.

(2) In the high-frequency components of the frequency domain, DET and EDG describe the detail and edge characteristics of the image, respectively. The image contains more information as the values of DET and EDG increase.

\[ DET = \frac{1}{n}\sum \sigma_f^{2}(x, y), \tag{19} \]
\[ \sigma_f^{2}(x, y) = \frac{1}{(2M+1)^{2}}\sum_{i=-M}^{M}\sum_{j=-M}^{M}\bigl[f(x+i, y+j) - m_f(x, y)\bigr]^{2}. \tag{20} \]

where \(\sigma_f^2(x,y)\) is the regional variance, \(m_f(x,y)\) is the regional mean, and the value of M is often set to 1.

\[ EDG = \frac{1}{mn}\sum_{x=1}^{m}\sum_{y=1}^{n} e^{2}(x, y), \tag{21} \]
\[ e(x, y) = E_1\bigl(f(x, y)\bigr) + E_2\bigl(f(x, y)\bigr), \tag{22} \]

where E_1 and E_2 represent convolution operations on the image, as described by Eq. (23):

\[ E_1 = \frac{1}{6}\begin{bmatrix} 1 & 1 & 1 \\ 1 & 4 & 1 \\ 1 & 1 & 1 \end{bmatrix}, \qquad E_2 = \frac{1}{6}\begin{bmatrix} 1 & 1 & 1 \\ 1 & 4 & 1 \\ 1 & 1 & 1 \end{bmatrix}. \tag{23} \]

(3) CON is an important OEI for image radiometric quality, as shown in Eq. (24). CON reflects the distinguishability between the target and the background; the information and texture of an image become more obvious as CON increases.

\[ CON = \sum_{n=0}^{L-1} n^{2}\Bigl\{\sum_{i=0}^{L-1}\sum_{j=0}^{L-1}\hat{p}(i, j)\Bigr\}. \tag{24} \]

where n = |i − j|, and \(\hat{p}(i, j)\) is the normalised grey-level co-occurrence matrix.
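For illustration, two of the indices, DET (Eqs. (19)–(20)) and EDG (Eqs. (21)–(23)), can be computed as sketched below with SciPy, using M = 1 as in the text. The edge-mask coefficients are taken exactly as printed above; this is not the authors' evaluation code.

```python
# Illustrative computation of DET (Eqs. 19-20) and EDG (Eqs. 21-23). The mask
# coefficients follow the values printed above and may differ in sign from the
# authors' original operators.
import numpy as np
from scipy import ndimage

E1 = np.array([[1, 1, 1], [1, 4, 1], [1, 1, 1]], dtype=float) / 6.0
E2 = E1.copy()   # the two masks are printed identically in the source

def detail_energy(img, M=1):
    """DET: mean of the regional variance sigma_f^2(x, y)."""
    img = img.astype(float)
    size = 2 * M + 1
    mean = ndimage.uniform_filter(img, size)
    mean_sq = ndimage.uniform_filter(img ** 2, size)
    return float(np.mean(mean_sq - mean ** 2))      # E[x^2] - E[x]^2 per window

def edge_energy(img):
    """EDG: mean squared response of the two edge convolutions."""
    img = img.astype(float)
    e = ndimage.convolve(img, E1) + ndimage.convolve(img, E2)
    return float(np.mean(e ** 2))
```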

2.4.2 Geometric assessment

The assessment indicators of jitter geometric compensation are generally analysed through a comparison between geometrically corrected images produced with and without the compensation process. This study uses the checkpoint method to evaluate the accuracy of the geometrically corrected image. The numerical accuracy index of the geometrically corrected image is usually obtained by comparing the coordinates of the imagery points with the actual observed values. The root mean square error (RMSE) between the checkpoints and the corresponding imagery points is calculated; it reflects the degree of dispersion between the measured checkpoint values and the true values.

\[ RMSE = \sqrt{\frac{\sum_{i=1}^{n}\bigl(Z_i - Z_c\bigr)^{2}}{n}}. \tag{25} \]
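A direct implementation of Eq. (25) is given below; the sketch assumes NumPy, and the argument names are illustrative.

```python
# Checkpoint RMSE of Eq. (25); z_measured and z_reference are the image-derived
# and ground-truth checkpoint values (names are illustrative).
import numpy as np

def checkpoint_rmse(z_measured, z_reference):
    d = np.asarray(z_measured, float) - np.asarray(z_reference, float)
    return float(np.sqrt(np.mean(d ** 2)))
```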

3. Experiment and data analysis

3.1 Data introduction and processing operations

The experimental data in this study consist of complete MS-1 data covering the Dezhou region of Shandong province, China, within the scope of one complete image of the original panchromatic data. Dezhou is a large plain where the elevation difference is less than 100 m. A plain terrain is chosen to eliminate the effect of the registration error caused by differences in terrain elevation, which is expressed in Eq. (26).

\[ \Delta r = \bigl(\tan(\theta_i + \alpha) - \tan(\theta_j + \alpha)\bigr)\,\frac{\Delta H}{H + \Delta H}\cdot\frac{f}{\mathrm{pixelsize}}. \tag{26} \]

where H is the orbit altitude, ΔH is the terrain elevation difference, f is the camera focal length, pixelsize is the detector pixel size, α is the satellite pitch angle, and θ_i and θ_j are the field angles of the two CCD arrays, respectively. Given the MS-1 satellite parameters, the Δr between the CCD arrays of the panchromatic sensor can thus be neglected when the elevation difference is less than 100 m.
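The sketch below evaluates Eq. (26) for quick checks of whether Δr can be neglected for a given terrain. All numeric values in the usage example are placeholders, not the MS-1 parameters of Table 1.

```python
# Quick evaluation of the registration-error term of Eq. (26). The numeric
# values in the example call are placeholders, not the MS-1 parameters.
import math

def relief_parallax(theta_i, theta_j, pitch, dH, H, f, pixel_size):
    """Delta-r in pixels for an elevation difference dH (Eq. 26)."""
    return (math.tan(theta_i + pitch) - math.tan(theta_j + pitch)) \
           * dH / (H + dH) * f / pixel_size

# e.g. relief_parallax(theta_i=0.01, theta_j=-0.01, pitch=0.0,
#                      dH=100.0, H=500e3, f=5.0, pixel_size=10e-6)
```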

Figure 3(a) shows the MS-1 scanning mode, where S(p1) and S(p2) are the satellite positions at two sequential imaging times; the satellite moves from S(p1) to S(p2); the black and red lines are the imaging footprints of the PAN sensor at times S(p1) and S(p2), respectively; and P is a ground target that is captured by CCD-1 at time S(p1) and by CCD-2 at time S(p2). Thus, the ground terrain in stripe 1 is imaged by both CCD-1 and CCD-2 at different imaging times. These overlapping stripe images are used to detect AJ. Figures 3(b) and 3(c) show the experimental panchromatic image and the partial image of an intercepted overlapping region between adjacent CCD arrays, respectively.


Fig. 3 Experimental data. (a) MS-1 scanning mode; (b) experimental panchromatic image; (c) the partial image of an intercepted overlapping region between the adjacent CCD arrays.


The data processing operations are as follows (a sketch tying these steps together is given after the list):

  • 1) Overlapping images are extracted from adjacent CCD images.
  • 2) Overlapping image pairs are subjected to co-registration and dense matching based on the pixel-level phase correlation algorithm.
  • 3) The false and abnormal matching results are eliminated, and the true value of the parallax disparity is estimated by using the DIROEF method.
  • 4) Jitter information, such as amplitude and frequency, is acquired through jitter analysis.
  • 5) Radiometric and geometric compensation are performed using the jitter information and the SVBM and VACM methods.
  • 6) The jitter compensation is assessed.
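The following sketch ties the detection steps together for one pair of already co-registered overlap images. It reuses the function sketches from Sec. 2 (phase_correlation_shift, estimate_noise_variance, estimate_diff_variance, diroef_clean), uses a simple row-wise patch tiling, and is an illustration only, not the authors' processing chain.

```python
# End-to-end sketch of steps 2-3 for one pair of co-registered overlap images
# (NumPy arrays). The patch tiling is a simplification; the reused functions
# are the illustrative sketches from Sec. 2, not the authors' code.
import numpy as np

def detect_jitter(overlap_a, overlap_b, patch=64):
    """Return the inlier mask and cleaned along-track parallax curve."""
    rows = overlap_a.shape[0] // patch
    along = []
    for r in range(rows):                               # dense matching, one patch per row band
        sl = slice(r * patch, (r + 1) * patch)
        dr, _ = phase_correlation_shift(overlap_a[sl, :patch], overlap_b[sl, :patch])
        along.append(dr)
    along = np.array(along)
    sn2 = estimate_noise_variance(along, seg_len=8)     # Sec. 2.2.2 parameters
    sr2 = estimate_diff_variance(along)
    keep, jitter_obs = diroef_clean(along, sn2, sr2)    # step 3: DIROEF cleaning
    return keep, jitter_obs                             # input to spectral analysis (step 4)
```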

3.2 Results of jitter detection

Using the pixel-level phase correlation algorithm [13], jitter in both the cross- and along-track directions can be detected. Although jitter appears in both directions, the parallax values in the along-track direction are more striking than those in the cross-track direction for the MS-1 satellite. This phenomenon occurs because the jitter components in the cross-track direction are small; thus, these components can hardly be identified through the adopted matching method. In the subsequent experiments, we neglect the jitter in the cross-track direction and mainly detect and compensate that in the along-track direction. The residual results calculated from the matching points of the eight CCD arrays are shown in Fig. 4(a). The jitter observation curve can be obtained with the help of DIROEF, as shown in Fig. 4(c) (black line) and Fig. 4(d).


Fig. 4 Results of jitter detection: (a) results of residuals, (b) results from the first iteration of DIROEF, (c) results from the last iteration of DIROEF, and (d) the jitter curve.


3.3 Results of jitter analysis

To further use the jitter data, we assume that the jitter value per pixel is composed of a number of jitter components with different amplitudes, frequencies and phases. We adopt an analysis method that proceeds from low to high frequency to calculate the information of each jitter component step by step: the method first calculates the low-frequency jitter information from the jitter data, subtracts the previously analysed results, and then calculates the higher-frequency jitter information from the remainder. The blue line in Fig. 5(a) represents the low-frequency jitter with an amplitude of 0.1 pixels and a frequency of 0.105 Hz. When the low-frequency data are subtracted from the jitter data, a jitter with an amplitude of 0.05 pixels and a frequency of 0.635 Hz can be found from the remaining data, as shown in Fig. 5(c) (blue line). Finally, the jitter component shown in Fig. 5(e) has an amplitude of 0.05 pixels and a frequency of 4 Hz. Note that the interval time of each line is approximately 0.8 ms.
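A possible implementation of this low-to-high frequency analysis is sketched below: the dominant frequency of the residual is found by FFT, a single cosine is fitted by linear least squares, subtracted, and the procedure is repeated. The fitting scheme is our assumption; the paper describes the procedure only qualitatively. line_interval is the per-line imaging interval (about 0.8 ms here).

```python
# Sketch of the low-to-high frequency decomposition of Sec. 3.3. The least-
# squares cosine fitting is our assumption; the paper states only that each
# component is estimated and subtracted in turn.
import numpy as np

def extract_components(jitter, line_interval, n_components=3):
    t = np.arange(len(jitter)) * line_interval
    residual = np.asarray(jitter, dtype=float).copy()
    components = []
    for _ in range(n_components):
        spec = np.fft.rfft(residual - residual.mean())
        freqs = np.fft.rfftfreq(len(residual), d=line_interval)
        w = freqs[np.argmax(np.abs(spec[1:])) + 1]            # dominant non-DC frequency
        # linear least squares for a*cos(2*pi*w*t) + b*sin(2*pi*w*t)
        A = np.column_stack([np.cos(2 * np.pi * w * t), np.sin(2 * np.pi * w * t)])
        (a, b), *_ = np.linalg.lstsq(A, residual, rcond=None)
        amp, phase = np.hypot(a, b), np.arctan2(-b, a)
        components.append((amp, w, phase))                    # (A0_i, omega_i, f0_i) as in Eq. (15)
        residual = residual - A @ np.array([a, b])            # subtract and analyse the remainder
    return components
```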


Fig. 5 Jitter analysis: (a) low-frequency jitter, (b) remaining data when the low-frequency data are subtracted from the jitter data, (c) middle-frequency jitter, (d) remaining data when the low- and middle-frequency data are subtracted from the jitter data, (e) high-frequency jitter.


3.4 Results of jitter compensation and assessment

Radiometric compensation was performed with the help of the compensation method discussed in Sec. 2.3 and the jitter information presented in Sec. 3.3. Several typical feature terrains, such as a city region, farmland, water edge and bush, were selected to show the effect of this compensation. The visual impression illustrates that the compensated image is clearer than the original image, as shown in Fig. 6.


Fig. 6 Visual impression of the image: (a–f) region image with (down) or without (up) jitter compensation.


The index value of each region image was subsequently obtained through the radiometric OEI described in Sec. 2.4.1, as shown in Table 2. The results show that the objective indices of the compensated images of the different typical feature terrains are greatly improved. These results are consistent with the visual impression.


Table 2. Objective indices of the region image with or without jitter compensation.

The jitter information was used in the corrections of the CCD viewing angles with the help of the compensation method in Sec. 2.3 and the jitter information in Sec. 3.3. After geometric processing, 54 control points were selected for the evaluation of geometric quality; their distribution is shown in Fig. 7. The RMSE between the checkpoints and the corresponding imagery points is calculated, as presented in Table 3. The results illustrate that the geometric accuracy in the along-track direction improves when the jitter compensation is added to the geometric processing. After jitter compensation, the magnitude of the accuracy in the along-track direction becomes similar to that in the cross-track direction. (Because the jitter in the cross-track direction of the MS-1 satellite is smaller than that in the along-track direction, the geometric accuracy in the cross-track direction is better than that in the along-track direction in the original data.) The results demonstrate that the along-track jitter is well compensated.


Fig. 7 Distributions of the 54 control points.



Table 3. RMSE of the checkpoints with and without jitter compensation (m)

For efficiency assessment, we conducted AJ detection and compensation on a computer with an Intel i7-2600 CPU and 16 GB of RAM and recorded the processing time. The computation times of jitter detection and compensation for an image are 46.78 s and 23.12 s, respectively. This demonstrates that the computational cost of our method is low.

4. Discussion

The main source of AJ is platform vibration, which refers to the movement of moving parts (e.g., the momentum wheels of the satellite). Thus, AJ of a satellite always exists. In the past, because the resolution of sensors was not very high, the deterioration of imaging quality caused by AJ was not obvious. Given the recent development of high-resolution (HR) optical satellites, the impact of AJ has become increasingly apparent. The direct method of AJ estimation is to install high-frequency attitude measuring devices on the satellite, such as angular velocity transducers and angular displacement transducers, and some scholars have studied this approach. However, the universality and applicability of this method are limited, mainly because such devices are rarely installed on satellites owing to their high cost. As far as we know, the majority of remote-sensing satellites currently in use are not equipped with these devices. Comparatively, sensors that can obtain short-time asynchronous images (e.g., multispectral sensors and panchromatic sensors consisting of multiple CCD arrays) are usually standard devices on HR satellites. Therefore, our proposed method is a general method that can be applied to other HR satellite configurations. A subtle difference in processing is that the image size and interval of the asynchronous images should be taken into consideration for other HR satellites. The MS-1 satellite is also not equipped with high-frequency attitude measuring devices. Thus, our proposed AJ detection method is the only available and effective option for MS-1.

In our method, the accuracy of AJ estimation mainly depends on the accuracy of registration. Thus, images of ground targets that are spatially uniform should not be chosen for AJ detection, in order to reduce the registration error as much as possible. Certainly, if the number of false matching points is much smaller than that of correct ones (the proportion of false matching points is less than 20%), the proposed DIROEF method works well and obtains the expected results. In addition, a relatively plain terrain also needs to be chosen to eliminate the effect of the registration error caused by differences in terrain elevation. In summary, high-accuracy registration can guarantee the accuracy of AJ detection and lead to excellent AJ compensation results. Owing to the applicability of the employed sub-pixel phase correlation registration method, our proposed correction method can be applied to multispectral sensors, including the red, green, blue and NIR spectral ranges, and to panchromatic sensors. However, future studies are recommended on other spectral ranges and amplitudes, such as hyper-spectral sensors.

With the help of our proposed method, we further detected AJ in other images captured by the panchromatic sensor of MS-1 at different times, and we find that the frequency and amplitude of AJ are unstable over a long period of time. The main reason for this instability is that the varying satellite vibration caused by the irregular movement of moving parts changes the frequency and amplitude of AJ. However, the experiments also show that the frequency and amplitude of AJ are stable over a short period of time, such as the period of one stripe image (including 120 scene images). Therefore, the problem of the instability of AJ over long periods can be solved by regular AJ detection. In an actual processing system, owing to the low computation time of our method, regular AJ detection does not affect the overall efficiency of the image processing of MS-1.

5. Conclusion

This study presented an approach to the jitter detection, compensation and result assessment of the MS-1 satellite. Jitter detection based on short-time asynchronous satellite images was adopted because of the structure of the multiple CCD arrays and the fixed interval between adjacent arrays in the panchromatic sensor. A new mathematical method named DIROEF was proposed; it plays an important role in eliminating the false or abnormal matching points of the registration processing and in estimating the true value of the jitter, thereby improving the detection accuracy. After analysing the jitter information, the SVBM and VACM methods were used to compensate for the radiometric and geometric effects of jitter on the image, respectively. The radiometric objective evaluation indices and the checkpoint RMSE were then utilised to evaluate the radiometric and geometric compensation effects, respectively. The results for the experimental data demonstrated that our framework for jitter detection and compensation effectively addresses the jitter of the MS-1 satellite. This research is an application of satellite jitter processing technology and contributes to the technical foundation for the launch of the new Chinese MS-1 satellite. We believe this framework can be considered a kind of pre-processing before quantitative studies and should therefore be useful for other land-observation satellites.

Acknowledgments

The authors thank the editors and the reviewers for their constructive and helpful comments, which substantially improved the paper, and also thank the TH-Centre of China for providing the experimental images and data. This research is financially supported by the State Key Program of the National Natural Science Foundation of China (Grant No. 61331017), the High Resolution Earth Observation Systems of National Science and Technology Major Projects (Grant No. 05-Y30B02-9001-13/15-03) and the Chinese National Natural Science Foundation (Grant No. 41001214).

The first author conceived the study, proposed the DIROEF method and performed the experiments; the second author helped perform the analysis with constructive discussions; the third author helped provide the experimental data and develop the technical flow of the method; the fourth author wrote the project program and contributed to manuscript preparation.

References and links

1. M. L. Gao, W. J. Zhao, Z. N. Gong, H. L. Gong, Z. Chen, and X. M. Tang, “Topographic Correction of ZY-3 Satellite Images and Its Effects on Estimation of Shrub Leaf Biomass in Mountainous Areas,” Remote Sens. 6(4), 2745–2764 (2014). [CrossRef]  

2. Q. Liu, Y. Zhang, G. Liu, and C. Huang, “Detection of quasi-circular vegetation community patches using circular Hough transform based on ZY-3 satellite image in the Yellow River Delta, China,” in Proc. IGARSS, 2149–2151 (2013).

3. S. Yu, M. Berthod, and G. Giraudon, “Toward robust analysis of satellite images using map information-application to urban area detection,” IEEE Trans. Geosci. Remote Sensing 37(4), 1925–1939 (1999). [CrossRef]  

4. D. S. Lee, J. C. Storey, M. J. Choate, and R. W. Hayes, “Four years of Landsat-7 on-orbit geometric calibration and performance,” IEEE Trans. Geosci. Rem. Sens. 42(12), 2786–2795 (2004). [CrossRef]  

5. J. C. Storey, M. J. Choate, and D. J. Meyer, “A geometric performance assessment of the EO-1 advanced land imager,” IEEE Trans. Geosci. Rem. Sens. 42(3), 602–607 (2004). [CrossRef]  

6. Y. Teshima and A. Iwasaki, “Correction of attitude fluctuation of Terra spacecraft using ASTER/SWIR imagery with parallax observation,” IEEE Trans. Geosci. Rem. Sens. 46(1), 222–227 (2008). [CrossRef]  

7. C. C. Liu, “Processing of FORMOSAT-2 daily revisit imagery for site surveillance,” IEEE Trans. Geosci. Rem. Sens. 44(11), 3206–3214 (2006). [CrossRef]  

8. H. M. Peng, C. S. Liao, and J. K. Hwang, “Performance testing of time comparison using GPS-smoothed P3 code and IGS ephemerides,” IEEE Trans. Instrumentation and Measurement 54(2), 825–828 (2005). [CrossRef]  

9. X. Tong, Z. Ye, Y. Xu, X. Tang, S. Liu, L. Li, H. Xie, F. Wang, T. Li, and Z. Hong, “Framework of Jitter Detection and Compensation for High Resolution Satellites,” Remote Sens. 6(5), 3944–3964 (2014). [CrossRef]  

10. S. Mattson, M. Robinson, A. McEwen, A. Bartels, E. Bowman-Cisneros, R. Li, J. Lawver, T. Tran, K. Paris, and Lroc Team, “Early Assessment of Spacecraft Jitter in LROC-NAC,” In Proceedings of the 41st Lunar and Planetary Institute Science Conference, The Woodlands, TX, USA, 1–5 March 2010.

11. V. Amberg, C. Dechoz, L. Bernard, D. Greslou, F. de Lussy, and L. Lebegue, “In-flight attitude perturbances estimation: Application to PLEIADES-HR satellites,” Proc. SPIE 8866, 886612 (2013). [CrossRef]  

12. A. Iwasaki, “Detection and Estimation Satellite Attitude Jitter Using Remote Sensing Imagery,” In Advances in Spacecraft Technologies, Hall, J., Ed., InTech: Rijeka, Croatia, 13, 257–272 (2011).

13. H. Foroosh, J. B. Zerubia, and M. Berthod, “Extension of phase correlation to subpixel registration,” IEEE Trans. Image Process. 11(3), 188–200 (2002). [CrossRef]   [PubMed]  

14. T. B. Ma, Y. F. Guo, and Y. F. Li, “Precision of row frequency of scientific grade TDICCD camera,” Opt. Precision Eng. 18(9), 2028–2035 (2010).

15. B. Dhanasekar and B. Ramamoorthy, “Restoration of blurred images for surface roughness evaluation using machine vision,” Tribol. Int. 43(1–2), 268–276 (2010). [CrossRef]  

16. P. Schwind, R. Müller, G. Palubinskas, and T. Storch, “An in-depth simulation of EnMAP acquisition geometry,” ISPRS J. Photogramm. Remote Sens. 70, 99–106 (2012). [CrossRef]  

17. S. Liu, C. S. Fraser, C. Zhang, M. Ravanbakhsh, and X. Tong, “Geo-referencing performance of THEOS satellite imagery,” Photogramm. Rec. 26(134), 250–262 (2011). [CrossRef]  
