Abstract
In this paper, we propose a novel formulation for building pixelwise alignments between remote sensing images under nonrigid transformation by matching both sparsely and densely sampled features. Our formulation contains two coupled variables: the nonrigid geometric transformation and the discrete dense flow field. To match sparse features, we fit a geometric transformation specified in a reproducing kernel Hilbert space and impose a locally linear constraint to regularize the transformation. To match dense features, we compute a dense flow field using a formulation analogous to scale-invariant feature transform (SIFT) flow, which allows nonrigid matching across different scene appearances. An additional term is introduced to ensure coherence between the two variables, and we alternately solve for one variable under the assumption that the other is known. Extensive experiments on both synthetic and real remote sensing images demonstrate that our approach greatly outperforms state-of-the-art methods, particularly when the data contain severe degradations.
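The sparse-feature step described above, fitting a nonrigid transformation in a reproducing kernel Hilbert space, can be illustrated with a minimal kernel ridge regression sketch. This is a simplified stand-in, not the authors' exact formulation: it uses a Gaussian kernel, a plain Tikhonov regularizer in place of the paper's locally linear constraint, and hypothetical names (`fit_rkhs_transform`, parameters `lam`, `beta`).

```python
import numpy as np

def gaussian_kernel(X, Y, beta=2.0):
    # Gram matrix of the Gaussian kernel that induces the RKHS.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * beta ** 2))

def fit_rkhs_transform(src, dst, lam=0.1, beta=2.0):
    """Fit a displacement field T(x) = x + sum_i K(x, x_i) c_i by
    kernel ridge regression: solve (K + lam*I) C = dst - src.
    (Illustrative only: the paper regularizes with a locally linear
    constraint rather than this plain ridge penalty.)"""
    K = gaussian_kernel(src, src, beta)
    C = np.linalg.solve(K + lam * np.eye(len(src)), dst - src)
    return lambda X: X + gaussian_kernel(X, src, beta) @ C
```

In the full method this fit would be one half of the alternating scheme: hold the dense flow field fixed, refit the transformation on the sparse correspondences, then hold the transformation fixed and recompute the flow, with the coherence term coupling the two updates.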
© 2016 Optical Society of America
Chengyin Liu, Jiayi Ma, Yong Ma, and Jun Huang
J. Opt. Soc. Am. A 33(7) 1267-1276 (2016)