A Robust Delaunay Triangulation Matching for
Multispectral/Multidate Remote Sensing
Image Registration
Ming Zhao, Bowen An, Yongpeng Wu, Member, IEEE, Boyang Chen, and Shengli Sun
Abstract—A novel dual-graph-based matching method is pro-
posed in this letter, particularly for multispectral/multidate images with low overlapping areas, similar patterns, or large
transformations. First, scale invariant feature transform based
matching is improved by normalizing gradient orientations and
maximizing the scale ratio similarity of all corresponding points.
Next, Delaunay graphs are generated for outlier removal, and candidate outliers are selected by comparing the structural differences of the Delaunay graphs. In order to bring back the inliers removed in Delaunay triangulation matching iterations and to exclude the remaining outliers, a recovery strategy based on the dual graph of the Delaunay triangulation is explored: inliers located in the corresponding Voronoi cells are recovered into the residual sets. The
experimental results demonstrate the accuracy and robustness of
the proposed algorithm for various representative remote sensing
images.
Index Terms—Delaunay triangulation (DT), graph matching,
image registration, multispectral/multidate images.
I. INTRODUCTION
IMAGE registration has been widely applied to remote sens-
ing, computer vision, medical image processing, and pattern
recognition. It refers to the vital process of remapping the same
regions in two or more images and unifying them into the same
coordinate system [1]. Registration of multispectral/multidate
remote sensing images remains a challenge for reliable feature matching, where the pixel intensities in the same region are quite different or even inverse [2]. It is even more difficult to achieve effective registration for images with deformations such as rotation, scaling, and shear transformations. Two further interfering factors are the low overlapping areas and similar patterns that generally appear in remote sensing images with a large field of view.
Manuscript received June 26, 2014; revised August 17, 2014 and September 10,
2014; accepted September 11, 2014. Date of publication October 2, 2014; date
of current version October 31, 2014. This work was supported in part by the
National Natural Science Foundation of China under Grants 61302132 and
61171126, the Shanghai Educational Development Foundation under Grant
13CG51, the Shanghai Key Support Project under Grant 12250501500, and
the Ministry of Transportation Applied Basic Research Projects under Grant
2014329810060. (Corresponding authors: Ming Zhao and Boyang Chen.)
M. Zhao is with the Department of Logistics Engineering, Shanghai Mar-
itime University, Shanghai 201306, China (e-mail: mingzhao@shmtu.edu.cn).
B. An is with the Department of Information Engineering, Shanghai Mar-
itime University, Shanghai 201306, China (e-mail: bwan@shmtu.edu.cn).
Y. Wu is with the Institute for Digital Communications, Universität Erlangen-
Nürnberg, 91058 Erlangen, Germany (e-mail: yongpeng.wu@LNT.de).
B. Chen is with the Office of System Development, National Satellite
Meteorological Center, Beijing 100081, China (e-mail: chenby@cma.gov.cn).
S. Sun is with the Shanghai Institute of Technical Physics, Chinese Academy
of Sciences, Shanghai 200083, China (e-mail: palm_sum@sohu.com).
Color versions of one or more of the figures in this paper are available online
at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/LGRS.2014.2359518
Many approaches have been proposed in the literature re-
garding the intensity similarity, feature spatial distribution, and
feature symbolic description. Scale invariant feature transform
(SIFT) [3] has been regarded as a powerful intensity-based approach for obtaining matching candidates. Despite being invariant to rotation and scaling, SIFT matching is not sufficient for multispectral images with intrinsic differences
or multidate images with illumination changes. Gonçalves et al.
[4] presented an automatic image registration approach which
combines image segmentation and SIFT. Maximally stable ex-
tremal regions were utilized to detect more reliable regions for
significantly different spectral content. However, this approach performs well only when suitable SIFT features in the segmented regions of multispectral images are extracted and matched by reliable strategies. Feature spatial distribution combined with
feature similarity is taken into consideration in recent image
registration algorithms [5]–[8]. Aguilar et al. [5] and Zhao et al. [8] respectively developed graph transformation matching (GTM) and bilateral K nearest neighbor (KNN) spatial orders around geometric centers, both of which remove outliers using KNN graphs with distance constraints. However, the KNN constraint
is inadequate for the outliers with the same graph structure.
The restricted spatial orders constraints algorithm proposed by
Liu et al. [6] integrates spatial order constraints and trans-
formation error restrictions into KNN graph matching. The
weighted GTM proposed by Izadi et al. [7] utilizes angular
distances as matching weights to improve KNN descriptors.
Nevertheless, the comparison of complicated graph structures is
time-consuming, and many inliers may be removed arbitrarily
in early iterations of graph matching.
In this letter, an approach called robust Delaunay trian-
gulation matching (RoDTM) is proposed for multispectral/
multidate remote sensing image registration. First, an improved
SIFT is introduced to establish adequate one-to-one matches.
Then, Delaunay triangulation matching (DTM) is formulated
as a comparison of Delaunay graph structures. Finally, the Voronoi diagram derived from the Delaunay graph is employed to recover the inliers deleted in DTM iterations. Since RoDTM employs techniques based on unique spatial triangulations among improved
SIFT feature points, the performance of feature point matching
is significantly improved.
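To make the overall flow concrete, the following is a minimal Python sketch of the first two stages, assuming OpenCV (cv2.SIFT_create, cv2.BFMatcher) for feature extraction and SciPy (scipy.spatial.Delaunay) for the triangulation; the ratio test, the structural-difference criterion, and all thresholds are illustrative assumptions rather than the authors' implementation, and the improved SIFT modifications and the Voronoi-based inlier recovery of RoDTM are omitted.

import cv2
import numpy as np
from scipy.spatial import Delaunay


def sift_one_to_one_matches(img1, img2, ratio=0.75):
    # Plain SIFT matching with Lowe's ratio test; the gradient-orientation
    # normalization and scale-ratio maximization of the improved SIFT are
    # not reproduced here.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    return pts1, pts2


def delaunay_neighbors(points):
    # Neighbor sets induced by the Delaunay triangulation of a 2-D point set.
    tri = Delaunay(points)
    neighbors = [set() for _ in range(len(points))]
    for simplex in tri.simplices:
        for i in simplex:
            neighbors[i].update(int(j) for j in simplex if j != i)
    return neighbors


def dtm_outlier_removal(pts1, pts2, max_iter=100):
    # Simplified stand-in for the graph-structure comparison: iteratively drop
    # the correspondence whose Delaunay neighborhood differs most between the
    # two images until the two graphs agree.
    keep = np.arange(len(pts1))
    for _ in range(max_iter):
        if len(keep) < 4:            # too few points for a meaningful triangulation
            break
        n1 = delaunay_neighbors(pts1[keep])
        n2 = delaunay_neighbors(pts2[keep])
        diff = [len(a ^ b) for a, b in zip(n1, n2)]
        worst = int(np.argmax(diff))
        if diff[worst] == 0:         # identical graph structures: stop iterating
            break
        keep = np.delete(keep, worst)
    return keep                      # indices of tentatively retained matches

The recovery stage described above would operate analogously on the dual graph, e.g., by testing whether a removed correspondence falls into mutually corresponding Voronoi cells (scipy.spatial.Voronoi) of the retained point sets.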
II. RODTM
A. SIFT Modifications for Remote Sensing Images
Assume that we have two input remote sensing images in which significant points have been extracted by SIFT-based feature detections. The SIFT keypoints for the input images