HYPERSPECTRAL IMAGE CLASSIFICATION BASED ON SPECTRA DERIVATIVE
FEATURES AND LOCALITY PRESERVING ANALYSIS
Zhen Ye 1 (yzh525@gmail.com), Mingyi He 1 (myhe@nwpu.edu.cn),
James E. Fowler 2 (fowler@ece.msstate.edu), Qian Du 3 (du@ece.msstate.edu)
1 Shaanxi Key Lab. of Information Acquisition and Processing, School of Electronics and Information
Northwestern Polytechnical University, Xi'an, 710129, China
2 Geosystems Research Institute, Mississippi State University, Starkville, MS, USA
3 Department of Electrical and Computer Engineering, Mississippi State University, Starkville, MS, USA
ABSTRACT
High spectral resolution and correlation hinder the applica-
tion of traditional hyperspectral classification methods in the
spectral domain. To address this problem, derivative infor-
mation is studied in an effort to capture salient features of
different land-cover classes. Two locality-preserving dimen-
sionality-reduction methods, specifically locality-preserving nonnegative matrix factorization and local Fisher discriminant analysis, are incorporated to preserve the local
structure of neighboring samples. Since the statistical dis-
tribution of classes in hyperspectral imagery often has a complicated multimodal structure, classifiers based on a Gaus-
sian mixture model are employed after feature extraction
and dimension reduction. Finally, the classification results in
the spectral as well as derivative domains are fused by a
logarithmic-opinion-pool rule. Experimental results demon-
strate that the proposed algorithms improve classification
accuracy even with small training-sample sizes.
Index Terms—Spectral derivative, locality-preserving analysis, hyperspectral image classification
1. INTRODUCTION
Hyperspectral imagery (HSI) is acquired in hundreds of contiguous spectral bands. Because such data are voluminous and highly correlated, the generalization capability of statistical classifiers may be reduced [1]. Dimensionality-reduction algorithms are typically employed to address
this problem. Traditional approaches include unsupervised
methods such as principal component analysis (PCA), as
well as supervised methods, such as Fisher’s linear discrimi-
nant analysis (LDA) [2]. PCA is designed to minimize mean
square error between the original and reduced spaces while
LDA is designed to find a projection maximizing the class
separation in a lower-dimensional space [3].
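To make the two baselines concrete, the following sketch (using scikit-learn on placeholder data; it is illustrative only and not the code or datasets used in this paper) reduces a stack of pixel spectra with PCA and with LDA:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.random((500, 200))           # placeholder: 500 pixels, 200 spectral bands
y = rng.integers(0, 5, 500)          # placeholder: 5 land-cover classes

# Unsupervised: PCA keeps the directions of maximum variance, which is
# equivalent to minimizing mean-square reconstruction error.
X_pca = PCA(n_components=10).fit_transform(X)

# Supervised: LDA maximizes between-class scatter relative to within-class
# scatter; its output dimension is at most (number of classes - 1).
X_lda = LinearDiscriminantAnalysis(n_components=4).fit(X, y).transform(X)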
PCA and LDA assume that the class-conditional distri-
butions are Gaussian. However, real-life observational data
are often not Gaussian and, in extreme cases, may be mul-
timodal. In this work, two locality-preserving dimensionali-
ty-reduction methods based on locality-preserving
non-negative matrix factorization (LPNMF) [4] and local Fisher discriminant analysis (LFDA) [5] are studied to exploit the rich statistical structure of HSI data. In [6], Li et
al. argued that LFDA-based dimensionality reduction fol-
lowed by Gaussian-mixture-model (GMM) [7] classifiers
captures the underlying statistical structure accurately and obtains superior classification performance. Al-
though GMM combined with LFDA (LFDA-GMM) has
proven to be an effective approach for classification, it still
suffers from the fact that the classifier exploits only spectral
information. In real applications, spectral reflectance is not
exactly identical to laboratory measurements due to, e.g.,
illumination and atmospheric effects [8].
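As a concrete illustration of the GMM-classification step, the sketch below fits one Gaussian mixture per class to features that have already been reduced (e.g., by LFDA or LPNMF) and labels each test sample by maximum class log-likelihood; the function name, mixture size, and regularization are assumptions for illustration rather than the exact formulation of [6], [7]:

import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_classify(Z_train, y_train, Z_test, n_components=3):
    # Fit one class-conditional Gaussian mixture per class.
    classes = np.unique(y_train)
    models = {
        c: GaussianMixture(n_components=n_components,
                           covariance_type='full',
                           reg_covar=1e-4).fit(Z_train[y_train == c])
        for c in classes
    }
    # Per-sample log-likelihood under each class-conditional mixture;
    # predict the class whose mixture explains the sample best.
    loglik = np.column_stack([models[c].score_samples(Z_test) for c in classes])
    return classes[np.argmax(loglik, axis=1)], loglik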
This paper proposes two classification methods, called D-LPNMF-GMM-Fusion and D-LFDA-GMM-Fusion, which
are based upon spectral-derivative features and locali-
ty-preserving analysis. In this work, derivative information
of each pixel is extracted at different orders. LPNMF and LFDA are used to reduce the dimension in the spectral and derivative domains, respectively. Following this, GMM classifiers are applied to obtain local labels. Finally, the
classification results from the spectral and derivative do-
mains are fused by a logarithmic-opinion-pool (LOGP) [9] decision-fusion rule. The experimental results demonstrate that the proposed system outperforms traditional algorithms, including LDA coupled with maximum-likelihood estimation (LDA-MLE), LPNMF combined with GMM (LPNMF-GMM), LFDA-GMM [6], and a support vector machine (SVM) [10].
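The following sketch illustrates the two additional components of the proposed system: finite-difference spectral derivatives and LOGP fusion of per-domain class posteriors; the function names and uniform weights are illustrative assumptions, not the authors' exact implementation:

import numpy as np

def spectral_derivative(X, order=1):
    # Difference each pixel spectrum along the band axis 'order' times.
    D = X
    for _ in range(order):
        D = np.diff(D, axis=1)
    return D

def logp_fusion(posteriors, weights=None):
    # Fuse a list of (n_samples x n_classes) posterior arrays by a
    # logarithmic opinion pool, i.e., a weighted sum of log-posteriors.
    if weights is None:
        weights = np.full(len(posteriors), 1.0 / len(posteriors))
    log_pool = sum(w * np.log(p + 1e-12) for w, p in zip(weights, posteriors))
    return np.argmax(log_pool, axis=1)

# Example: fuse decisions from the spectral and first-derivative domains,
# where P_spec and P_deriv are posteriors from the per-domain GMM classifiers.
# fused_labels = logp_fusion([P_spec, P_deriv])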
The remainder of this paper is organized as follows.
Sec. II describes the two locality-preserving dimensionali-
ty-reduction techniques, LPNMF and LFDA. In Sec. III, we
describe the formulation of the proposed classification sys-
tem. A detailed discussion of the experimental results is pre-
sented in Sec. IV. Finally, Sec. V concludes the paper.
2. LOCALITY-PRESERVING ANALYSIS
2.1. LPNMF
LPNMF combines the advantages of non-negative matrix
factorization (NMF) and locality-preserving projection (LPP)
[11], which leads to a parts-based representation using only