Semi-Supervised Subspace Clustering via Non-Negative Low-Rank Representation for Hyperspectral Images

Jipan Yang†, Dexiang Zhang, Teng Li, Yan Wang, Qing Yan1

Abstract—Hyperspectral images (HSIs) have inherent complexity, so clustering HSIs is a very challenging task. In this paper, we utilize a semi-supervised subspace clustering method based on the non-negative low-rank representation (NNLRR) algorithm for HSI clustering. Firstly, NNLRR incorporates Gaussian fields and harmonic functions into the low-rank representation (LRR) model. Secondly, NNLRR guides the construction of the affinity matrix with the supervision information. Next, it seeks a non-negative low-rank matrix that represents each sample as a linear combination of the other samples, and the affinity matrix is obtained from this representation matrix. The affinity matrix construction and the subspace clustering are thus accomplished simultaneously. Thanks to the unification of these two steps, the overall optimum can be guaranteed. Experimental results on a classical data set show that the algorithm is effective for hyperspectral image clustering.
Index Terms: hyperspectral image, non-negative
low-rank representation, Gaussian fields and harmonic
functions
I. INTRODUCTION
HSIs have wide applications in real life [1-3], such as urban planning, surveillance, and agriculture. However, as is well known, HSI data are difficult to process because of their high dimensionality, which is prone to cause the curse of dimensionality [4, 5].
Recently, various clustering methods with different working mechanisms have been proposed for HSIs. For example, the k-means [6] algorithm, a classical unsupervised learning method, has been applied to HSI clustering. However, k-means is a centroid-based clustering method, which can be hampered by its large computational complexity. In addition, Fuzzy c-means (FCM) [7] is also widely used for HSIs.
1 Corresponding author.
† The first author.
J. Yang, D. Zhang, and T. Li are with the College of Electrical Engineering and Automation, Anhui University, Hefei, Anhui 230601, China.
Q. Yan is with the College of Computer Science and Technology and the College of Electrical Engineering and Automation, Anhui University, Hefei, Anhui 230601, China (e-mail: 21429717@qq.com).
Y. Wang is with the College of Computer Science and Technology, Anhui University, Hefei, Anhui 230601, China.
FCM improves on the traditional clustering algorithm, but it is sensitive to the initial cluster centers, requires the number of clusters to be determined in advance, and easily falls into locally optimal solutions. Besides these, Sparse subspace clustering (SSC) [8] maps the essential features of high-dimensional data to a lower-dimensional subspace and then clusters data points that lie in a union of low-dimensional subspaces. Unfortunately, these clustering methods share an obvious disadvantage: because they use only unlabeled samples, they exploit no prior information. Labeled samples, however, can yield discriminative self-expressive coefficients, which contribute greatly to exploiting the subspace structure. For reference, a minimal unsupervised baseline is sketched below.
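The following is a minimal sketch of the purely unsupervised k-means baseline discussed above, applied to an HSI cube reshaped into a pixel-by-band matrix. The cube size, the number of clusters, and the scikit-learn calls are illustrative assumptions rather than the experimental setup of this paper.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical HSI cube: height x width x spectral bands (stand-in for a real scene).
hsi = np.random.rand(50, 50, 103)
pixels = hsi.reshape(-1, hsi.shape[-1])      # one row per pixel, one column per band

# Plain k-means on raw spectra: no labels and no subspace structure are used.
k = 9                                        # number of land-cover classes, assumed known
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)
cluster_map = labels.reshape(hsi.shape[:2])  # cluster index per pixel on the spatial grid
```

Its output depends on the random initialization and on knowing k in advance, which is exactly the kind of limitation that motivates the semi-supervised approach adopted here.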
In this paper, we utilize the non-negative low-rank representation (NNLRR) [9] algorithm, which was proposed by Fang et al. for clustering ordinary data, to cluster HSIs. The method integrates LRR and Gaussian fields and harmonic functions (GFHF) [10] into a single optimization problem: the supervision information guides the affinity matrix construction, and the affinity matrix construction and the subspace clustering are accomplished simultaneously, so the overall optimum can be guaranteed. The final clustering stage of such a pipeline is sketched below.
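For intuition, the following is a minimal sketch of the generic subspace-clustering pipeline that such a representation feeds into: a non-negative coefficient matrix Z (here a random stand-in, since the joint NNLRR optimization is not reproduced) is symmetrized into an affinity matrix and passed to spectral clustering. The symmetrization rule and the scikit-learn call are our assumptions, not the paper's exact procedure.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

n, k = 200, 5
# Stand-in for the non-negative low-rank coefficient matrix; in the real method,
# Z comes from solving the joint NNLRR optimization problem.
Z = np.abs(np.random.rand(n, n))

# Common affinity construction in subspace clustering: symmetrize the coefficients.
W = (Z + Z.T) / 2.0

# Spectral clustering on the precomputed affinity yields the final segmentation.
labels = SpectralClustering(n_clusters=k, affinity="precomputed",
                            random_state=0).fit_predict(W)
```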
We organize the remainder of this paper as follows. In Section II, we give a brief introduction to Gaussian fields and harmonic functions and to LRR. Section III describes the algorithm. Experiments and analyses are presented in Section IV. In Section V, we draw a conclusion about the algorithm and outline future work.
II. RELATED WORKS
In this section, we first introduce the low-rank representation (LRR) [11] algorithm. Then, we introduce the well-known semi-supervised learning method called Gaussian fields and harmonic functions.
A. Low-Rank Representation
Given a dictionary, the idea of LRR is to seek the lowest-rank representation among the many possible linear combinations of the bases. We assume that the observed data matrix $Y \in \mathbb{R}^{M \times N}$ is approximately drawn from a union of $c$ low-dimensional subspaces and contaminated by an error $E$. This gives the model shown in (1):

$$\min_{Z,E} \ \|Z\|_* + \lambda \|E\|_0 \quad \text{s.t.} \quad Y = AZ + E \qquad (1)$$

where $A = [a_1, a_2, \ldots, a_n]$ is the base matrix and $\lambda > 0$ balances the two terms. In general, the dictionary is chosen to be the data matrix $Y$ itself (i.e., $A = Y$). $\|\cdot\|_*$ represents the sum of the singular values of a matrix, called the nuclear norm, and $E$ is the error matrix. $\|\cdot\|_0$ is the number of nonzero entries, which serves as the sparsity measure. As we know, the $\ell_0$-norm cannot be optimized directly, since it is non-convex.
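To make the quantities in (1) concrete, here is a small numerical sketch (our illustration, not code from the paper): it builds a synthetic $Y = AZ + E$ with a low-rank coefficient matrix $Z$ and a sparse error $E$, then evaluates the nuclear norm and the $\ell_0$ measure that the objective balances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic instance of the LRR data model Y = A Z + E.
M, N, r = 30, 100, 4                       # ambient dims and true rank of Z
A = rng.standard_normal((M, N))            # dictionary (in LRR, often A = Y)
Z = rng.standard_normal((N, r)) @ rng.standard_normal((r, N))  # rank-r coefficients
E = np.zeros((M, N))
mask = rng.random((M, N)) < 0.05           # about 5% sparse corruptions
E[mask] = rng.standard_normal(mask.sum())
Y = A @ Z + E

# The two terms the LRR objective balances:
nuclear_norm = np.linalg.svd(Z, compute_uv=False).sum()  # ||Z||_* = sum of singular values
l0_measure = np.count_nonzero(E)                         # ||E||_0 = number of nonzero entries
print(f"||Z||_* = {nuclear_norm:.2f}, ||E||_0 = {l0_measure}")
```

Because the $\ell_0$ term is non-convex, practical LRR solvers typically replace it with a convex surrogate such as the $\ell_1$ or $\ell_{2,1}$ norm.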