This article has been accepted for inclusion in a future issue of this journal. Content is final as presented, with the exception of pagination.
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING
Fig. 2. Spatial noise removal mechanisms in SL-RAC and SNL-RAC. (a) Nonlocal similar patches in HSI. (b) SL-RAC-based denoising. (c) SNL-RAC-based denoising.
Lagrangian form

\hat{X} = \arg\min_{X} \|Y - X\|_F^2 + \mu\,\operatorname{rank}(X). \quad (2)
Equation (2) is equivalent to (1) with a proper parameter \mu. A common assumption in subspace analysis is that the estimated signals in an HSI are highly correlated among the spectral channels, whereas the noise is uncorrelated because of its random distribution.
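As a concrete illustration of (2): retaining a singular value \sigma_i of Y improves the Frobenius fit by \sigma_i^2 while adding \mu to the penalty, so the minimizer keeps exactly the singular values with \sigma_i^2 > \mu. A minimal NumPy sketch on synthetic data (all sizes and values are illustrative, not from the paper):

```python
import numpy as np

def rank_penalized_denoise(Y, mu):
    # Minimize ||Y - X||_F^2 + mu * rank(X): keep only the singular
    # values of Y whose squared magnitude exceeds the penalty mu.
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    keep = s**2 > mu
    return (U[:, keep] * s[keep]) @ Vt[keep]

rng = np.random.default_rng(0)
L = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 30))  # rank-2 signal
Y = L + 0.01 * rng.standard_normal((50, 30))                     # noisy observation
X = rank_penalized_denoise(Y, mu=1.0)
print(np.linalg.matrix_rank(X))  # recovers the rank-2 structure
```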
III. PROPOSED METHOD FOR HYPERSPECTRAL IMAGE DENOISING
It is well known that the sparse representation and LR (SRLR)-based method in [25] achieves excellent sparse noise removal performance. However, the method cannot completely remove heavy Gaussian noise, stripe noise, and mixed Poisson–Gaussian noise; this is because, for a heavily corrupted patch (one with too much noise), the spatial structure information may have been lost. Using only the single spatial local RAC (SL-RAC) makes it difficult to recover the clean HSI if no auxiliary information is introduced. To solve this problem, the spatial nonlocal RAC (SNL-RAC) is exploited, in which nonlocal similar patches are assembled into a cluster. As shown in Fig. 2(a) and (c), the patch represented by the red rectangle is the to-be-reconstructed patch, and the patches represented by the green rectangles are the most similar patches found in the image. The benefits of introducing the "cluster" concept here are twofold. First, the addition of similar patches brings in extra spatial information to help reconstruct the corrupted spatial structure in the to-be-reconstructed patch shown by the red rectangle. Second, the stripe noise is sparser over the whole group of patches than over the noisy patch itself. In this section, we incorporate the spatial nonlocal LR regularization into the spectral LR-based HSI denoising model to explore the SNL-RAC of the HSI, as presented in Fig. 3.
A. Analysis of the Spatial Low-Rank Property of HSI
HSI patches can be constructed in a similar fashion to conventional RGB images. We first segment the HSI X into many overlapped 3-D full-band patches (FBPs) of the size w \times w \times B and then collect these 3-D patches as a patch set \{X_i\}_{i=1}^{P} \in \mathbb{R}^{w \times w \times B} (w < M, w < N), where the patch number is P = (M - w + 1)(N - w + 1). Each constructed patch contains local spatial information while preserving the global spectral dimensionality, which helps us consider the two important properties that underlie an HSI: the nonlocal similarity in the spatial domain and the global correlation across all spectral bands. Extending the nonlocal method to HSIs, we employ the Euclidean distance as the measure of similarity to group similar FBPs into clusters denoted by x_k (k = 1, 2, \ldots, K). Therefore, x_k can be described as

x_k = R_k X \quad (3)
where R_k is the matrix that extracts patch x_k from X. Then, we introduce a linear transform operator T that reshapes the 3-D cubic patch cluster as a 2-D matrix, i.e., X_k = T(x_k). For simplicity, we use the same representation, X_k = R_k X. For each cluster, the cluster-based sparse representation is formulated as

\{D, \alpha\} = \arg\min_{\{D_k, \alpha_k\}} \sum_{k=1}^{K} \|R_k X - D_k \alpha_k\|_F^2 + \eta \|\alpha_k\|_1 \quad (4)
where D = \{D_1, D_2, \ldots, D_K\} and \alpha = \{\alpha_1, \alpha_2, \ldots, \alpha_K\}.
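The construction above — extracting overlapped FBPs from the cube and grouping, for a reference patch, the m most similar FBPs under the Euclidean distance into a cluster X_k = R_k X — can be sketched as follows. The toy sizes, the flattening, and the variable names are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

M, N, B, w, m = 8, 10, 4, 3, 5          # toy HSI size, patch width, cluster size
rng = np.random.default_rng(0)
X = rng.standard_normal((M, N, B))      # stand-in hyperspectral cube

# Overlapped 3-D full-band patches: every w x w window with all B bands.
patches = np.array([X[i:i + w, j:j + w, :].ravel()
                    for i in range(M - w + 1)
                    for j in range(N - w + 1)])   # shape (P, w*w*B)
P = len(patches)                                  # P = (M-w+1)(N-w+1) = 48 here

# Cluster for a reference patch: its m nearest neighbors in Euclidean distance.
ref = 0
d = np.linalg.norm(patches - patches[ref], axis=1)
idx = np.argsort(d)[:m]                 # indices of the m most similar FBPs
X_k = patches[idx].T                    # cluster matrix X_k, (w*w*B) x m
print(X_k.shape)                        # (36, 5)
```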
Mathematically, each cluster is composed of similar structures, which makes it essentially belong to a lower dimensional linear or affine subspace or manifold. Because of the over-redundant structures and features in a cluster, i.e., texture and detail information, each cluster can be represented reasonably in a low-dimensional space. We analyze a general linear representation model X_k = D_k \alpha_k, where X_k \in \mathbb{R}^{w^2 B \times m} is a cluster of similar patches, D_k \in \mathbb{R}^{w^2 B \times m} can be regarded as a dictionary, and \alpha_k \in \mathbb{R}^{m \times m} is the corresponding coefficient matrix.
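As a small side illustration of the sparse-coding term in (4): when the dictionary is orthogonal, the \eta\|\alpha_k\|_1-penalized fit has a closed-form minimizer given by soft-thresholding D_k^T X_k at \eta/2. The orthogonal dictionary here is a simplifying assumption made only for this sketch (learned dictionaries are generally not orthogonal):

```python
import numpy as np

def soft(z, t):
    # Elementwise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

rng = np.random.default_rng(1)
Dk, _ = np.linalg.qr(rng.standard_normal((16, 16)))   # orthogonal dictionary
Xk = rng.standard_normal((16, 8))                     # one cluster
eta = 0.4

# For orthogonal Dk, ||Xk - Dk a||_F^2 = ||Dk.T Xk - a||_F^2, so the
# l1-penalized minimizer is computed entrywise in closed form:
alpha = soft(Dk.T @ Xk, eta / 2)
```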
Based on the properties of the rank of matrix multiplication, we can derive the following inequality:

\operatorname{rank}(D_k \alpha_k) \le \min(\operatorname{rank}(D_k), \operatorname{rank}(\alpha_k)). \quad (5)
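A quick numerical check of (5), with the factor ranks fixed by construction (the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
D_k = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 10))      # rank 3
alpha_k = rng.standard_normal((10, 2)) @ rng.standard_normal((2, 10))  # rank 2
X_k = D_k @ alpha_k

r = np.linalg.matrix_rank
print(r(X_k), min(r(D_k), r(alpha_k)))  # rank of product never exceeds min
```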
Further, since X_k = D_k \alpha_k, (5) turns into

\operatorname{rank}(X_k) \le \min(\operatorname{rank}(D_k), \operatorname{rank}(\alpha_k)). \quad (6)
Explicitly, if D_k is of low rank, X_k will certainly satisfy the LR property, i.e., R_k X is of low rank in (4). However, we do not know any prior on the dimension of the subspace. Based on the above analysis, the LR property of R_k X can be characterized by the LR constraint on the base space D_k.
Therefore, during sparse representation and dictionary learning, we propose a nonlocal LR dictionary (NLRD) for the spatial low-rankness constraint, which can be formulated as

\{D, \alpha\} = \arg\min_{\{D_k, \alpha_k\}} \sum_{k=1}^{K} \|R_k X - D_k \alpha_k\|_F^2 + \eta \|\alpha_k\|_1 + \lambda\,\operatorname{rank}(D_k) \quad (7)
where λ is the regularization parameter.
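One common way to handle the nonconvex \lambda rank(D_k) penalty in (7) is to treat the dictionary update as a rank-penalized approximation and, as in (2), hard-threshold singular values. The sketch below is an illustrative heuristic under that assumption, not the authors' exact optimization scheme; `lowrank_dict_update` and all sizes are hypothetical:

```python
import numpy as np

def lowrank_dict_update(Xk, alpha_k, lam):
    # Fit D_k to Xk in the least-squares sense, then hard-threshold its
    # singular values, trading the Frobenius fit against lam * rank(D_k).
    D_ls = Xk @ np.linalg.pinv(alpha_k)           # unconstrained fit
    U, s, Vt = np.linalg.svd(D_ls, full_matrices=False)
    keep = s**2 > lam                             # keep values worth their cost
    return (U[:, keep] * s[keep]) @ Vt[keep]

rng = np.random.default_rng(3)
D_true = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 6))  # rank 2
A = rng.standard_normal((6, 8))                   # coefficients alpha_k
Xk = D_true @ A + 0.01 * rng.standard_normal((20, 8))

D_hat = lowrank_dict_update(Xk, A, lam=0.5)
print(np.linalg.matrix_rank(D_hat))
```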
To comprehend the NLRD model more clearly, we make a comparison between (4) and (7) (i.e., between the previous cluster-based sparse representation and dictionary learning for HSI denoising and our model). The differences between (4) and (7) consist in the approach and unit of dictionary learning. The advantages of (7) are mainly fivefold. First, since the cluster is composed of patches with