Block Recursive Least Squares Dictionary Learning Algorithm
Qianru Jiang^1, Sheng Li^1, Zeru Lu^1 and Binbin Sun^1
1. Zhejiang Provincial Key Laboratory for Signal Processing
College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023, P.R. China
E-mail: qrjiang1989@zjut.edu.cn
Abstract: A block recursive least squares (BRLS) dictionary learning algorithm, which deals with training data arranged in blocks, is proposed in this paper. BRLS can be used to update an overcomplete dictionary for sparse signal representation. Different from traditional recursive least squares algorithms, BRLS is designed for data in block form, and the recursion is derived without using the matrix inversion lemma. The proposed algorithm is applied to synthetic data and to real image reconstruction. Simulation results show that the new algorithm achieves better performance than traditional approaches.
Key Words: Overcomplete dictionary, recursive least squares, block, sparse representation.
1 Introduction
Sparse representation of signals is a popular research area in the signal processing community. Dictionary learning is the core issue of sparse representation, with applications in compression, super-resolution, classification, denoising, inpainting and source separation [1]-[3]. An over-complete dictionary D ∈ ℜ^{N×K} with N < K can be used to represent a signal x ∈ ℜ^{N×1} by a linear combination of a small number of its columns as follows:

x = Dα = ∑_{j=1}^{K} d_j α_j = d_1 α_1 + d_2 α_2 + ··· + d_K α_K,    (1)

where d_j is an atom (one column) of the dictionary. The vector α is the sparse coefficient of x under D, where α_j is its j-th entry and only s (s ≪ K) of them are nonzero.
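For concreteness, the representation model of Eq. (1) can be illustrated with a small NumPy sketch; the dimensions N, K and the sparsity level s below are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, s = 16, 32, 3  # illustrative sizes: N < K and s << K

# Over-complete dictionary D with unit-norm columns (atoms d_j)
D = rng.standard_normal((N, K))
D /= np.linalg.norm(D, axis=0)

# s-sparse coefficient vector alpha: only s of the K entries are nonzero
alpha = np.zeros(K)
support = rng.choice(K, size=s, replace=False)
alpha[support] = rng.standard_normal(s)

# Eq. (1): x is a linear combination of only s atoms of D
x = D @ alpha
```

The signal x lives in ℜ^N but is fully described by the s active atoms and their coefficients, which is what makes the representation sparse.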
The objective of optimal dictionary design is to make the total error between the sparse representations and their corresponding training set X ∈ ℜ^{N×L} as small as possible. Such an optimization problem can be expressed as

min_{D,A} ‖X − DA‖²_F,    (2)

where ‖X − DA‖²_F is the approximation or representation error, A = [α_1, α_2, ···, α_L] is the sparse matrix, α_l is the sparse coefficient of the l-th training datum, and ‖·‖_F denotes the Frobenius norm. A possible strategy for optimizing the sparse matrix and the dictionary is to update them alternately and iteratively. Each iteration can be divided into two stages:
• Stage I: Sparse representation

A = argmin_A ‖X − DA‖²_F,  s.t. ‖α_l‖_0 ≤ s,    (3)

with ‖α_l‖_0 denoting the number of nonzero elements in α_l. This optimization problem can be solved with greedy algorithms [4] such as matching pursuit (MP), basic matching pursuit (BMP) [5] and orthogonal matching pursuit (OMP) [6]. The OMP algorithm, which strikes a good balance between complexity and accuracy, is employed in this paper.

This work was supported by the Grants of NSFCs 61273195, 61473262 & 61503339, and ZJNSFs LY13F010009 & LQ14F030008.
• Stage II: Dictionary update

D = argmin_D ‖X − DA‖²_F,  s.t. ∀j ∈ [1:K], ‖d_j‖_2 = 1.    (4)
Several traditional methods have been proposed to solve problem (4), such as MOD [7], K-means and K-SVD [8]. This work focuses on the least squares (LS) based methods. The LS solution of (4) is provided by the MOD algorithm; the result is straightforward and simple. An adaptive dictionary learning algorithm based on the recursive least squares (RLS) algorithm is proposed in [11], where the training data are arranged in a vector-by-vector framework. It has been pointed out in [11] that the adaptive dictionary learning approach can provide a better dictionary with improved convergence properties. In addition, the adaptive approach can be used in scenarios where the training set is very large.
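As a concrete sketch of the two-stage alternation above (the classical OMP/MOD baseline, not the BRLS recursion proposed in this paper), the following NumPy code pairs a minimal OMP sparse-coding step (Stage I) with the MOD least-squares dictionary update (Stage II). The helper names omp and mod_update, the problem sizes, and the iteration count are illustrative assumptions.

```python
import numpy as np

def omp(D, x, s):
    """Stage I sketch: greedily approximate
    argmin_a ||x - D a||_2  s.t.  ||a||_0 <= s."""
    residual = x.copy()
    support = []
    for _ in range(s):
        # Select the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the selected atoms, then refresh the residual
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    alpha = np.zeros(D.shape[1])
    alpha[support] = coef
    return alpha

def mod_update(X, A):
    """Stage II, LS solution as in MOD: D = X A^T (A A^T)^(-1),
    followed by column normalization so that ||d_j||_2 = 1."""
    D = X @ A.T @ np.linalg.pinv(A @ A.T)
    return D / np.maximum(np.linalg.norm(D, axis=0), 1e-12)

# Synthetic training set X = D_true A_true with s-sparse columns
rng = np.random.default_rng(0)
N, K, L, s = 16, 32, 200, 3
D_true = rng.standard_normal((N, K))
D_true /= np.linalg.norm(D_true, axis=0)
A_true = np.zeros((K, L))
for l in range(L):
    idx = rng.choice(K, size=s, replace=False)
    A_true[idx, l] = rng.standard_normal(s)
X = D_true @ A_true

# Alternating minimization of (2): Stage I, then Stage II, repeatedly
D = rng.standard_normal((N, K))
D /= np.linalg.norm(D, axis=0)
for _ in range(10):
    A = np.column_stack([omp(D, X[:, l], s) for l in range(L)])  # Stage I
    D = mod_update(X, A)                                         # Stage II
err = np.linalg.norm(X - D @ A, 'fro') / np.linalg.norm(X, 'fro')
```

Note that MOD recomputes the dictionary from the whole training set at every iteration; the adaptive RLS-type algorithms discussed above instead fold new training data into the dictionary recursively, which is the setting the proposed BRLS extends to blocks.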
In this paper, an adaptive dictionary learning algorithm named BRLS (block recursive least squares) is developed, with the training data arranged in a block-by-block framework. The proposed BRLS algorithm enjoys the advantages of adaptive algorithms, and the block-based structure can further improve the reconstruction performance. The main contributions of this work are threefold:
1: The training set is arranged in a block-by-block fashion, which is more suitable for applications such as image processing [9], [10].
978-1-4673-9714-8/16/$31.00 © 2016 IEEE