Direct Discriminant Locality Preserving Projection
With Hammerstein Polynomial Expansion
Xi Chen, Jiashu Zhang, and Defang Li
Abstract—Discriminant locality preserving projection (DLPP)
is a linear approach that encodes discriminant information into
the objective of locality preserving projection and improves
its classification ability. To enhance the nonlinear description
ability of DLPP, we can optimize the objective function of DLPP
in a reproducing kernel Hilbert space to form a kernel-based
discriminant locality preserving projection (KDLPP). However,
KDLPP suffers from the following problems: 1) a large computational
burden; 2) the lack of an explicit mapping function, which further increases
the computational cost of projecting a new sample
into the low-dimensional subspace; and 3) the inability to obtain
the optimal discriminant vectors that would best optimize the
objective of DLPP. To overcome these weaknesses of KDLPP, in this
paper, a direct discriminant locality preserving projection with
Hammerstein polynomial expansion (HPDDLPP) is proposed.
The proposed HPDDLPP directly implements the objective of
DLPP in a high-dimensional second-order Hammerstein polynomial
space without matrix inversion, which extracts the optimal
discriminant vectors for DLPP without a large computational
burden. Compared with other related classical methods,
experimental results for face and palmprint recognition problems
indicate the effectiveness of the proposed HPDDLPP.
Index Terms—Direct discriminant locality preserving
projection, face and palmprint recognition, Hammerstein
polynomial expansion.
I. INTRODUCTION
DIMENSIONALITY reduction (DR) techniques have been
extensively studied for biometric feature extraction by
transforming high dimensional biometric data into meaning-
ful low dimensional representation. Such techniques can be
mainly divided into two classes: linear and nonlinear tech-
niques. Principal Component Analysis (PCA) [1] and Linear
Discriminant Analysis (LDA) [2] are two well-known linear
techniques that have been thoroughly studied and widely applied
to many fields, such as pattern recognition, computer vision, and
biometrics. However, the world is not always flat; linear DR
cannot adequately explore the nonlinear structure of the world [3].
A number of nonlinear DR techniques have been developed
to address this problem. One class of classical nonlinear
DR techniques comprises kernel-based methods, such as kernel
PCA (KPCA) [4] and kernel LDA (KLDA) [5], which have
been proven effective in many real-world applications.
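For context (this is the standard kernel-method setting, stated in our own notation rather than taken from this paper), such techniques operate in a feature space induced by a nonlinear map $\phi(\cdot)$ that is never computed explicitly; only inner products are needed, through a kernel function $k(x_i, x_j) = \langle \phi(x_i), \phi(x_j) \rangle$. A learned projection is therefore expressed as an expansion over the $n$ training samples, so embedding a new sample $x$ requires evaluating the kernel against every training sample,
$$ y = \sum_{i=1}^{n} \alpha_i \, k(x, x_i), $$
which is the source of the extra computational burden for out-of-sample projection noted in the Abstract.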
Another class comprises manifold-based techniques, which find
the intrinsic low-dimensional nonlinear structure of data hidden
in the high-dimensional observation space. Isometric feature
mapping (ISOMAP) [6], Locally Linear Embedding (LLE) [7],
Laplacian Eigenmaps [8], and Local Tangent Space Alignment
(LTSA) [9] are classical manifold-based techniques. Experiments
on artificial data have validated their superior performance in
discovering the essential low-dimensional structure of
high-dimensional manifolds.
However, a limitation of the above-mentioned manifold-based
nonlinear DR techniques is that an explicit mapping function
does not exist, i.e., the so-called out-of-sample extension
problem. Many researchers have devised significant schemes
to solve this problem. Linearized versions of nonlinear manifold
techniques have been extensively explored, and notable
achievements have been obtained [10], [11]. Locality preserving
projection (LPP) [12] is a linear approximation of the
nonlinear Laplacian Eigenmaps [8]. By solving a generalized
eigenvalue problem, LPP obtains a projection matrix that can
be used to map high-dimensional data to a low-dimensional
subspace. LPP discovers the nonlinear characteristics of data
by preserving the local structure, which has led to its successful
application in face recognition [13].
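As a brief sketch (following the standard formulation of [12]; the notation below is ours, not taken from this paper), let $X = [x_1, \ldots, x_n]$ denote the data matrix, $W$ the affinity matrix of the neighborhood graph, $D$ the diagonal matrix with $D_{ii} = \sum_j W_{ij}$, and $L = D - W$ the graph Laplacian. LPP seeks a projection vector $a$ by minimizing
$$ \sum_{i,j} \left(a^{T} x_i - a^{T} x_j\right)^{2} W_{ij} \quad \text{subject to} \quad a^{T} X D X^{T} a = 1, $$
which reduces to the generalized eigenvalue problem
$$ X L X^{T} a = \lambda X D X^{T} a. $$
The eigenvectors associated with the smallest eigenvalues form the columns of the projection matrix, so a new sample can be embedded by a single matrix-vector product.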
However, the objective function of LPP emphasizes only the
local structure and ignores the global structure [14], [15].
Moreover, classification has not been stressed in LPP.
Discriminant locality preserving projection (DLPP) [16]
encodes discriminant information into the objective of locality
preserving projection and can obtain better classification
performance than LPP.
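To make this concrete, one commonly used form of the DLPP criterion (written here in our own notation as an illustrative sketch; the exact weighting conventions of [16] may differ) maximizes the locality-weighted scatter between class means while minimizing the locality-weighted scatter within each class:
$$ \max_{w} \; \frac{\sum_{i,j} \left(w^{T} m_i - w^{T} m_j\right)^{2} B_{ij}}{\sum_{c} \sum_{i,j} \left(w^{T} x_i^{(c)} - w^{T} x_j^{(c)}\right)^{2} W^{(c)}_{ij}}, $$
where $x_i^{(c)}$ is the $i$th sample of class $c$, $m_i$ is the mean of the $i$th class, $W^{(c)}$ is an affinity matrix defined within class $c$, and $B$ is an affinity matrix defined on the class means.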
Constrained Graph Embedding (CGE) [17] partly utilizes the
label information of the data to enhance the discriminant ability
of graph-embedding-based dimensionality reduction techniques.
To enhance the nonlinear description ability of LPP, Cheng et al.
proposed supervised kernel locality preserving projections
(SKLPP) [18] by optimizing the objective function of locality