Research Article
Face Recognition Using Double Sparse Local Fisher
Discriminant Analysis
Zhan Wang,1,2 Qiuqi Ruan,1,2 and Gaoyun An1,2

1 Institute of Information Science, Beijing Jiaotong University, Beijing 100044, China
2 Beijing Key Laboratory of Advanced Information Science and Network Technology, Beijing 100044, China
Correspondence should be addressed to Zhan Wang; wz6308@hotmail.com
Received 17 October 2014; Revised 4 March 2015; Accepted 9 March 2015
Academic Editor: Zhan Shu
Copyright © 2015 Zhan Wang et al. This is an open access article distributed under the Creative Commons Attribution License,
which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Local Fisher discriminant analysis (LFDA) was proposed for dealing with the multimodal problem. It combines the idea of locality preserving projections (LPP), which preserves the local structure of high-dimensional data, with the idea of Fisher discriminant analysis (FDA), which provides discriminant power. However, like many dimensionality reduction methods, LFDA suffers from the undersampled problem. Moreover, its projection matrix is not sparse. In this paper, we propose double sparse local Fisher discriminant analysis (DSLFDA) for face recognition. The proposed method first constructs a sparse and data-adaptive graph with a nonnegative constraint. Then, DSLFDA reformulates the objective function as a regression-type optimization problem. The undersampled problem is avoided naturally, and a sparse solution can be obtained by adding an $\ell_{1}$ penalty to the regression-type problem. Experiments on the Yale, ORL, and CMU PIE face databases demonstrate the effectiveness of the proposed method.
1. Introduction
Dimensionality reduction tries to transform high-dimensional data into a lower-dimensional space while preserving the useful information as much as possible. It has a wide range of applications in pattern recognition, machine learning, and computer vision. A well-known approach for supervised dimensionality reduction is linear discriminant analysis (LDA) [1]. It tries to find a projection transformation by simultaneously maximizing the between-class distance and minimizing the within-class distance. In practical applications, LDA usually suffers from some limitations. First, LDA usually suffers from the undersampled problem [2]; that is, the dimension of the data is larger than the number of training samples. Second, LDA can only uncover the global Euclidean structure. Third, the solution of LDA is not sparse, which makes physical interpretation difficult.
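For reference, the LDA criterion referred to above can be written, in conventional notation (generic symbols, not necessarily those defined later in this paper), as
\[
W^{\ast} = \arg\max_{W} \operatorname{tr}\!\left( \left(W^{T} S_{w} W\right)^{-1} W^{T} S_{b} W \right),
\]
where $S_{b}$ and $S_{w}$ denote the between-class and within-class scatter matrices. For $N$ training samples from $c$ classes in a $d$-dimensional space, $\operatorname{rank}(S_{w}) \le N - c$, so $S_{w}$ is singular whenever $d > N - c$ and the inverse above does not exist; this is exactly the undersampled (small sample size) problem.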
To deal with the first problem, many methods have been proposed. Belhumeur et al. [3] proposed a two-stage principal component analysis (PCA) [4] + LDA method, which utilizes PCA to reduce dimensionality so as to make the within-class scatter matrix nonsingular, followed by LDA for recognition. However, some useful information may be compromised in the PCA stage.
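Purely as an illustration of this two-stage idea (this is not code from the paper; the data sizes and the scikit-learn tooling are assumptions made for the sketch), the PCA + LDA pipeline can be prototyped as follows.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in for face data: 30 samples, 1024-dimensional features,
# 10 subjects (illustrative sizes only, not the Yale/ORL/PIE settings).
rng = np.random.RandomState(0)
X = rng.randn(30, 1024)
y = np.repeat(np.arange(10), 3)

n_samples = X.shape[0]
n_classes = np.unique(y).size

# Stage 1: PCA down to n_samples - n_classes dimensions so that the
# within-class scatter matrix is (generically) nonsingular afterwards.
pca = PCA(n_components=n_samples - n_classes)
X_pca = pca.fit_transform(X)

# Stage 2: ordinary LDA on the PCA coefficients; at most n_classes - 1
# discriminant directions are available.
lda = LinearDiscriminantAnalysis(n_components=n_classes - 1)
X_lda = lda.fit_transform(X_pca, y)
print(X_lda.shape)  # (30, 9)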
Chen et al. [5] extracted the most discriminant information from the null space of the within-class scatter matrix. However, the discriminant information in the non-null space of the within-class scatter matrix would be discarded.
Huang et al. [6] proposed an efficient null-space approach, which first removes the null space of the total scatter matrix. This method is based on the observation that the null space of the total scatter matrix is the intersection of the null space of the between-class scatter matrix and the null space of the within-class scatter matrix.
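In standard scatter-matrix notation (stated here only to make the reasoning explicit), this observation follows from the decomposition $S_{t} = S_{b} + S_{w}$: because $S_{b}$ and $S_{w}$ are positive semidefinite, $x^{T} S_{t} x = 0$ holds exactly when $x^{T} S_{b} x = 0$ and $x^{T} S_{w} x = 0$, and therefore
\[
\mathcal{N}(S_{t}) = \mathcal{N}(S_{b}) \cap \mathcal{N}(S_{w}).
\]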
Qin et al. [7] proposed a generalized null space uncorrelated Fisher discriminant analysis technique that integrates the uncorrelated discriminant analysis and weighted pairwise Fisher criterion for solving the undersampled problem. Yu
and Yang [8] proposed direct LDA (DLDA) to overcome the
undersampled problem. It removes the null space of the between-
class scatter matrix and extracts the discriminant information
that corresponds to the smallest eigenvalues of the within-
class scatter matrix. Zhang et al. [9] proposed an exponential
discriminant analysis (EDA) method to extract the most