LS-SVMlab: a MATLAB/C toolbox for Least
Squares Support Vector Machines
Kristiaan Pelckmans, Johan A.K. Suykens,
T. Van Gestel, J. De Brabanter, L. Lukas, B. Hamers, B. De Moor and J. Vandewalle
ESAT-SCD-SISTA, K.U. Leuven
Kasteelpark Arenberg 10
B-3001 Leuven-Heverlee, Belgium
{kristiaan.pelckmans,johan.suykens}@esat.kuleuven.ac.be
Abstract
In this paper, a toolbox LS-SVMlab for Matlab with implementations of
a number of LS-SVM related algorithms is presented. The core of the
toolbox is a performant LS-SVM training and simulation environment
written in C code. The functionality for classification, function
approximation and unsupervised learning problems, as well as time-series
prediction, is explained. Extensions of LS-SVMs towards robustness,
sparseness and weighted versions, as well as different techniques for tuning of
hyper-parameters, are included. An implementation of a Bayesian frame-
work is made, allowing probabilistic interpretations, automatic hyper-
parameter tuning and input selection. The toolbox also contains algo-
rithms for fixed size LS-SVMs, which are suitable for handling large data
sets. A recent overview of developments in the theory and algorithms of
least squares support vector machines, to which this LS-SVMlab toolbox
is related, is presented in [1].
1 Introduction
Support Vector Machines (SVMs) [2, 3, 4, 5] are a powerful methodology for solving prob-
lems in nonlinear classification, function estimation and density estimation, and have also
led to many other recent developments in kernel-based methods in general. Originally, SVMs
were introduced within the context of statistical learning theory and structural risk min-
imization. These methods solve convex optimization problems, typically by quadratic
programming. Least Squares Support Vector Machines (LS-SVMs) are reformulations of
the standard SVMs [6, 7]. The cost function is a regularized least squares function with
equality constraints, leading to linear Karush-Kuhn-Tucker (KKT) systems. The solution can be
found efficiently by iterative methods such as the Conjugate Gradient (CG) algorithm [8].
LS-SVMs are closely related to regularization networks, Gaussian processes [9] and kernel
Fisher discriminant analysis [10], but additionally emphasize and exploit primal-dual in-
terpretations. Links with kernel versions of classical pattern recognition algorithms exist,
and extensions to recurrent networks and control [11] and robust modeling [12, 13] are
available.
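To make the linear KKT system concrete, the following is a minimal self-contained sketch of LS-SVM classification with a linear kernel: it builds the system [0, y^T; y, Omega + I/gamma] [b; alpha] = [0; 1] with Omega_ij = y_i y_j K(x_i, x_j) and solves it by dense Gaussian elimination. It is written in Python purely for illustration (the toolbox itself uses optimized C code and CG), and function names such as `lssvm_train` are our own, not part of LS-SVMlab:

```python
def solve_linear(A, rhs):
    """Gaussian elimination with partial pivoting on a small dense system."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lssvm_train(X, y, gamma=10.0):
    """Solve the LS-SVM KKT system [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1_N]."""
    N = len(X)
    # Linear kernel: K(x_i, x_j) = x_i^T x_j
    K = [[sum(u * v for u, v in zip(X[i], X[j])) for j in range(N)] for i in range(N)]
    A = [[0.0] + [y[j] for j in range(N)]]  # first row: [0, y^T]
    for i in range(N):
        A.append([y[i]] + [y[i] * y[j] * K[i][j] + (1.0 / gamma if i == j else 0.0)
                           for j in range(N)])
    sol = solve_linear(A, [0.0] + [1.0] * N)
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvm_predict(X, y, alpha, b, x):
    """Classifier: sign( sum_i alpha_i y_i K(x_i, x) + b )."""
    s = sum(alpha[i] * y[i] * sum(u * v for u, v in zip(X[i], x))
            for i in range(len(X)))
    return 1 if s + b >= 0 else -1

# Tiny separable 1-D example
X = [[-2.0], [-1.0], [1.0], [2.0]]
y = [-1.0, -1.0, 1.0, 1.0]
b, alpha = lssvm_train(X, y)
print(lssvm_predict(X, y, alpha, b, [-1.5]),
      lssvm_predict(X, y, alpha, b, [1.5]))  # expected: -1 1
```

Because of the equality constraints, every data point obtains a nonzero support value alpha_i; sparseness must be imposed afterwards (e.g. by pruning), as discussed for the toolbox's sparse extensions.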
available. A Bayesian evidence framework has been applied with three levels of inference
http://www.esat.kuleuven.ac.be/sista/lssvmlab