Least Squares SVM
Least Squares Support Vector Machines (LS-SVMs) are a modification of the classical Support Vector Machine; see Suykens et al. for a complete background.
LSSVM Regression
In the case of LSSVM regression, one solves (by applying the KKT conditions) the following constrained optimization problem.

\min_{w, b, e} \ \mathcal{J}(w, e) = \frac{1}{2}w^\intercal w + \frac{\gamma}{2}\sum_{k=1}^{N} e_k^2

subject to

y_k = w^\intercal\varphi(x_k) + b + e_k, \ \ k = 1, \cdots, N

where \gamma > 0 is the regularization parameter.
This leads to a predictive model of the form

\hat{y}(x) = \sum_{k=1}^{N}\alpha_k K(x, x_k) + b
where the values \alpha and b are the solution of the linear system

\begin{bmatrix} 0 & \mathbf{1}^\intercal \\ \mathbf{1} & K + I/\gamma \end{bmatrix} \begin{bmatrix} b \\ \alpha \end{bmatrix} = \begin{bmatrix} 0 \\ y \end{bmatrix}

with \mathbf{1} = (1, \cdots, 1)^\intercal and y = (y_1, \cdots, y_N)^\intercal.
Here K is the N \times N kernel matrix whose entries are given by K_{kl} = \varphi(x_k)^\intercal\varphi(x_l), \ \ k,l = 1, \cdots, N and I is the identity matrix of order N.
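To make the solution step concrete, here is a minimal Breeze sketch that assembles and solves this (N + 1) \times (N + 1) system directly. It only illustrates the equations above and is not DynaML's internal implementation; the helper name solveLSSVMRegression and its kernel argument are hypothetical.

import breeze.linalg._

// Illustration only: solve the LSSVM regression system
// [0, 1^T; 1, K + I/gamma] [b; alpha] = [0; y] with a dense solver.
// The function name and its arguments are hypothetical.
def solveLSSVMRegression(
  xs: IndexedSeq[DenseVector[Double]],
  y: DenseVector[Double],
  kernel: (DenseVector[Double], DenseVector[Double]) => Double,
  gamma: Double): (Double, DenseVector[Double]) = {

  val n = xs.length

  // Kernel (Gram) matrix: entry (k, l) is kernel(x_k, x_l)
  val gram = DenseMatrix.tabulate[Double](n, n)((k, l) => kernel(xs(k), xs(l)))

  // Assemble the (N + 1) x (N + 1) block matrix of the linear system
  val a = DenseMatrix.tabulate[Double](n + 1, n + 1) { (i, j) =>
    if (i == 0 && j == 0) 0.0
    else if (i == 0 || j == 0) 1.0
    else gram(i - 1, j - 1) + (if (i == j) 1.0 / gamma else 0.0)
  }

  // Right hand side [0; y]
  val rhs = DenseVector.vertcat(DenseVector(0.0), y)

  // Dense linear solve; returns (b, alpha)
  val sol = a \ rhs
  (sol(0), sol(1 to n))
}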
LSSVM Classification
In the case of LSSVM binary classification, one solves (by applying the KKT conditions) the following constrained optimization problem.

\min_{w, b, e} \ \mathcal{J}(w, e) = \frac{1}{2}w^\intercal w + \frac{\gamma}{2}\sum_{k=1}^{N} e_k^2

subject to

y_k \left[ w^\intercal\varphi(x_k) + b \right] = 1 - e_k, \ \ k = 1, \cdots, N
This leads to a classifier of the form

\hat{y}(x) = \text{sign}\left[ \sum_{k=1}^{N}\alpha_k y_k K(x, x_k) + b \right]
where the values \alpha and b are the solution of the linear system

\begin{bmatrix} 0 & y^\intercal \\ y & \Omega + I/\gamma \end{bmatrix} \begin{bmatrix} b \\ \alpha \end{bmatrix} = \begin{bmatrix} 0 \\ \mathbf{1} \end{bmatrix}

with y = (y_1, \cdots, y_N)^\intercal and \mathbf{1} = (1, \cdots, 1)^\intercal.
Here \Omega is the N \times N matrix whose entries are given by

\Omega_{kl} = y_k y_l \, \varphi(x_k)^\intercal\varphi(x_l) = y_k y_l \, K(x_k, x_l), \ \ k, l = 1, \cdots, N
and I is the identity matrix of order N.
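As a similar illustration (again not DynaML's own code), the resulting decision rule can be written as a small Scala function; the name lssvmDecision and its arguments are hypothetical, and the labels y_k are assumed to be encoded as -1.0 or +1.0. The snippet that follows then shows how an LSSVM model is built and trained in DynaML.

import breeze.linalg.DenseVector

// Illustration only: binary LSSVM decision rule, given a solution
// (alpha, b) of the system above. Names and arguments are hypothetical.
def lssvmDecision(
  x: DenseVector[Double],
  xs: IndexedSeq[DenseVector[Double]],
  y: DenseVector[Double],
  alpha: DenseVector[Double],
  b: Double,
  kernel: (DenseVector[Double], DenseVector[Double]) => Double): Double =
  math.signum(xs.indices.map(k => alpha(k) * y(k) * kernel(x, xs(k))).sum + b)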
import breeze.linalg.DenseVector
// DLSSVM, RBFKernel and VectorField are DynaML classes, assumed to be
// imported or already in scope (e.g. in the DynaML shell).

// Create the training data set
val data: Stream[(DenseVector[Double], Double)] = ...
val numPoints = data.length
val num_features = data.head._1.length
// Create an implicit vector field for the creation of the stationary
// radial basis function kernel
implicit val field = VectorField(num_features)
val kern = new RBFKernel(2.0)
//Create the model
val lssvmModel = new DLSSVM(data, numPoints, kern, modelTask = "regression")
//Set the regularization parameter and learn the model
lssvmModel.setRegParam(1.5).learn()
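Once learn() has run, the model can be queried on new points, and a classification model is presumably obtained by switching the task flag. Both the "classification" task string and the predict call below are assumptions based on the constructor and API shown above; check the DynaML documentation for the exact names and signatures.

// Presumed prediction call on a new input point (placeholder taken from the
// training data); the method name is an assumption, see the DynaML API docs.
val testPoint = data.head._1
val prediction = lssvmModel.predict(testPoint)

// Presumed binary classification variant (targets encoded as -1.0 / +1.0);
// the "classification" task string is an assumption based on the constructor above.
val lssvmClassifier = new DLSSVM(data, numPoints, kern, modelTask = "classification")
lssvmClassifier.setRegParam(1.5).learn()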