PRTools Contents

PRTools User Guide



Trainable classifier: Support Vector Machine, nu-algorithm

    [W,J,NU] = A*NUSVC([],KERNEL,NU)

 A      Dataset
 KERNEL Untrained mapping to compute the kernel by A*(A*KERNEL) during training, or B*(A*KERNEL) during testing with dataset B.
        String to compute kernel matrices by FEVAL(KERNEL,B,A). Default: linear kernel (PROXM([],'p',1)).
 NU     Regularisation parameter (0 < NU < 1): expected fraction of support vectors (optional; default: max(leave-one-out 1-NN error, 0.01))

 W      Mapping: Support Vector Classifier
 J      Object indices of support objects
 NU     Actual NU value used


Optimises a support vector classifier for the dataset A by quadratic programming. The difference with the standard SVC routine is the use and interpretation of the regularisation parameter NU, which is an upper bound on the expected classification error. By default NU is estimated by the leave-one-out error of the 1-NN rule. For NU = NaN an automatic optimisation is performed using REGOPTC.
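A minimal training sketch (assuming PRTools is on the path; GENDATB and TESTC are standard PRTools routines, and the chosen NU value of 0.1 is only illustrative):

```matlab
% Train a nu-SVC with a radial basis kernel and nu = 0.1
a = gendatb([50 50]);                      % two-class banana dataset
[w,j,nu] = a*nusvc([],proxm([],'r',1),0.1);
e = a*w*testc;                             % apply classifier, estimate error
length(j)                                  % number of support objects
```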

If KERNEL = 0 it is assumed that A is already the (square) kernel matrix. In this case a kernel matrix B should also be supplied at evaluation by B*W or PRMAP(B,W).
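The precomputed-kernel interface can be sketched as follows (a hypothetical session; A is a training dataset and B a test dataset, and the radial basis kernel is an arbitrary choice):

```matlab
u  = proxm([],'r',1);        % untrained radial basis kernel mapping
wk = a*u;                    % kernel mapping trained on dataset A
K  = a*wk;                   % square kernel matrix of A, i.e. A*(A*KERNEL)
w  = K*nusvc([],0,0.1);      % KERNEL = 0: K is taken as the kernel matrix
Kb = b*wk;                   % kernel matrix of test set B versus A
d  = Kb*w;                   % evaluate, equivalent to PRMAP(Kb,w)
```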

There are several ways to define KERNEL, e.g. PROXM([],'r',1) for a radial basis kernel, or USERKERNEL for a user-defined kernel.

NUSVC is basically a two-class classifier. Multi-class problems are solved in a one-against-rest fashion by MCLASSC. The resulting base classifiers are combined by the maximum-confidence rule. A better, non-linear combiner might be QDC, e.g. W = A*(NUSVC*QDC([],[],1e-6)).
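The combiner suggestion above can be sketched on a multi-class problem (GENDATM is the standard PRTools multi-class example generator; the comparison is only illustrative):

```matlab
a  = gendatm;                      % multi-class PRTools example dataset
w1 = a*nusvc;                      % one-against-rest, maximum-confidence rule
w2 = a*(nusvc*qdc([],[],1e-6));    % non-linear QDC combiner on the outputs
[a*w1*testc a*w2*testc]            % compare resubstitution errors
```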

See also

mappings, datasets, svc, nusvo, proxm, userkernel, regoptc, mclassc, qdc


This file has been automatically generated. If it is badly readable, use the help command in MATLAB.