Trainable classifier: Support Vector Machine, nu-algorithm
[W,J,NU] = NUSVC(A,KERNEL,NU)
Optimises a support vector classifier for the dataset A by quadratic programming. The difference from the standard SVC routine is the use and interpretation of the regularisation parameter NU, which is an upper bound on the expected classification error. By default NU is estimated by the leave-one-out error of the 1-NN rule. For NU = NaN an automatic optimisation is performed using REGOPTC.
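The role of NU can be sketched with scikit-learn's NuSVC, which implements the same nu-SVM formulation (this is an illustration, not PRTools itself): NU upper-bounds the fraction of training margin errors and lower-bounds the fraction of support vectors.

```python
# Conceptual sketch of the nu-parametrisation, using scikit-learn's
# NuSVC as a stand-in for PRTools' NUSVC (an assumption: both follow
# the standard nu-SVM formulation).
from sklearn.datasets import make_classification
from sklearn.svm import NuSVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

clf = NuSVC(nu=0.2, kernel='rbf', gamma='scale')
clf.fit(X, y)

# nu lower-bounds the fraction of objects that become support vectors.
frac_sv = clf.support_.size / X.shape[0]
# Training errors are at most the margin-error fraction, hence <= nu,
# so training accuracy is at least 1 - nu.
train_acc = clf.score(X, y)
print(frac_sv, train_acc)
```

Increasing NU trades training accuracy for a larger margin (and more support vectors); decreasing it enforces a smaller training error.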
If KERNEL = 0 it is assumed that A is already the (square) kernel matrix. In this case a kernel matrix B should also be supplied at evaluation time, by B*W or PRMAP(B,W).
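The precomputed-kernel workflow described above can be sketched in scikit-learn (again as a hedged stand-in for PRTools): train on a square kernel matrix between training objects, then evaluate with the rectangular kernel matrix between test and training objects, analogous to B*W or PRMAP(B,W).

```python
# Sketch of the KERNEL = 0 (precomputed kernel) workflow, illustrated
# with scikit-learn's NuSVC rather than PRTools.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import NuSVC

rng = np.random.default_rng(0)
Xtr = rng.normal(size=(60, 3))
ytr = (Xtr[:, 0] + Xtr[:, 1] > 0).astype(int)
Xte = rng.normal(size=(20, 3))

K_train = rbf_kernel(Xtr, Xtr)   # square kernel matrix (training x training)
K_test = rbf_kernel(Xte, Xtr)    # rectangular kernel matrix (test x training)

clf = NuSVC(nu=0.3, kernel='precomputed').fit(K_train, ytr)
pred = clf.predict(K_test)       # analogous to B*W / PRMAP(B,W)
print(pred.shape)
```

Note that the columns of the test kernel matrix must correspond to the training objects, in the same order as during training.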
There are several ways to define KERNEL, e.g. by PROXM(,'r',1) for a radial basis kernel or by USERKERNEL for a user-defined kernel.
NUSVC is basically a two-class classifier. Multi-class problems are solved in a one-against-rest fashion by MCLASSC. The resulting base classifiers are combined by the maximum-confidence rule. A better, non-linear combiner might be QDC, e.g. W = A*(NUSVC*QDC(,,1e-6))
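The one-against-rest scheme with a maximum-confidence combination can be sketched in scikit-learn (a hedged illustration, not PRTools): OneVsRestClassifier trains one binary NuSVC per class and predicts the class whose base classifier reports the largest decision value.

```python
# Sketch of one-against-rest nu-SVM classification with the
# maximum-confidence rule, using scikit-learn as a stand-in for
# PRTools' MCLASSC.
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import NuSVC

X, y = load_iris(return_X_y=True)

# One binary classifier per class; prediction takes the class with the
# highest decision value across the base classifiers.
ovr = OneVsRestClassifier(NuSVC(nu=0.1, kernel='rbf', gamma='scale'))
ovr.fit(X, y)
print(len(ovr.estimators_), ovr.score(X, y))
```

A trainable second-stage combiner (as QDC in the PRTools example) would instead be fitted on the base classifiers' outputs rather than taking the plain maximum.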