Trainable classifier: Support Vector Machine
[W,J] = SVC(A,KERNEL,C)
Optimises a support vector classifier for the dataset A by quadratic programming. The non-linearity is determined by the kernel. If KERNEL = 0 it is assumed that A is already the (square) kernel matrix. In this case a kernel matrix B should also be supplied at evaluation time, by B*W or PRMAP(B,W).
KERNEL can be defined in several ways, e.g. by PROXM('r',1) for a radial basis kernel or by USERKERNEL for a user-defined kernel.
If C is NaN, this regularisation parameter is optimised by REGOPTC.
SVC is basically a two-class classifier. Multi-class problems are solved in a one-against-rest fashion by MCLASSC. The resulting base classifiers are combined by the maximum confidence rule. A better, non-linear combiner might be FISHERCC, e.g. W = A*(SVC*FISHERCC).
See SVCINFO for more possibilities.
a = gendatb;              % generate banana classes
w = a*svc(proxm('r',1)); % train an SVC with a radial basis kernel
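The KERNEL = 0 route described above can be sketched along the same lines. This is a hedged example: the PROXM and LABELD calls are assumptions based on the standard toolbox usage, not taken from this text.

```matlab
a  = gendatb;               % training set
k  = a*proxm(a,'r',1);      % square kernel matrix of A with itself
w  = svc(k,0);              % KERNEL = 0: train directly on the kernel matrix
b  = gendatb;               % test set
kb = b*proxm(a,'r',1);      % kernel matrix of B against the training set A
lab = labeld(kb*w);         % evaluate via B*W, as described above
```

Note that at evaluation time the kernel matrix KB must be computed against the same training objects A that were used to build K.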