PRTools Contents

PRTools User Guide



Trainable decision tree classifier



Computation of a decision tree classifier from a dataset A using a binary splitting criterion CRIT:

INFCRIT  - information gain
MAXCRIT  - purity (default)
FISHCRIT - Fisher criterion
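The criteria above can be selected by name. A minimal usage sketch follows; the call syntax `W = treec(A,CRIT)` is assumed from the parameter names in this page, and `gendatb` and `testc` are standard PRTools routines:

```matlab
% Train decision trees with each of the three splitting criteria.
% Assumes the PRTools calling convention w = treec(a,crit).
a  = gendatb([50 50]);       % banana-shaped two-class dataset
w1 = treec(a,'infcrit');     % split on information gain
w2 = treec(a,'maxcrit');     % split on purity (the default)
w3 = treec(a,'fishcrit');    % split on the Fisher criterion
disp(testc(a*w1));           % apparent error of the infcrit tree
```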

Pruning is controlled by PRUNE:

PRUNE = -1  pessimistic pruning as defined by Quinlan.
PRUNE = -2  test-set pruning using the dataset T or, if T is not supplied, an artificially generated test set of 5 x the size of the training set, based on Parzen density estimates; see PARZENML and GENDATP.
PRUNE =  0  no pruning (default).
PRUNE >  0  early pruning, e.g. PRUNE = 3; PRUNE = 10 causes heavy pruning.

If CRIT or PRUNE is set to NaN, it is optimised by REGOPTC.
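A sketch of the pruning options and the automatic optimisation, again assuming the calling order `W = treec(A,CRIT,PRUNE,T)` (not printed on this page) and the standard PRTools routines `gendatb` and `gendat`:

```matlab
% Train and prune; the call syntax treec(a,crit,prune,t) is assumed.
[a,t] = gendat(gendatb([100 100]),0.5);  % split into train and test
w_pess  = treec(a,'maxcrit',-1);     % pessimistic pruning (Quinlan)
w_test  = treec(a,'maxcrit',-2,t);   % test-set pruning on T
w_early = treec(a,'maxcrit',3);      % early pruning
w_opt   = treec(a,'maxcrit',NaN);    % PRUNE optimised by REGOPTC
```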


References

[1] L. Breiman, J.H. Friedman, R.A. Olshen, and C.J. Stone, Classification and Regression Trees, Wadsworth, California, 1984.

See also

datasets, mappings, tree_map, regoptc

