Title :
Minimax-optimal classification with dyadic decision trees
Author :
Scott, Clayton; Nowak, Robert D.
Author_Institution :
Dept. of Stat., Rice Univ., Houston, TX, USA
Date :
April 2006
Abstract :
Decision trees are among the most popular types of classifiers, with interpretability and ease of implementation being among their chief attributes. Despite the widespread use of decision trees, theoretical analysis of their performance has only begun to emerge in recent years. In this paper, it is shown that a new family of decision trees, dyadic decision trees (DDTs), attains nearly optimal (in a minimax sense) rates of convergence for a broad range of classification problems. Furthermore, DDTs are surprisingly adaptive in three important respects: they automatically 1) adapt to favorable conditions near the Bayes decision boundary; 2) focus on data distributed on lower dimensional manifolds; and 3) reject irrelevant features. DDTs are constructed by penalized empirical risk minimization using a new data-dependent penalty and may be computed exactly with computational complexity that is nearly linear in the training sample size. DDTs are the first classifiers known to achieve nearly optimal rates for the diverse class of distributions studied here while also being practical and implementable. This is also the first study (of which we are aware) to consider rates for adaptation to intrinsic data dimension and relevant features.
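Note :
The abstract describes the construction only at a high level: penalized empirical risk minimization over recursive dyadic partitions, solvable exactly by dynamic programming. The sketch below is an illustrative toy version of that idea, not the authors' algorithm; it uses a simple constant per-leaf penalty in place of the paper's data-dependent, spatially adaptive penalty, cycles the split coordinate with depth, and prunes bottom-up. All names (grow_prune, predict, penalty, max_depth) and parameter values are assumptions made for the example.

```python
import numpy as np

def grow_prune(X, y, max_depth=6, penalty=0.01):
    """Toy dyadic decision tree on [0,1]^d via penalized ERM.

    Each cell is either a majority-vote leaf or is split at its midpoint
    along one coordinate (cycled by depth). A split is kept only if it
    lowers (empirical error) + penalty * (number of leaves); the pruning
    is exact because subtrees are optimized bottom-up.
    Simplification: constant per-leaf penalty, not the paper's
    data-dependent penalty.
    """
    n, d = X.shape

    def build(idx, depth, lo, hi):
        # Majority-vote leaf for the current cell.
        ones = int(y[idx].sum()) if idx.size else 0
        label = 1 if 2 * ones >= idx.size else 0
        leaf_err = min(ones, idx.size - ones) / n  # contribution to empirical error
        leaf = {"leaf": True, "label": label, "cost": leaf_err + penalty, "leaves": 1}
        if depth == max_depth or idx.size == 0:
            return leaf
        # Dyadic midpoint split along the coordinate cycled by depth.
        axis = depth % d
        mid = (lo[axis] + hi[axis]) / 2.0
        left_mask = X[idx, axis] < mid
        hi_l = hi.copy(); hi_l[axis] = mid   # left child cell:  [lo, hi_l]
        lo_r = lo.copy(); lo_r[axis] = mid   # right child cell: [lo_r, hi]
        left = build(idx[left_mask], depth + 1, lo, hi_l)
        right = build(idx[~left_mask], depth + 1, lo_r, hi)
        split_cost = left["cost"] + right["cost"]
        if split_cost < leaf["cost"]:
            return {"leaf": False, "axis": axis, "mid": mid,
                    "left": left, "right": right,
                    "cost": split_cost, "leaves": left["leaves"] + right["leaves"]}
        return leaf

    return build(np.arange(n), 0, np.zeros(d), np.ones(d))

def predict(tree, x):
    # Descend the pruned dyadic partition to the leaf containing x.
    while not tree["leaf"]:
        tree = tree["left"] if x[tree["axis"]] < tree["mid"] else tree["right"]
    return tree["label"]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((500, 2))
    # Synthetic labels with a curved Bayes decision boundary.
    y = (X[:, 0] + 0.3 * np.sin(6 * X[:, 0]) > X[:, 1]).astype(int)
    tree = grow_prune(X, y, max_depth=8, penalty=0.002)
    preds = np.array([predict(tree, x) for x in X])
    print(f"leaves={tree['leaves']}, training accuracy={np.mean(preds == y):.3f}")
```

The per-leaf penalty trades off empirical error against partition complexity; larger values of penalty yield coarser partitions, which is the same tradeoff the paper's data-dependent penalty controls in a more refined, spatially adaptive way.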
Keywords :
Bayes methods; computational complexity; decision trees; image classification; image sampling; minimax techniques; Bayes decision boundary; DDT; convergence rate; data-dependent penalty; dyadic decision tree; minimax-optimal classification; training sample; Classification tree analysis; Convergence; Noise level; Partitioning algorithms; Performance analysis; Risk management; Statistical learning; Complexity regularization; feature rejection; generalization error bounds; manifold learning; minimax optimality; pruning; rates of convergence; recursive dyadic partitions; statistical learning theory
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2006.871056