DocumentCode :
1405665
Title :
Voronoi networks and their probability of misclassification
Author :
Krishna, K. ; Thathachar, M.A.L. ; Ramakrishnan, K.R.
Volume :
11
Issue :
6
fYear :
2000
fDate :
11/1/2000
Firstpage :
1361
Lastpage :
1372
Abstract :
To reduce memory requirements and computation cost, many algorithms have been developed that perform nearest neighbor classification using only a small number of representative samples obtained from the training set. We call the classification model underlying all these algorithms Voronoi networks (Vnets). We analyze the generalization capabilities of these networks by bounding the generalization error. The class of problems that can be solved by Vnets is characterized by the extent to which the set of points on the decision boundaries fills the feature space. We show that Vnets asymptotically converge to the Bayes classifier with arbitrarily high probability, provided the number of representative samples grows more slowly than the square root of the number of training samples, and we also give the optimal growth rate of the number of representative samples. We repeat the analysis for decision tree (DT) classifiers and compare them with Vnets. The bias/variance dilemma and the curse of dimensionality with respect to Vnets and DTs are also discussed.
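The classification model the abstract describes can be sketched briefly: a Vnet stores a small set of labeled representative samples (prototypes) and assigns any input the label of its nearest prototype, so each prototype's Voronoi cell receives a single class label. The prototype positions and labels below are illustrative assumptions, not data from the paper.

```python
import math

def vnet_classify(x, prototypes, labels):
    """Assign x the label of its nearest prototype (the 1-NN rule
    over representative samples that defines a Voronoi network)."""
    nearest = min(range(len(prototypes)),
                  key=lambda i: math.dist(x, prototypes[i]))
    return labels[nearest]

# Hypothetical prototypes in a 2-D feature space, two per class.
prototypes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
labels     = ["A", "A", "B", "B"]

print(vnet_classify((0.2, 0.1), prototypes, labels))  # nearest prototype (0,0) -> "A"
print(vnet_classify((0.9, 0.8), prototypes, labels))  # nearest prototype (1,1) -> "B"
```

Because only the prototypes are stored rather than the full training set, memory and per-query cost scale with the number of representative samples, which is the quantity whose growth rate the paper bounds.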
Keywords :
generalisation (artificial intelligence); learning (artificial intelligence); neural nets; pattern classification; probability; statistical analysis; Voronoi networks; decision tree; generalization; neural networks; statistical learning theory; Classification tree analysis; Computational efficiency; Decision trees; Iterative algorithms; Nearest neighbor searches; Pattern recognition; Prototypes; Random access memory; Statistical learning; Testing
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.883447
Filename :
883447