DocumentCode :
2258841
Title :
The K-Winner Machine model
Author :
Ridella, Sandro ; Rovetta, Stefano ; Zunino, Rodolfo
Author_Institution :
Dept. of Biophys. & Electron. Eng., Genoa Univ., Italy
Volume :
1
fYear :
2000
fDate :
2000
Firstpage :
106
Abstract :
A K-Winner Machine (KWM) selects, among a family of classifiers, the specific configuration that minimizes the expected generalization error. In training, KWM uses unsupervised vector quantization and subsequent calibration to label data-space partitions. At run time, KWM seeks the largest set of best-matching prototypes agreeing on a test sample, and provides a local-level measure of confidence. The VC-dim of a KWM classifier is worked out exactly; the resulting small values set tight bounds on generalization performance. The network can be applied to high-dimensional, multi-class problems with large data sets. Experimental results in both a synthetic and a real domain (NIST handwritten numerals) validate the consistency of the theoretical framework.
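The two phases described in the abstract (calibration of vector-quantization prototypes, then a run-time agreement search over the best-matching prototypes) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the prototype positions are assumed to come from a separate, unspecified VQ training step, and the helper names `train_kwm` and `kwm_classify` are hypothetical.

```python
import numpy as np

def train_kwm(X, y, prototypes):
    """Calibration step (sketch): label each VQ prototype by majority
    vote of the training samples falling in its Voronoi cell.
    Assumes integer class labels; VQ training itself is done elsewhere."""
    # Distance from every sample to every prototype.
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    labels = np.empty(len(prototypes), dtype=y.dtype)
    for j in range(len(prototypes)):
        cell = y[nearest == j]
        # Majority vote inside the cell; -1 marks an empty cell.
        labels[j] = np.bincount(cell).argmax() if cell.size else -1
    return labels

def kwm_classify(x, prototypes, labels):
    """Run-time step (sketch): rank prototypes by distance to x and
    return the shared label of the largest prefix of best-matching
    prototypes that agree, plus its size K as a confidence measure."""
    order = np.argsort(np.linalg.norm(prototypes - x, axis=1))
    winner = labels[order[0]]
    K = 1
    while K < len(order) and labels[order[K]] == winner:
        K += 1
    return winner, K
```

A larger K means more of the nearest prototypes agree on the decision, which is the local-level confidence the abstract refers to.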
Keywords :
generalisation (artificial intelligence); pattern classification; unsupervised learning; vector quantisation; K-Winner Machine model; NIST handwritten numerals; VC-dim; best-matching prototypes; classifiers; data-space partitions; generalization error; high-dimensional multi-class problems; local-level confidence measure; unsupervised vector quantization; Calibration; Context modeling; Electronic mail; NIST; Partitioning algorithms; Prototypes; Risk management; Runtime; Testing; Yield estimation;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como
ISSN :
1098-7576
Print_ISBN :
0-7695-0619-4
Type :
conf
DOI :
10.1109/IJCNN.2000.857822
Filename :
857822