Title :
Presupervised and post-supervised prototype classifier design
Author :
Kuncheva, Ludmila I. ; Bezdek, James C.
Author_Institution :
Sch. of Math., Univ. of Wales, Bangor, UK
Date :
9/1/1999
Abstract :
We extend the nearest prototype classifier to a generalized nearest prototype classifier (GNPC). The GNPC uses "soft" labeling of the prototypes in the classes, thereby encompassing a variety of classifiers. Based on how the prototypes are found, we distinguish between presupervised and post-supervised GNPC designs. We derive the conditions for optimality of two designs in which the prototypes represent: 1) the components of the class-conditional mixture densities (presupervised design); or 2) the components of the unconditional mixture density (post-supervised design). An artificial data set and the "satimage" data set from the ELENA database are used to study the two approaches experimentally. A radial basis function network is used as a representative of each GNPC type. Neither the theoretical nor the experimental results indicate a clear reason to prefer one approach over the other. The post-supervised GNPC design tends to be more robust but less accurate than the presupervised one.
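The two designs described in the abstract can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the paper's implementation: the k-means prototype extraction, the Gaussian RBF similarity, the `gamma` value, and all variable names are assumptions chosen for the sketch. The key contrast is where clustering happens: per class with crisp labels (presupervised) versus on the pooled data with soft labels taken as per-cluster class proportions (post-supervised).

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=50):
    """Minimal k-means used here only to extract prototypes (illustrative)."""
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    assign = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        assign = d.argmin(axis=1)
        for j in range(k):
            if (assign == j).any():
                centers[j] = X[assign == j].mean(axis=0)
    return centers, assign

def gnpc_predict(X, prototypes, soft_labels, gamma=1.0):
    """GNPC decision: RBF similarity to each prototype, weighted by the
    prototype's soft class label; predict the class with the largest score."""
    d2 = ((X[:, None] - prototypes[None]) ** 2).sum(axis=2)   # (n, p)
    sim = np.exp(-gamma * d2)                                 # (n, p)
    scores = sim @ soft_labels                                # (n, c)
    return scores.argmax(axis=1)

# Toy two-class data (stand-in for the artificial data set in the paper).
X0 = rng.normal([0.0, 0.0], 0.5, (100, 2))
X1 = rng.normal([3.0, 3.0], 0.5, (100, 2))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(100, dtype=int), np.ones(100, dtype=int)]

# Presupervised design: cluster each class separately; prototypes carry
# crisp (one-hot) labels for their own class.
protos, labels = [], []
for c in (0, 1):
    cen, _ = kmeans(X[y == c], k=2)
    protos.append(cen)
    lab = np.zeros((2, 2))
    lab[:, c] = 1.0
    labels.append(lab)
pre_protos, pre_labels = np.vstack(protos), np.vstack(labels)

# Post-supervised design: cluster the pooled, unlabeled data; each
# prototype's soft label is the class proportion inside its cluster.
post_protos, assign = kmeans(X, k=4)
post_labels = np.zeros((4, 2))
for j in range(4):
    mask = assign == j
    if mask.any():
        post_labels[j] = np.bincount(y[mask], minlength=2) / mask.sum()

acc_pre = (gnpc_predict(X, pre_protos, pre_labels) == y).mean()
acc_post = (gnpc_predict(X, post_protos, post_labels) == y).mean()
print(f"presupervised accuracy:  {acc_pre:.2f}")
print(f"post-supervised accuracy: {acc_post:.2f}")
```

On well-separated classes like this toy set, both designs recover the class structure; the differences the paper studies only emerge when mixture components overlap or straddle class boundaries.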
Keywords :
learning (artificial intelligence); pattern classification; radial basis function networks; mixture modelling; post-supervised designs; presupervised designs; prototype classifier; prototype selection; radial basis function neural network; supervised learning; Databases; Error analysis; Fuzzy neural networks; Fuzzy systems; Labeling; Least squares methods; Nearest neighbor searches; Neural networks; Prototypes; Robustness
Journal_Title :
IEEE Transactions on Neural Networks