Title :
Supervised and unsupervised learning in radial basis function classifiers
Author :
Tarassenko, L. ; Roberts, S.
Author_Institution :
Dept. of Eng. Sci., Oxford Univ., UK
Date :
8/1/1994
Abstract :
The paper considers a number of strategies for training radial basis function (RBF) classifiers. A benchmark problem is constructed using ten-dimensional input patterns which have to be classified into one of three classes. The RBF networks are trained using a two-phase approach (unsupervised clustering for the first layer followed by supervised learning for the second layer), error backpropagation (supervised learning for both layers) and a hybrid approach. It is shown that RBF classifiers trained with error backpropagation give results almost identical to those obtained with a multilayer perceptron. Although networks trained with the two-phase approach give slightly worse classification results, it is argued that the hidden-layer representation of such networks is much more powerful, especially if it is encoded in the form of a Gaussian mixture model. During training, the number of subclusters present within the training database can be estimated; during testing, the activities in the hidden layer of the classification network can be used to assess the novelty of input patterns and thereby help to validate the network outputs.
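The two-phase training strategy and the hidden-layer novelty check described in the abstract can be illustrated with a short sketch. This is not the authors' implementation: k-means clustering and a linear least-squares fit are used here as one common instance of "unsupervised clustering for the first layer followed by supervised learning for the second layer"; the basis-function width heuristic, the number of hidden units and the novelty score are illustrative assumptions, and only the ten-dimensional inputs and three classes follow the benchmark described above.

    # Minimal sketch (assumptions noted above), not the paper's code.
    import numpy as np

    def kmeans(X, k, iters=50, seed=0):
        """Plain k-means; the resulting cluster centres serve as RBF centres."""
        rng = np.random.default_rng(seed)
        centres = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centres[j] = X[labels == j].mean(axis=0)
        return centres

    def rbf_hidden(X, centres, width):
        """Gaussian hidden-layer activations."""
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * width ** 2))

    # Phase 1: unsupervised clustering fixes the first-layer centres.
    rng = np.random.default_rng(1)
    X_train = rng.normal(size=(300, 10))        # ten-dimensional input patterns
    y_train = rng.integers(0, 3, size=300)      # one of three classes
    n_hidden = 12                               # assumed number of hidden units
    centres = kmeans(X_train, n_hidden)
    width = np.mean(np.linalg.norm(X_train - X_train.mean(0), axis=1))  # heuristic width

    # Phase 2: supervised learning of the second-layer (output) weights.
    H = rbf_hidden(X_train, centres, width)
    T = np.eye(3)[y_train]                      # one-hot class targets
    W = np.linalg.lstsq(H, T, rcond=None)[0]    # linear output weights

    # Testing: classification plus a simple hidden-layer novelty check.
    x_new = rng.normal(size=(1, 10))
    h = rbf_hidden(x_new, centres, width)
    predicted_class = int((h @ W).argmax())
    novelty = 1.0 - h.max()   # low hidden activity: pattern far from all centres
    print(predicted_class, round(float(novelty), 3))

A pattern lying far from every hidden-layer centre produces uniformly low activations, so the novelty score approaches 1 and the corresponding class decision can be flagged as unreliable, which is the sense in which the hidden-layer representation helps to validate network outputs.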
Keywords :
backpropagation; feedforward neural nets; pattern recognition; unsupervised learning; Gaussian mixture model; RBF networks; classification network; error backpropagation; hidden-layer representation; hybrid approach; multilayer perceptron; network outputs; radial basis function classifiers; supervised learning; ten-dimensional input patterns; training; training database; unsupervised clustering;
Journal_Title :
IEE Proceedings - Vision, Image and Signal Processing
DOI :
10.1049/ip-vis:19941324