DocumentCode :
1133899
Title :
A self-organizing HCMAC neural-network classifier
Author :
Lee, Hahn-Ming ; Chen, Chih-Ming ; Lu, Yung-Feng
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Nat. Taiwan Univ. of Sci. & Technol., Taiwan
Volume :
14
Issue :
1
fYear :
2003
fDate :
1/1/2003
Firstpage :
15
Lastpage :
27
Abstract :
This paper presents a self-organizing hierarchical cerebellar model arithmetic computer (HCMAC) neural-network classifier, which contains a self-organizing input space module and an HCMAC neural network. The conventional CMAC can be viewed as a basis function network (BFN) with supervised learning, and it performs well in terms of fast learning speed and local generalization capability when approximating nonlinear functions. However, the conventional CMAC has an enormous memory requirement for resolving high-dimensional classification problems, and its performance depends heavily on the approach used for input space quantization. To solve these problems, this paper presents a novel supervised HCMAC neural network capable of resolving high-dimensional classification problems well. In addition, to avoid the trial-and-error parameter search usually needed to construct the memory allocation, a self-organizing input space module is proposed that uses Shannon's entropy measure and the golden-section search method to determine the input space quantization appropriately according to the distributions of the training data sets. Experimental results indicate that the self-organizing HCMAC has fast learning and a low memory requirement, and that it outperforms the conventional CMAC on high-dimensional classification problems. Furthermore, the self-organizing HCMAC classifier achieves better classification accuracy than the other classifiers compared.
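As an illustrative aside, the sketch below shows the general idea of combining a Shannon-entropy criterion with a golden-section search to place a quantization boundary on one input dimension. It is a minimal sketch under stated assumptions only: the objective (weighted class entropy of the two cells induced by a cut point) and the helper names shannon_entropy, split_entropy, and golden_section_search are hypothetical and are not taken from the paper's actual HCMAC construction.

import math
from collections import Counter

def shannon_entropy(labels):
    # Shannon entropy (in bits) of a collection of class labels.
    n = len(labels)
    if n == 0:
        return 0.0
    counts = Counter(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def split_entropy(cut, xs, ys):
    # Weighted class entropy of the two partitions induced by `cut`
    # on one input dimension; lower means a purer quantization boundary.
    left = [y for x, y in zip(xs, ys) if x <= cut]
    right = [y for x, y in zip(xs, ys) if x > cut]
    n = len(ys)
    return (len(left) / n) * shannon_entropy(left) + (len(right) / n) * shannon_entropy(right)

def golden_section_search(f, lo, hi, tol=1e-4):
    # Locate a minimizer of a (assumed unimodal) function f on [lo, hi].
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi, roughly 0.618
    a, b = lo, hi
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b = d
        else:
            a = c
        c = b - inv_phi * (b - a)
        d = a + inv_phi * (b - a)
    return (a + b) / 2

# Toy usage: pick a cut point on one feature so that the two resulting
# cells are as class-pure (low entropy) as possible.
xs = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
ys = [0, 0, 0, 1, 1, 1]
cut = golden_section_search(lambda c: split_entropy(c, xs, ys), min(xs), max(xs))
print(f"selected quantization boundary is about {cut:.3f}")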
Keywords :
cerebellar model arithmetic computers; generalisation (artificial intelligence); learning (artificial intelligence); pattern classification; performance evaluation; radial basis function networks; search problems; self-organising feature maps; storage allocation; Shannon entropy; basis function network; experimental results; generalization; golden-section search method; high-dimensional classification; input space quantization; learning; memory allocation; memory requirement; neural-network classifier; nonlinear function approximation; performance; self-organizing HCMAC; self-organizing hierarchical cerebellar model arithmetic computer; self-organizing input space module; supervised learning; Digital arithmetic; Entropy; Kernel; Least squares methods; Neural networks; Pattern recognition; Quantization; Search methods; Signal processing algorithms; Training data;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2002.806607
Filename :
1176123