Title :
CMAC: an associative neural network alternative to backpropagation
Author :
Miller, W. Thomas, III ; Glanz, Filson H. ; Kraft, L. Gordon, III
Author_Institution :
Dept. of Electr. & Comput. Eng., New Hampshire Univ., Durham, NH, USA
Date :
10/1/1990
Abstract :
The CMAC (cerebellar model arithmetic computer) neural network, an alternative to backpropagated multilayer networks, is described. The following advantages of CMAC are discussed: local generalization, rapid algorithmic computation based on LMS (least-mean-square) training, incremental training, functional representation, output superposition, and a fast practical hardware realization. A geometrical explanation of how CMAC works is provided, and applications in robot control, pattern recognition, and signal processing are briefly described. Possible disadvantages of CMAC are that it does not generalize globally and that its output can contain noise due to hash coding. Care must be exercised (as with all neural networks) to ensure that a low-error solution is learned.
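The mechanisms the abstract lists can be illustrated with a minimal sketch. This is not the authors' implementation; it is a hypothetical single-input CMAC in which several offset tilings quantize the input (local generalization), each active cell is hash-coded into a fixed weight table, the output is the superposition of the active weights, and training follows the LMS rule. All parameter names and values here are illustrative assumptions.

```python
import hashlib
import math

class CMAC:
    """Minimal single-input CMAC sketch: offset tilings, hash-coded
    weight table, output superposition, LMS training."""

    def __init__(self, num_tilings=8, resolution=0.1, table_size=4096, lr=0.2):
        self.num_tilings = num_tilings
        self.resolution = resolution   # width of one quantization cell
        self.table_size = table_size   # size of the hashed weight table
        self.lr = lr                   # LMS learning rate
        self.weights = [0.0] * table_size

    def _active_cells(self, x):
        # Each tiling is shifted by a fraction of the cell width, so nearby
        # inputs share most of their active cells (local generalization).
        cells = []
        for t in range(self.num_tilings):
            offset = t * self.resolution / self.num_tilings
            index = int((x + offset) // self.resolution)
            # Hash coding: map the (tiling, cell) pair into a fixed table.
            # Collisions are the source of the "hash noise" the paper notes.
            key = f"{t}:{index}".encode()
            slot = int(hashlib.md5(key).hexdigest(), 16) % self.table_size
            cells.append(slot)
        return cells

    def predict(self, x):
        # Output superposition: sum of the weights of the active cells.
        return sum(self.weights[c] for c in self._active_cells(x))

    def train(self, x, target):
        # LMS rule: spread the output error equally over the active cells.
        cells = self._active_cells(x)
        error = target - self.predict(x)
        for c in cells:
            self.weights[c] += self.lr * error / self.num_tilings

# Usage: incrementally learn y = sin(x) on [0, 2*pi].
net = CMAC()
for _ in range(50):
    for i in range(126):
        x = i * 0.05
        net.train(x, math.sin(x))
```

Because training touches only the handful of cells active for each input, each update is cheap and learning is incremental, but an input far from any training data falls on untrained cells, which is the lack of global generalization noted above.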
Keywords :
learning systems; neural nets; CMAC; associative neural network; cerebellar model arithmetic computer; functional representation; hash coding; incremental training; least-mean-square training; output superposition; pattern recognition; robot control; signal processing; Application software; Backpropagation algorithms; Computer networks; Digital arithmetic; Hardware; Least squares approximation; Multi-layer neural network; Neural networks; Signal processing algorithms
Journal_Title :
Proceedings of the IEEE