Title :
Theory and development of higher-order CMAC neural networks
Author :
Lane, Stephen H. ; Handelman, David A. ; Gelfand, Jack J.
Author_Institution :
Dept. of Psychol., Princeton Univ., NJ, USA
Date :
4/1/1992
Abstract :
The cerebellar model articulation controller (CMAC) neural network is capable of learning nonlinear functions extremely quickly due to the local nature of its weight updating. The rectangular shape of CMAC receptive field functions, however, produces discontinuous (staircase) function approximations without inherent analytical derivatives. The ability to learn both functions and function derivatives is important for the development of many online adaptive filter, estimation, and control algorithms. It is shown that the use of B-spline receptive field functions in conjunction with more general CMAC weight addressing schemes allows higher-order CMAC neural networks to be developed that can learn both functions and function derivatives. This also allows hierarchical and multilayer CMAC network architectures to be constructed that can be trained using standard error back-propagation learning techniques.
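Illustrative sketch (not from the paper): a minimal one-dimensional CMAC in which the rectangular receptive fields of the standard formulation are replaced by quadratic B-spline basis functions, so the learned mapping is smooth and locally differentiable. All class and function names are hypothetical, and the weight addressing here is a simple staggered-tiling scheme rather than the authors' more general scheme.

import numpy as np

def quad_bspline(u):
    """Quadratic B-spline on support [0, 3): a smooth (C^1) receptive field shape."""
    if 0.0 <= u < 1.0:
        return 0.5 * u * u
    if 1.0 <= u < 2.0:
        return -u * u + 3.0 * u - 1.5
    if 2.0 <= u < 3.0:
        return 0.5 * (3.0 - u) ** 2
    return 0.0

class HigherOrderCMAC1D:
    """Toy 1-D CMAC with staggered tilings and quadratic B-spline receptive fields."""

    def __init__(self, n_tilings=8, n_tiles=32, x_min=0.0, x_max=1.0, lr=0.2):
        self.n_tilings = n_tilings
        self.x_min = x_min
        self.width = (x_max - x_min) / n_tiles
        self.lr = lr
        # extra cells so splines overhanging the interval ends still have weights
        self.w = np.zeros((n_tilings, n_tiles + 4))

    def _active(self, x):
        """Yield (tiling, weight index, receptive-field value) triples for input x."""
        for t in range(self.n_tilings):
            s = (x - self.x_min) / self.width - t / self.n_tilings
            j = int(np.floor(s))
            for i in (j - 1, j, j + 1):          # three overlapping splines cover x
                yield t, i + 2, quad_bspline(s - (i - 1))

    def predict(self, x):
        # smooth output: weighted sum of B-spline receptive field values
        return sum(self.w[t, k] * v for t, k, v in self._active(x))

    def train(self, x, target):
        """Local LMS-style update: only the weights addressed by x are changed."""
        err = target - self.predict(x)
        for t, k, v in self._active(x):
            self.w[t, k] += self.lr * err * v / self.n_tilings
        return err

# usage: learn sin(2*pi*x) on [0, 1] from random samples
if __name__ == "__main__":
    net = HigherOrderCMAC1D()
    rng = np.random.default_rng(0)
    for _ in range(20000):
        x = rng.uniform(0.0, 1.0)
        net.train(x, np.sin(2.0 * np.pi * x))
    print(net.predict(0.25))   # should approach 1.0

Because the quadratic B-splines in each tiling form a partition of unity, the output varies smoothly with x instead of exhibiting the staircase behavior of rectangular receptive fields; analytical derivatives of the approximation can be obtained by differentiating the spline basis.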
Keywords :
learning systems; neural nets; B-spline receptive field functions; cerebellar model articulation controller; discontinuous function approximations; error back-propagation learning techniques; higher-order CMAC neural networks; local weight updating; nonlinear function learning; receptive field functions; rectangular functions; staircase function approximations; weight addressing schemes; Biological neural networks; Filters; Function approximation; Lifting equipment; Multi-layer neural network; Neural networks; Polynomials; Shape; Spline; Training data;
Journal_Title :
Control Systems, IEEE