Title :
Adaptive multilayer perceptrons with long- and short-term memories
Author :
Lo, James T. ; Bassu, Devasis
Author_Institution :
Dept. of Math. & Stat., Maryland Univ., Baltimore, MD, USA
Date :
1/1/2002 12:00:00 AM
Abstract :
Multilayer perceptrons (MLPs) with long- and short-term memories (LASTMs) are proposed for adaptive processing. The activation functions of the output neurons of such a network are linear, so the weights in the last layer affect the outputs of the network linearly and are called linear weights. These linear weights constitute the short-term memory, and the other weights the long-term memory. It is proven that virtually any function f(x, θ) with an environmental parameter θ can be approximated to any accuracy by an MLP with LASTMs whose long-term memory is independent of θ. This independence from θ allows the long-term memory to be determined in an a priori training phase and allows the online adjustment of only the short-term memory for adapting to the environmental parameter θ. The benefits of using an MLP with LASTMs include less online computation, no poor local extrema to fall into, and much more timely and better adaptation. Numerical examples illustrate that these benefits are realized satisfactorily.
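The adaptation scheme in the abstract can be sketched in a few lines: the nonlinear hidden-layer weights (long-term memory) are fixed after a priori training, and only the linear output weights (short-term memory) are re-fit online by least squares for each new environmental parameter θ. The toy target function, network sizes, and all names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden(x, W, b):
    """Fixed nonlinear feature map (the long-term memory)."""
    return np.tanh(x @ W + b)

# Stand-in for the long-term memory obtained from a priori training.
W = rng.normal(size=(1, 20))
b = rng.normal(size=20)

def adapt_short_term(x, y, W, b):
    """Least-squares fit of the linear output weights only.

    Because the output neurons are linear, this is a convex problem,
    so the online adaptation has no poor local extrema to fall into.
    """
    H = hidden(x, W, b)
    v, *_ = np.linalg.lstsq(H, y, rcond=None)
    return v

# Online adaptation to one environmental parameter theta (toy example):
theta = 0.7
x = rng.uniform(-1.0, 1.0, size=(200, 1))
y = theta * np.sin(3.0 * x[:, 0])        # stand-in for f(x, theta)
v = adapt_short_term(x, y, W, b)         # short-term memory for this theta
pred = hidden(x, W, b) @ v
```

Adapting to a different θ repeats only the cheap linear solve; the long-term memory W, b is untouched, which is the source of the reduced online computation claimed in the abstract.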
Keywords :
adaptive systems; function approximation; learning (artificial intelligence); multilayer perceptrons; transfer functions; LASTMs; MLPs; a priori training; activation functions; adaptive multilayer perceptrons; adaptive processing; environmental parameter; function approximation; linear weights; local extrema; long-term memories; online computation; output neurons; short-term memories; Backpropagation algorithms; Function approximation; Genetic algorithms; Kalman filters; Multilayer perceptrons; Neurons; Nonhomogeneous media; Optimization methods; Simulated annealing;
Journal_Title :
IEEE Transactions on Neural Networks