Title :
Temporal backpropagation for FIR neural networks
Abstract :
The traditional feedforward neural network is a static structure which simply maps input to output. To better reflect the dynamics in a biological system, a network structure which models each synapse by a finite-impulse response (FIR) linear filter is proposed. An efficient gradient descent algorithm, which is shown to be a temporal generalization of the familiar backpropagation algorithm, is derived. By modeling each synapse as a linear filter, the neural network as a whole may be thought of as an adaptive system with its own internal dynamics. Equivalently, one may think of the network as a complex nonlinear filter. Applications should thus include areas of pattern recognition where there is an inherent temporal quality to the data, such as in speech recognition. The networks should also find a natural use in areas of nonlinear control, and in other adaptive signal processing and filtering applications such as noise cancellation or equalization.
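The FIR-synapse idea described in the abstract can be sketched as follows: each connection stores a short vector of filter taps instead of a single scalar weight, so a neuron's activation depends on a window of recent input samples. This is a minimal illustrative sketch only; the function and variable names (`fir_synapse`, `fir_neuron`, `tap_matrix`) are not from the paper.

```python
import numpy as np

def fir_synapse(x_history, taps):
    """Output of one FIR synapse: the dot product of the tap weights
    with the most recent len(taps) input samples (newest first).
    This replaces the single scalar weight of a static network."""
    return float(np.dot(taps, x_history[:len(taps)]))

def fir_neuron(x_histories, tap_matrix, activation=np.tanh):
    """One neuron whose incoming connections are each FIR filters.
    x_histories: (n_inputs, T) array, each row newest-first.
    tap_matrix:  (n_inputs, T) array of filter coefficients.
    The neuron sums the filter outputs, then applies a static
    nonlinearity, giving the network internal temporal dynamics."""
    s = sum(fir_synapse(h, w) for h, w in zip(x_histories, tap_matrix))
    return activation(s)

# Example: two inputs, each with a 3-tap synaptic filter.
rng = np.random.default_rng(0)
hist = rng.standard_normal((2, 3))   # recent input samples per connection
taps = rng.standard_normal((2, 3))   # learnable FIR coefficients
y = fir_neuron(hist, taps)
```

Training such a network with the paper's temporal backpropagation would adapt `tap_matrix` by gradient descent on an error measured over time; the sketch above only shows the forward computation.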
Keywords :
digital filters; learning systems; neural nets; temporal logic; adaptive signal processing; adaptive system; efficient gradient descent algorithm; feedforward neural network; finite-impulse response; nonlinear filter; pattern recognition; speech recognition; supervised learning; synapse; temporal generalization
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/IJCNN.1990.137629