Title :
Nonlinear backpropagation: doing backpropagation without derivatives of the activation function
Author :
Hertz, John ; Krogh, Anders ; Lautrup, Benny ; Lehmann, Torsten
Author_Institution :
Nordita, Copenhagen, Denmark
Date :
11/1/1997
Abstract :
The conventional linear backpropagation algorithm is replaced by a nonlinear version, which avoids the need to calculate the derivative of the activation function. This may be exploited in hardware realizations of neural processors. In this paper we derive the nonlinear backpropagation algorithm in the framework of recurrent backpropagation and present some numerical simulations of feedforward networks on the NetTalk problem. A discussion of implementation in analog very large scale integration (VLSI) electronics concludes the paper.
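The core idea summarized in the abstract can be sketched as follows. In conventional backpropagation the error signal is scaled by the derivative g'(h) of the activation function; a derivative-free variant can instead pass the error back through the activation function itself, since g(h + e) - g(h) approximates g'(h)·e for small e. This is an illustrative sketch of that substitution, not necessarily the exact update rule derived in the paper; the function names are ours:

```python
import math

def g(x):
    # Forward activation (tanh, a common choice)
    return math.tanh(x)

def delta_linear(h, err):
    # Conventional backprop: needs the derivative g'(h) = 1 - tanh(h)**2
    return (1.0 - math.tanh(h) ** 2) * err

def delta_nonlinear(h, err):
    # Derivative-free variant: reuses the forward nonlinearity itself;
    # for small err, g(h + err) - g(h) ~ g'(h) * err
    return g(h + err) - g(h)

h, err = 0.5, 0.01
print(delta_linear(h, err))     # derivative-based error signal
print(delta_nonlinear(h, err))  # close to the above for small err
```

In analog hardware this is attractive because the same circuit that computes the forward activation can be reused for the backward pass, removing the need for a separate derivative circuit.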
Keywords :
VLSI; analogue integrated circuits; backpropagation; feedforward neural nets; neural chips; nonlinear network synthesis; recurrent neural nets; transfer functions; NetTalk problem; activation function; analog VLSI; feedforward networks; neural processors; nonlinear backpropagation; nonlinear gradient descent; recurrent backpropagation; Backpropagation algorithms; Biomedical optical imaging; Equations; Hardware; Neural networks; Neurons; Numerical simulation; Read only memory; Recurrent neural networks; Very large scale integration
Journal_Title :
IEEE Transactions on Neural Networks