Title :
Generating network trajectories using gradient descent in state space
Author :
Hahnloser, Richard H R
Author_Institution :
Inst. for Theor. Phys., Eidgenössische Tech. Hochschule, Zürich, Switzerland
Abstract :
A simple, local learning algorithm is introduced that gradually minimizes an error function defined on the neural states of a general network. Unlike standard backpropagation algorithms, it is based on linearizing the neurodynamics, which are interpreted as constraints on the different network variables. From the resulting equations, a weight update is deduced that has minimal norm and produces state changes directed precisely toward the target values. As an application, it is shown how to generate desired state-space curves on recurrent Hopfield-type networks.
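The idea described in the abstract can be sketched concretely. The following is an illustrative toy example only, not the paper's exact algorithm: for a recurrent network x_{t+1} = tanh(W x_t), the linearized dynamics at each step give the constraint dW x = du, where du is the pre-activation change needed to move the next state onto its target; the minimal-norm solution of that constraint is the rank-one outer product dW = du xᵀ / (xᵀx). The network size, trajectory length, and random target curve below are all assumptions for the demonstration.

```python
import numpy as np

# Illustrative sketch (not the paper's exact algorithm): train a recurrent
# network x_{t+1} = tanh(W x_t) to retrace a desired state-space curve by
# repeatedly applying the minimal-norm weight update that solves the
# linearized constraint dW x = du at each time step.

rng = np.random.default_rng(0)
n, T = 20, 6                                   # network size, trajectory length (assumed)
W = 0.1 * rng.standard_normal((n, n))
targets = rng.uniform(-0.9, 0.9, size=(T, n))  # desired state trajectory (assumed)

for epoch in range(300):
    for t in range(1, T):
        x = targets[t - 1]                     # clamp the state onto the trajectory
        u = W @ x                              # current pre-activation
        du = np.arctanh(targets[t]) - u        # pre-activation change needed to hit the target
        W += np.outer(du, x) / (x @ x)         # minimal-norm solution of dW x = du

# Free-running check: starting from the first target, the trained
# network should reproduce the whole curve on its own.
x = targets[0].copy()
err = 0.0
for t in range(1, T):
    x = np.tanh(W @ x)
    err = max(err, float(np.abs(x - targets[t]).max()))
print(err)
```

Each update is a rank-one correction that exactly satisfies the current step's linearized constraint while changing the weights as little as possible; cycling over the time steps is then a Kaczmarz-style iteration that, for a short trajectory like this one, converges to a weight matrix reproducing the curve.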
Keywords :
Hopfield neural nets; correlation methods; learning (artificial intelligence); state-space methods; Hopfield-type networks; gradient descent method; learning algorithm; network trajectories; neurodynamics; recurrent neural networks; state space curves; tangential correlation algorithm; weight update; Backpropagation algorithms; Computer architecture; Computer networks; Intelligent networks; Neural networks; Neurodynamics; Neurons; Nonlinear equations; Physics; State-space methods
Conference_Titel :
1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence
Conference_Location :
Anchorage, AK
Print_ISBN :
0-7803-4859-1
DOI :
10.1109/IJCNN.1998.687233