Title :
Convergence analysis for a class of neural networks
Author :
Polycarpou, Marios M. ; Ioannou, Petros A.
Author_Institution :
Dept. of Electr. Eng.-Syst., Univ. of Southern California, Los Angeles, CA
Abstract :
Summary form only given, as follows. The authors consider the convergence issues that arise when backpropagation algorithms are applied to a special class of neural network architectures, referred to as structured networks, which are used to solve matrix algebra problems. They develop bounds on the learning rate under which exponential convergence of the training procedure is guaranteed, and they investigate methods for improving the rate of convergence. For a special class of problems, they introduce the orthogonalized backpropagation algorithm, an optimal recursive update law for minimizing a least-squares cost functional that guarantees exact convergence in one epoch. The results provide valuable insight into neural network learning and unify certain learning procedures used by connectionists and adaptive control theorists.
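The learning-rate bound and the one-epoch exact solution can be illustrated with a minimal sketch (this is an assumed toy setup, not the authors' formulation): a linear "network" W is trained to solve A W = B by gradient descent on a least-squares cost, with the step size chosen below 1/λ_max(AᵀA) so the error contracts geometrically; the pseudoinverse solution plays the role of the exact one-shot answer.

```python
import numpy as np

# Toy illustration (assumed setup): fit W in A @ W = B by gradient descent
# on the least-squares cost J(W) = 0.5 * ||A @ W - B||_F^2.
rng = np.random.default_rng(0)

# Build A with known singular values so its conditioning is controlled.
U, _ = np.linalg.qr(rng.standard_normal((6, 4)))
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = U @ np.diag([2.0, 1.5, 1.0, 0.5]) @ V.T
B = rng.standard_normal((6, 3))

# Exponential convergence holds when eta < 2 / lambda_max(A^T A);
# here we take eta = 1 / lambda_max, safely inside that bound.
eta = 1.0 / np.linalg.eigvalsh(A.T @ A).max()

W = np.zeros((4, 3))
for _ in range(500):
    W -= eta * A.T @ (A @ W - B)   # gradient of J(W)

# Exact least-squares minimizer in one shot via the pseudoinverse,
# analogous in spirit to exact convergence in a single epoch.
W_star = np.linalg.pinv(A) @ B
print(np.allclose(W, W_star, atol=1e-8))  # → True
```

With the step size inside the bound, the residual after each iteration is multiplied by factors (1 − η λ_i), all strictly less than one, which is the geometric (exponential) convergence the abstract refers to.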
Keywords :
convergence; learning systems; matrix algebra; neural nets; exact convergence; exponential convergence; learning; learning rate; least-squares cost functional; neural networks; optimal recursive update law; orthogonalized backpropagation algorithm; rate of convergence; structured networks; training procedure; Adaptive control; Backpropagation algorithms; Convergence; Cost function; Matrices; Neural networks;
Conference_Titel :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155605