DocumentCode :
1442104
Title :
A general methodology for designing globally convergent optimization neural networks
Author :
Xia, Youshen ; Wang, Jun
Author_Institution :
Dept. of Mech. & Autom. Eng., Chinese Univ. of Hong Kong, Shatin, Hong Kong
Volume :
9
Issue :
6
fYear :
1998
fDate :
11/1/1998
Firstpage :
1331
Lastpage :
1343
Abstract :
We present a general methodology for designing optimization neural networks. We prove that neural networks constructed by the proposed method are guaranteed to be globally convergent to solutions of problems with bounded or unbounded solution sets, in contrast with gradient methods, whose convergence is not guaranteed. We show that the proposed method contains both the gradient methods and the nongradient methods employed in existing optimization neural networks as special cases. Based on the theoretical results of the proposed method, we study the convergence and stability of general gradient models in the case of nonisolated solutions. Using the proposed method, we derive new neural network models for a very large class of optimization problems, in which the equilibrium points correspond to exact solutions and there is no adjustable parameter. Finally, numerical examples show the effectiveness of the method.
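As a rough illustration of the kind of dynamical system the abstract describes (not the paper's exact model), the following sketch simulates a projection-type gradient flow dx/dt = P(x - ∇f(x)) - x for a small box-constrained quadratic program; all problem data (Q, c, the box bounds, step size) are made up for this example. Equilibria of such a flow coincide with solutions of the constrained problem, and no tunable penalty parameter is needed.

```python
import numpy as np

# Illustrative example only: minimize f(x) = 0.5 x'Qx + c'x  s.t.  0 <= x <= 1,
# via the projection-type gradient dynamics  dx/dt = P(x - grad f(x)) - x,
# integrated with a simple forward-Euler scheme.
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -4.0])
lo, hi = 0.0, 1.0

def grad(x):
    """Gradient of the quadratic objective."""
    return Q @ x + c

def project(x):
    """Projection onto the feasible box [lo, hi]^2."""
    return np.clip(x, lo, hi)

x = np.array([0.2, 0.2])   # arbitrary initial state
h = 0.05                   # Euler step size
for _ in range(2000):
    x = x + h * (project(x - grad(x)) - x)

print(np.round(x, 4))      # state settles at the constrained solution [1. 1.]
```

At the equilibrium x = (1, 1), one checks that x = P(x - ∇f(x)), so the rest point of the network is exactly the solution of the constrained problem, mirroring the exact-solution property claimed in the abstract.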
Keywords :
convergence; gradient methods; optimisation; recurrent neural nets; general design methodology; globally convergent optimization neural networks; nongradient methods; Application software; Computer networks; Design methodology; Design optimization; Function approximation; Gradient methods; Linear programming; Neural networks; Optimization methods; Signal processing algorithms
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.728383
Filename :
728383