DocumentCode :
1031954
Title :
Second-order neural nets for constrained optimization
Author :
Zhang, Shengwei ; Zhu, Xianing ; Zou, Li-He
Author_Institution :
ExperVision Inc., San Jose, CA, USA
Volume :
3
Issue :
6
fYear :
1992
fDate :
11/1/1992
Firstpage :
1021
Lastpage :
1024
Abstract :
Analog neural nets for constrained optimization are proposed as an analogue of Newton's algorithm in numerical analysis. The neural model is globally stable and converges to the constrained stationary points. Nonlinear neurons are introduced into the net, making it possible to solve optimization problems in which the variables take discrete values, i.e., combinatorial optimization problems.
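Illustration :
The record gives no implementation details beyond the abstract. The sketch below is only a minimal illustration of the general idea the abstract names, namely continuous-time Newton-type dynamics driven toward a constrained stationary point of a Lagrangian; the toy problem, the explicit Euler integration, and all function names are assumptions for illustration, not the authors' formulation.

```python
import numpy as np

# Toy equality-constrained problem (assumed, not from the paper):
#   minimize  f(x) = x1^2 + x2^2   subject to  h(x) = x1 + x2 - 1 = 0
# Lagrangian L(x, lam) = f(x) + lam * h(x); a constrained stationary
# point satisfies grad L = 0.

def kkt_residual(z):
    """First-order optimality conditions F(z) = grad L(x, lam)."""
    x1, x2, lam = z
    return np.array([2.0 * x1 + lam,      # dL/dx1
                     2.0 * x2 + lam,      # dL/dx2
                     x1 + x2 - 1.0])      # dL/dlam = h(x)

def kkt_jacobian(z):
    """Jacobian of F (the KKT matrix); constant for this quadratic problem."""
    return np.array([[2.0, 0.0, 1.0],
                     [0.0, 2.0, 1.0],
                     [1.0, 1.0, 0.0]])

# Continuous-time Newton flow  dz/dt = -J(z)^{-1} F(z),
# integrated here with a simple explicit Euler step.
z = np.array([2.0, -1.0, 0.0])   # arbitrary starting point (x1, x2, lam)
dt = 0.1
for _ in range(200):
    z = z - dt * np.linalg.solve(kkt_jacobian(z), kkt_residual(z))

print("x* ~", z[:2], " lambda* ~", z[2])   # expected: x* = (0.5, 0.5), lambda* = -1
```

For this quadratic toy problem the flow converges to the unique constrained stationary point x* = (0.5, 0.5); the paper's contribution is an analog neural-network realization of such second-order dynamics, including nonlinear neurons for discrete-valued (combinatorial) variables.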
Keywords :
mathematics computing; neural nets; numerical analysis; optimisation; combinatorial optimization; constrained optimization; constrained stationary points; neural model; nonlinear neurons; second order neural nets; Constraint optimization; Differential equations; Hopfield neural networks; Lagrangian functions; Neural networks; Neurons; Numerical analysis; Simulated annealing; Subspace constraints; Very large scale integration
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.165605
Filename :
165605