DocumentCode :
1507876
Title :
A One-Layer Recurrent Neural Network for Constrained Nonsmooth Optimization
Author :
Qingshan Liu ; Jun Wang
Author_Institution :
Sch. of Autom., Southeast Univ., Nanjing, China
Volume :
41
Issue :
5
fYear :
2011
Firstpage :
1323
Lastpage :
1333
Abstract :
This paper presents a novel one-layer recurrent neural network, modeled by a differential inclusion, for solving nonsmooth optimization problems; the number of neurons in the proposed network equals the number of decision variables of the optimization problem. Compared with existing neural networks for nonsmooth optimization, the global convexity condition on the objective functions and constraints is relaxed, allowing both to be nonconvex. It is proven that the state variables of the proposed neural network converge to optimal solutions if a single design parameter in the model is larger than a derived lower bound. Numerical examples with simulation results substantiate the effectiveness and illustrate the characteristics of the proposed neural network.
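The abstract's idea of driving a neuron state toward a constrained minimizer via a penalized subgradient flow, with convergence guaranteed once a single design parameter exceeds a lower bound, can be illustrated with a minimal sketch. This is not the authors' exact model: the problem instance (minimize |x| subject to x >= 1), the exact-penalty formulation, and the names `sigma`, `h`, and `penalized_dynamics` are illustrative assumptions, and the continuous-time inclusion is approximated by a simple Euler discretization.

```python
import numpy as np

def subgrad_abs(x):
    # one element of the subdifferential of |x| (0 is chosen at x = 0)
    return float(np.sign(x))

def subgrad_penalty(x, lo):
    # one element of the subdifferential of the penalty max(0, lo - x):
    # -1 where the constraint x >= lo is violated, 0 otherwise
    return -1.0 if x < lo else 0.0

def penalized_dynamics(x0, sigma=2.0, h=0.01, steps=5000):
    """Euler discretization of the subgradient flow
        dx/dt in -( d|x| + sigma * d max(0, 1 - x) ),
    a toy analogue of a one-neuron penalized dynamical system.
    For sigma > 1 (the "lower bound" in this toy instance), the
    state is driven to the constrained minimizer x* = 1."""
    x = x0
    for _ in range(steps):
        x = x - h * (subgrad_abs(x) + sigma * subgrad_penalty(x, 1.0))
    return x
```

With `sigma = 2.0` the trajectory settles near x* = 1 from either side (up to an oscillation of order `h`, as expected of a fixed-step subgradient scheme); with `sigma < 1` the penalty is too weak and the state drifts past the feasible set, mirroring the paper's requirement that the design parameter exceed a derived bound.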
Keywords :
convex programming; decision theory; mathematics computing; neural nets; constrained nonsmooth optimization; decision variables; differential inclusion; global convexity condition; neuron; objective function; one-layer recurrent neural network; Convergence; Lyapunov methods; Numerical models; Optimization; Recurrent neural networks; Trajectory; Lyapunov function; nonsmooth optimization; Computer Simulation; Neural Networks (Computer); Nonlinear Dynamics
fLanguage :
English
Journal_Title :
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Publisher :
IEEE
ISSN :
1083-4419
Type :
jour
DOI :
10.1109/TSMCB.2011.2140395
Filename :
5759759