Title :
Qualitative analysis of a recurrent neural network for nonlinear continuously differentiable convex minimization over a nonempty closed convex subset
Author_Institution :
Dept. of Electr. & Comput. Eng., Delaware Univ., Newark, DE, USA
Date :
11/1/2001 12:00:00 AM
Abstract :
We investigate the qualitative properties of a recurrent neural network (RNN) for minimizing a nonlinear, continuously differentiable, convex objective function over any given nonempty, closed, convex subset, which may be bounded or unbounded, by exploiting some key inequalities from mathematical programming. The global existence and boundedness of the solution of the RNN are proved when the objective function is convex and has a nonempty constrained minimum set. Under the same assumption, the RNN is shown to be globally convergent in the sense that every trajectory of the RNN converges to some equilibrium point of the RNN. If the objective function is uniformly convex and its gradient is a locally Lipschitz continuous mapping, then the RNN is globally exponentially convergent in the sense that every trajectory converges exponentially to the unique equilibrium point of the RNN. These qualitative properties make the network model well suited for convex minimization over any given nonempty, closed, convex subset, whether or not the constraint set is bounded.
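The dynamics described in the abstract can be illustrated with a projection-type recurrent network, a common model for this class of problems: dx/dt = P_Ω(x − α∇f(x)) − x, whose equilibria coincide with the constrained minimizers. The sketch below is an assumption-laden illustration, not the paper's exact network: the objective (a convex quadratic), the box constraint set, the step sizes, and the forward-Euler simulation are all chosen for demonstration.

```python
import numpy as np

# Illustrative target: f(x) = 0.5 * ||x - c||^2 with c outside the
# feasible set, so the constrained minimizer lies on the boundary.
c = np.array([2.0, -3.0])

def grad_f(x):
    # Gradient of the convex quadratic objective above.
    return x - c

def project_box(x, lo=-1.0, hi=1.0):
    # Projection onto the nonempty, closed, convex set Omega = [lo, hi]^n.
    return np.clip(x, lo, hi)

def simulate(x0, alpha=0.5, dt=0.1, steps=2000):
    # Forward-Euler simulation of dx/dt = P_Omega(x - alpha*grad_f(x)) - x.
    # An equilibrium satisfies x = P_Omega(x - alpha*grad_f(x)), which
    # characterizes a minimizer of f over Omega.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (project_box(x - alpha * grad_f(x)) - x)
    return x

x_star = simulate([0.0, 0.0])
# The trajectory settles at the projection of c onto the box: [1, -1].
```

Here the objective is uniformly (strongly) convex with a Lipschitz gradient, so, consistent with the exponential-convergence result stated above, the simulated trajectory approaches the unique equilibrium geometrically.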
Keywords :
convergence; mathematical programming; minimisation; recurrent neural nets; set theory; global exponential convergence; locally Lipschitz continuous mapping; mathematical programming; nonempty closed convex subset; nonempty constrained minimum set; nonlinear continuously differentiable convex minimization; qualitative analysis; qualitative properties; recurrent neural network; Artificial neural networks; Linear programming; Mathematical programming; Quadratic programming; Recurrent neural networks; Trajectory;
Journal_Title :
Neural Networks, IEEE Transactions on