Abstract:
By redefining the multiplier associated with each inequality constraint as a positive definite function of the originally defined multiplier, say $u_i^2$, $i = 1, 2, \dots, m$, the nonnegativity constraints imposed on these multipliers in the Karush-Kuhn-Tucker necessary conditions are removed completely. Hence it is no longer necessary to convert inequality constraints into equality constraints via slack variables in order to reuse results that apply only to equality constraints. Utilizing this technique, improved Lagrange nonlinear programming neural networks are devised that handle inequality constraints directly, without adding slack variables. The local stability of the proposed Lagrange neural networks is then analyzed rigorously with Lyapunov's first approximation principle, and their convergence is examined in depth with LaSalle's invariance principle. Finally, an illustrative example shows that the proposed neural networks can effectively solve nonlinear programming problems.
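To make the substitution concrete, the following minimal Python sketch applies the idea to a toy one-dimensional problem. The gradient-flow dynamics (descent in the decision variable, ascent in the multiplier), the problem data, the initial state, and the Euler step size are all illustrative assumptions, not the paper's exact network equations.

```python
# Minimal sketch of the multiplier-substitution idea on a toy problem
# (assumed gradient-flow dynamics, not the paper's exact network equations):
#   minimize f(x) = (x - 2)^2   subject to   g(x) = x - 1 <= 0.
# Substituted Lagrangian: L(x, u) = f(x) + u^2 * g(x), so the effective
# multiplier u^2 is nonnegative by construction and no slack variable is used.

def dL_dx(x, u):
    # dL/dx = f'(x) + u^2 * g'(x), with f'(x) = 2(x - 2) and g'(x) = 1
    return 2.0 * (x - 2.0) + u ** 2

def dL_du(x, u):
    # dL/du = 2 u g(x); it vanishes when complementary slackness u^2 g(x) = 0 holds
    return 2.0 * u * (x - 1.0)

x, u = 0.0, 1.0   # arbitrary initial state (assumed)
dt = 1e-3         # Euler step size (illustrative choice)
for _ in range(50_000):
    # descent in x, ascent in u; simultaneous update via tuple assignment
    x, u = x - dt * dL_dx(x, u), u + dt * dL_du(x, u)

print(f"x ~ {x:.4f}, effective multiplier u^2 ~ {u ** 2:.4f}")
# Expected equilibrium: x = 1 (constraint active), u^2 = 2, the KKT point of the toy problem.
```

In this sketch the nonnegativity of the effective multiplier $u^2$ never has to be enforced, which is precisely the point of the substitution; the trajectory settles at the KKT point of the toy problem, in line with the stability and convergence claims of the abstract.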
Keywords:
Lyapunov methods; constraint theory; invariance; neural networks; nonlinear programming; stability; stability analysis; convergence; Karush-Kuhn-Tucker necessary conditions; LaSalle invariance principle; Lyapunov first approximation principle; Lagrange nonlinear programming neural networks; Lagrangian functions; equality constraints; inequality constraints; nonnegativity constraints; positive definite function; circuit stability; functional programming; information science; linear programming; quadratic programming; recurrent neural networks