Title :
Trajectory following optimization by gradient transformation differential equations
Author :
Grantham, Walter I.
Author_Institution :
Sch. of Mech. & Mater. Eng., Washington State Univ., Pullman, WA, USA
Abstract :
For minimizing a scalar-valued function, we develop and investigate a family of gradient transformation differential equation algorithms. This family includes, as special cases: steepest descent, Newton's method, Levenberg-Marquardt, and a gradient-enhanced Newton algorithm that we develop. Using Rosenbrock's "banana" function, we study the stiffness of the gradient transformation family in terms of Lyapunov exponent time histories. For the example function, Newton's method and the Levenberg-Marquardt modification do not yield global asymptotic stability, whereas steepest descent does. However, Newton's method (from an initial point where it does work) is not stiff and is approximately 100 times as fast as steepest descent. In contrast, the gradient-enhanced Newton method is globally convergent, is not stiff, and is approximately 25 times faster than Newton's method and approximately 2500 times faster than steepest descent.
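The following Python sketch (not part of the original record, and not the paper's implementation) illustrates the trajectory-following idea described in the abstract: the minimizer is reached by integrating a differential equation dx/dt = -P(x) grad f(x), where the transformation matrix P(x) selects the family member. The Rosenbrock test function and the steepest-descent, Newton, and Levenberg-Marquardt cases follow the abstract; the paper's gradient-enhanced Newton transformation is not reproduced here, and the choice of SciPy's LSODA integrator, the starting point, and the integration interval are assumptions of the sketch.

```python
# Minimal sketch of trajectory-following optimization: integrate
#   dx/dt = -P(x) * grad f(x)
# where P(x) picks the member of the gradient transformation family:
#   P = I                 -> steepest descent
#   P = H(x)^{-1}         -> Newton flow (H = Hessian)
#   P = (H + mu*I)^{-1}   -> Levenberg-Marquardt-type modification
# Rosenbrock's "banana" function is used as the test problem, as in the paper.
import numpy as np
from scipy.integrate import solve_ivp

def rosenbrock_grad(x):
    """Gradient of f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2."""
    x1, x2 = x
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

def rosenbrock_hess(x):
    """Hessian of the Rosenbrock function."""
    x1, x2 = x
    return np.array([[1200.0 * x1**2 - 400.0 * x2 + 2.0, -400.0 * x1],
                     [-400.0 * x1, 200.0]])

def flow(t, x, mode="steepest", mu=1.0):
    """Right-hand side dx/dt = -P(x) grad f(x) for the chosen transformation."""
    g = rosenbrock_grad(x)
    if mode == "steepest":
        return -g
    H = rosenbrock_hess(x)
    if mode == "levenberg_marquardt":
        H = H + mu * np.eye(2)
    return -np.linalg.solve(H, g)

# Integrate the (globally convergent, but stiff) steepest-descent flow.
# LSODA switches to a stiff integrator automatically when needed.
sol = solve_ivp(flow, (0.0, 200.0), np.array([-1.2, 1.0]),
                args=("steepest",), method="LSODA", rtol=1e-8, atol=1e-10)
print("final point:", sol.y[:, -1])  # should approach the minimizer near (1, 1)
```

Passing args=("newton",) instead selects the Newton flow, which, per the abstract, converges much faster but only from initial points where it works.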
Keywords :
Lyapunov methods; Newton method; asymptotic stability; differential equations; gradient methods; optimisation; Levenberg-Marquardt method; Lyapunov exponent; Newton's method; Rosenbrock's function; differential equations; global asymptotic stability; gradient-enhanced Newton algorithm; gradient transformation; scalar-valued function; steepest descent; trajectory following optimization; Algorithm design and analysis; Control systems; Design optimization; Differential equations; History; Linear systems; Newton method; Nonlinear systems; Optimal control; Testing
Conference_Title :
Decision and Control, 2003. Proceedings. 42nd IEEE Conference on
Print_ISBN :
0-7803-7924-1
DOI :
10.1109/CDC.2003.1272512