DocumentCode
300686
Title
Control system analysis and design upon the Lyapunov method
Author
Lyashevskiy, Sergey; Meyer, Andrew U.
Author_Institution
Dept. of Electr. Eng., Purdue Univ., Indianapolis, IN, USA
Volume
5
fYear
1995
fDate
21-23 Jun 1995
Firstpage
3219
Abstract
The objectives of the present paper are the stability analysis of nonlinear continuous-time systems and the application of Lyapunov's second method to the optimization problem. Based on Lyapunov's method and Hamilton-Jacobi-Bellman theory, we outline procedures that can be used to unify results in stability analysis and system design. For nonlinear time-varying dynamical systems, we show how the nonlinear optimization problem can be solved. Four illustrative examples are considered in detail to demonstrate the advantages of the given analysis and control methodologies.
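For context on the tools the abstract invokes, stated here in generic textbook form as a sketch rather than as the paper's own formulation: Lyapunov's second method gives sufficient conditions for asymptotic stability of $\dot{x}(t) = f(x,t)$ with equilibrium $f(0,t) = 0$, namely the existence of a function $V(x,t)$ satisfying
\[
V(x,t) > 0 \ \text{for } x \neq 0, \qquad \dot{V}(x,t) = \frac{\partial V}{\partial t} + \frac{\partial V}{\partial x}\, f(x,t) < 0,
\]
while for the optimization problem with cost $J = \int_{t_0}^{t_f} L(x,u,\tau)\, d\tau$ subject to $\dot{x} = f(x,u,t)$, the Hamilton-Jacobi-Bellman equation characterizes the optimal cost-to-go $V$:
\[
-\frac{\partial V}{\partial t} = \min_{u} \Big[ L(x,u,t) + \frac{\partial V}{\partial x}\, f(x,u,t) \Big].
\]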
Keywords
Lyapunov methods; continuous time systems; control system analysis; control system synthesis; nonlinear control systems; optimisation; stability; time-varying systems; Hamilton-Jacobi-Bellman theory; Lyapunov method; control system analysis; nonlinear continuous-time systems; optimization; stability; time-varying dynamical systems; Control system analysis; Control system synthesis; Control systems; Lyapunov method; Nonlinear equations; Optimal control; Performance analysis; Stability analysis; Sufficient conditions; System analysis and design
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the 1995 American Control Conference
Conference_Location
Seattle, WA
Print_ISBN
0-7803-2445-5
Type
conf
DOI
10.1109/ACC.1995.532197
Filename
532197