DocumentCode :
2564592
Title :
Dynamic solution of the HJB equation and the optimal control of nonlinear systems
Author :
Sassano, M. ; Astolfi, A.
Author_Institution :
Dept. of Electr. & Electron. Engineering, Imperial Coll. London, London, UK
fYear :
2010
fDate :
15-17 Dec. 2010
Firstpage :
3271
Lastpage :
3276
Abstract :
Optimal control problems are often solved by exploiting the solution of the so-called Hamilton-Jacobi-Bellman (HJB) partial differential equation, which may, however, be hard or impossible to obtain in specific examples. Herein we circumvent this issue by determining a dynamic solution of the HJB equation, without solving any partial differential equation. The methodology yields a dynamic control law that minimizes a cost functional defined as the sum of the original cost and an additional cost.
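For context, the HJB equation mentioned in the abstract takes, for a control-affine system and a quadratic control penalty, the following standard infinite-horizon form (the notation below is generic textbook notation, not taken from the paper itself):

```latex
% Standard infinite-horizon HJB equation for the control-affine system
%   \dot{x} = f(x) + g(x)\,u
% with cost functional
%   J = \int_0^{\infty} \bigl( q(x) + u^{\top} R\, u \bigr)\, dt,  R \succ 0.
% (Generic notation; the paper instead constructs a dynamic solution
%  without solving this PDE.)
\[
  \frac{\partial V}{\partial x}(x)\, f(x)
  \;-\; \frac{1}{4}\,\frac{\partial V}{\partial x}(x)\, g(x)\, R^{-1} g(x)^{\top}
        \frac{\partial V}{\partial x}(x)^{\top}
  \;+\; q(x) \;=\; 0,
\]
\[
  u^{*}(x) \;=\; -\frac{1}{2}\, R^{-1} g(x)^{\top}
                 \frac{\partial V}{\partial x}(x)^{\top}.
\]
```

Solving this first-order nonlinear PDE for the value function $V$ is what is typically intractable for nonlinear systems, which motivates the dynamic (PDE-free) solution proposed in the paper.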
Keywords :
nonlinear control systems; optimal control; partial differential equations; HJB equation; Hamilton-Jacobi-Bellman; dynamic control law; Approximation methods; Nonlinear systems; Optimal control; Oscillators; Partial differential equations; Riccati equations;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Decision and Control (CDC), 2010 49th IEEE Conference on
Conference_Location :
Atlanta, GA
ISSN :
0743-1546
Print_ISBN :
978-1-4244-7745-6
Type :
conf
DOI :
10.1109/CDC.2010.5716990
Filename :
5716990