Title :
Dynamic solution of the HJB equation and the optimal control of nonlinear systems
Author :
Sassano, M. ; Astolfi, A.
Author_Institution :
Dept. of Electr. & Electron. Engineering, Imperial Coll. London, London, UK
Abstract :
Optimal control problems are often solved by exploiting the solution of the so-called Hamilton-Jacobi-Bellman (HJB) partial differential equation, which may, however, be hard or impossible to solve in specific examples. Herein we circumvent this issue by determining a dynamic solution of the HJB equation, without solving any partial differential equation. The methodology yields a dynamic control law that minimizes a cost functional defined as the sum of the original cost and an additional cost.
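For reference, a standard form of the HJB equation alluded to in the abstract (the specific system class and cost below are assumed, as they are not given in this record) is, for an input-affine system $\dot{x} = f(x) + g(x)u$ with cost $\int_0^\infty \big( q(x) + u^\top R\, u \big)\, dt$:

\[
\frac{\partial V}{\partial x}(x)\, f(x)
- \frac{1}{4}\, \frac{\partial V}{\partial x}(x)\, g(x) R^{-1} g(x)^\top \frac{\partial V}{\partial x}(x)^\top
+ q(x) = 0,
\qquad
u^*(x) = -\frac{1}{2} R^{-1} g(x)^\top \frac{\partial V}{\partial x}(x)^\top ,
\]

where $V$ is the value function. Solving this PDE for $V$ is what the paper avoids by constructing a dynamic solution instead.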
Keywords :
nonlinear control systems; optimal control; partial differential equations; HJB equation; Hamilton-Jacobi-Bellman; dynamic control law; Approximation methods; Nonlinear systems; Optimal control; Oscillators; Partial differential equations; Riccati equations;
Conference_Title :
49th IEEE Conference on Decision and Control (CDC), 2010
Conference_Location :
Atlanta, GA
Print_ISBN :
978-1-4244-7745-6
DOI :
10.1109/CDC.2010.5716990