Title :
Hamiltonian systems, HJB equations, and stochastic controls
Author_Institution :
Dept. of Syst. Eng. & Eng. Manage., Chinese Univ. of Hong Kong, Shatin, Hong Kong
Abstract :
Pontryagin's maximum principle (MP), involving the Hamiltonian system, and Bellman's dynamic programming (DP), involving the HJB equation, are the two most important approaches in modern optimal control theory. However, these two approaches have been developed separately in the literature, and it has been a long-standing yet fundamentally important problem to disclose the relationship between them and to establish a unified theory. The problem is by no means a "new" one; indeed, its roots lie in the Hamilton-Jacobi theory of analytic mechanics and the method of characteristics in classical PDE theory, and it is intrinsically related to the Feynman-Kac formula in stochastic analysis and shadow price theory in economics. This paper discusses some deep connections between the MP and DP in stochastic optimal control from various aspects.
Keywords :
dynamic programming; maximum principle; partial differential equations; stochastic systems; Bellman's dynamic programming; HJB equations; Hamiltonian systems; Pontryagin's maximum principle; modern optimal control theory; stochastic optimal controls; Calculus; Control systems; Differential equations; Dynamic programming; Jacobian matrices; Optimal control; Research and development management; Stochastic processes; Stochastic systems; Systems engineering and theory;
Conference_Titel :
Proceedings of the 36th IEEE Conference on Decision and Control, 1997
Conference_Location :
San Diego, CA
Print_ISBN :
0-7803-4187-2
DOI :
10.1109/CDC.1997.652379