• DocumentCode
    319981
  • Title
    Hamiltonian systems, HJB equations, and stochastic controls
  • Author
    Zhou, Xun Yu
  • Author_Institution
    Dept. of Syst. Eng. & Eng. Manage., Chinese Univ. of Hong Kong, Shatin, Hong Kong
  • Volume
    4
  • fYear
    1997
  • fDate
    10-12 Dec 1997
  • Firstpage
    3436
  • Abstract
    Pontryagin's maximum principle (MP), involving the Hamiltonian system, and Bellman's dynamic programming (DP), involving the HJB equation, are the two most important approaches in modern optimal control theory. However, these two approaches have been developed separately in the literature, and it has been a long-standing yet fundamentally important problem to disclose the relationship between them and to establish a unified theory. The problem is by no means a "new" one; indeed, it is rooted in the Hamilton-Jacobi theory of analytic mechanics and the method of characteristics in classical PDE theory, and it is intrinsically related to the Feynman-Kac formula in stochastic analysis and shadow price theory in economics. This paper discusses some deep connections between the MP and DP in stochastic optimal control from various aspects.
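    For context, a minimal LaTeX sketch of the classical MP-DP connection the abstract alludes to; this is not taken from the paper itself, follows one common sign convention (conventions vary across references), and assumes the value function V is smooth along the optimal pair:
    % Controlled diffusion: dX = b(t,X,u) dt + \sigma(t,X,u) dW, X(s) = x;
    % maximize J = E[ \int_s^T f(t,X,u) dt + h(X(T)) ].
    % DP side: the value function V solves the HJB equation
    \[
      V_t + \sup_{u}\Big\{ \langle b, V_x \rangle
        + \tfrac{1}{2}\operatorname{tr}\!\big(\sigma\sigma^{\top} V_{xx}\big) + f \Big\} = 0,
      \qquad V(T,x) = h(x).
    \]
    % MP side: with Hamiltonian H(t,x,u,p,q) = \langle p, b \rangle
    % + \operatorname{tr}(q^{\top}\sigma) + f, the adjoint pair (p,q) solves the
    % backward SDE  dp = -H_x dt + q dW,  p(T) = h_x(X^*(T)).
    % Connection along an optimal pair (X^*, u^*), when V is C^{1,2}:
    \[
      p(t) = V_x\big(t, X^*(t)\big), \qquad
      q(t) = V_{xx}\big(t, X^*(t)\big)\,\sigma\big(t, X^*(t), u^*(t)\big).
    \]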
  • Keywords
    dynamic programming; maximum principle; partial differential equations; stochastic systems; Bellman's dynamic programming; HJB equations; Hamiltonian systems; Pontryagin's maximum principle; modern optimal control theory; stochastic optimal controls; Calculus; Control systems; Differential equations; Dynamic programming; Jacobian matrices; Optimal control; Research and development management; Stochastic processes; Stochastic systems; Systems engineering and theory
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 36th IEEE Conference on Decision and Control, 1997
  • Conference_Location
    San Diego, CA
  • ISSN
    0191-2216
  • Print_ISBN
    0-7803-4187-2
  • Type
    conf
  • DOI
    10.1109/CDC.1997.652379
  • Filename
    652379