Title :
On the relation of reachability to minimum cost optimal control
Author_Institution :
Dept. of Eng., Cambridge Univ., UK
Abstract :
Questions of reachability for continuous and hybrid systems can be formulated as optimal control or game theory problems, whose solutions can be characterised using variants of the Hamilton-Jacobi-Bellman or Isaacs partial differential equations. This paper establishes a link between reachability and invariance problems, on the one hand, and viscosity solutions of a Hamilton-Jacobi partial differential equation, on the other; the equation is developed to address optimal control problems in which the cost is the minimum of a function of the state over a given horizon. The form of the resulting partial differential equation (a continuous Hamiltonian and simple boundary conditions) makes the approach especially attractive for numerical computation.
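Illustrative_Example :
A minimal sketch, not taken from the paper, of how the "minimum of a function of the state over a horizon" cost connects to reachability numerically: a semi-Lagrangian dynamic-programming approximation of the value function V(x, t) = inf_u min_{s in [t, T]} l(x(s)), whose zero sublevel set approximates the backward reachable set of the target {x : l(x) <= 0}. The 1-D single-integrator dynamics, target set, grid, and function names below are illustrative assumptions, not the paper's examples.

# Semi-Lagrangian sketch (illustrative assumptions): dynamics x' = u with |u| <= 1,
# target set {|x| <= 0.5}, so l(x) = |x| - 0.5. Backward recursion
#     V_k(x) = min( l(x), min_u V_{k+1}(x + dt * f(x, u)) ),
# a discrete-time analogue of the Hamilton-Jacobi equation with min-over-time cost.

import numpy as np

def reach_value_function(xs, l, f, controls, T, dt):
    """Approximate V(x, 0) = inf_u min_{s in [0, T]} l(x(s)) on the grid xs."""
    V = l(xs).copy()                       # terminal condition V_N = l
    steps = int(round(T / dt))
    for _ in range(steps):
        candidates = []
        for u in controls:
            x_next = xs + dt * f(xs, u)    # explicit Euler step of the dynamics
            # interpolate the value at the propagated points (clamped at grid ends)
            candidates.append(np.interp(x_next, xs, V))
        # "min over the horizon": compare the running cost with the best continuation
        V = np.minimum(l(xs), np.min(candidates, axis=0))
    return V

if __name__ == "__main__":
    xs = np.linspace(-3.0, 3.0, 601)       # state grid
    l = lambda x: np.abs(x) - 0.5          # signed "distance" to the target set
    f = lambda x, u: u                     # single-integrator dynamics
    controls = np.linspace(-1.0, 1.0, 21)  # sampled admissible inputs
    V = reach_value_function(xs, l, f, controls, T=1.0, dt=0.01)
    reach = xs[V <= 0.0]                   # zero sublevel set = backward reachable set
    print(f"approx. backward reachable set: [{reach.min():.2f}, {reach.max():.2f}]")
    # For these dynamics the exact set over horizon T = 1 is [-1.5, 1.5].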
Keywords :
minimum cost optimal control; reachability; dynamic programming; partial differential equations; Hamilton-Jacobi-Bellman partial differential equations; Isaacs partial differential equations; game theory; invariance problems; Biological control systems; Biology computing; Boundary conditions; Control systems; Cost function; Differential equations; Dynamic programming; Optimal control; Partial differential equations; Viscosity
Conference_Title :
Proceedings of the 41st IEEE Conference on Decision and Control, 2002
Print_ISBN :
0-7803-7516-5
DOI :
10.1109/CDC.2002.1184805