Title :
Control of systems with jump Markov disturbances
Author_Institution :
University of Kentucky, Lexington, Kentucky, USA
Date :
4/1/1975
Abstract :
Control of stochastic differential equations of the form $\dot{x} = f^{r(t)}(t,x,u)$, in which $r(t)$ is a finite-state Markov process, is discussed. Dynamic programming optimality conditions are shown to be necessary and sufficient for optimality. A stochastic minimum principle whose adjoints satisfy deterministic integral equations is defined and shown to be necessary and sufficient for optimality.
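As a rough illustration of the dynamic programming conditions the abstract refers to, the sketch below writes the coupled Hamilton-Jacobi-Bellman equations one typically obtains for a system whose dynamics switch with a finite-state Markov process $r(t)$. The running cost $L^{i}$, terminal cost $\phi$, transition rates $q_{ij}$, and per-mode value functions $V^{i}$ are notational assumptions for this sketch, not taken from the paper.

% Coupled HJB sketch for \dot{x} = f^{r(t)}(t,x,u), with r(t) a
% finite-state Markov process with transition rates q_{ij} (assumed
% notation). One value function V^i(t,x) is associated with each mode i.
\begin{align*}
0 &= \min_{u}\Big[\, L^{i}(t,x,u)
      + \frac{\partial V^{i}}{\partial t}(t,x)
      + \frac{\partial V^{i}}{\partial x}(t,x)\, f^{i}(t,x,u)
      + \sum_{j \neq i} q_{ij}\,\big( V^{j}(t,x) - V^{i}(t,x) \big) \Big], \\
V^{i}(T,x) &= \phi(x) \qquad \text{for each mode } i,
\end{align*}

where the summation term accounts for the expected change in value caused by a jump of $r(t)$ from mode $i$ to mode $j$. This is a generic form for jump Markov dynamics under the stated assumptions, not a reproduction of the paper's specific conditions.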
Keywords :
Jump processes; Markov processes; Nonlinear systems, stochastic continuous-time; Optimal stochastic control; Stochastic optimal control; Control systems; Differential equations; Dynamic programming; Feedback control; Integral equations; Mathematics; Optimal control; Stochastic processes; Stochastic systems;
Journal_Title :
Automatic Control, IEEE Transactions on
DOI :
10.1109/TAC.1975.1100943