DocumentCode :
424655
Title :
The LQ control problem for Markovian jumps linear systems with horizon defined by stopping times
Author :
Nespoli, Cristiane ; Val, João B R do ; Cáceres, Yusef
Author_Institution :
Fac. de Ciencias e Tecnologia, Univ. Estadual Paulista, Prudente, Brazil
Volume :
1
fYear :
2004
fDate :
June 30, 2004 - July 2, 2004
Firstpage :
703
Abstract :
This paper deals with a stochastic optimal control problem involving discrete-time jump Markov linear systems. The jumps, or changes between the system's operation modes, evolve according to an underlying Markov chain. In the model studied, the problem horizon is defined by a stopping time τ that represents either the occurrence of a fixed number N of failures or repairs (T_N) or the occurrence of a crucial failure event (τ_Δ), after which the system is brought to a halt for maintenance. In addition, an intermediate mixed case, in which τ is the minimum of T_N and τ_Δ, is also considered. These stopping times coincide with some of the jump times of the Markov state, and the information available allows the control action to be reconfigured at each jump time in the form of a linear feedback gain. The solution of the linear quadratic problem with complete Markov state observation is presented. The solution is given in terms of recursions on a set of algebraic Riccati equations (ARE) or on a coupled set of algebraic Riccati equations (CARE).
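The abstract mentions solutions given by recursions on coupled algebraic Riccati equations (CARE). A minimal sketch of the standard coupled Riccati iteration for a discrete-time Markov jump linear system is shown below; the per-mode matrices, the transition probabilities, and the function name `coupled_riccati` are illustrative assumptions, not taken from the paper, and the fixed iteration count stands in for a proper convergence test.

```python
# Hypothetical sketch of the coupled Riccati recursion for a discrete-time
# Markov jump linear system (MJLS). All numbers below are made up for
# illustration; the paper's actual recursions may differ in detail.
import numpy as np

def coupled_riccati(A, B, Q, R, P, iters=500):
    """Iterate the coupled Riccati recursion for an MJLS with m modes.

    A, B, Q, R : lists of per-mode system and cost matrices.
    P          : m x m Markov transition matrix, P[i, j] = Prob(i -> j).
    Returns the per-mode solutions X_i and feedback gains K_i.
    """
    m = len(A)
    n = A[0].shape[0]
    X = [np.zeros((n, n)) for _ in range(m)]
    for _ in range(iters):
        # Coupling operator: E_i(X) = sum_j P[i, j] * X_j
        E = [sum(P[i, j] * X[j] for j in range(m)) for i in range(m)]
        # One step of the coupled Riccati recursion for each mode i
        X = [Q[i] + A[i].T @ E[i] @ A[i]
             - A[i].T @ E[i] @ B[i] @ np.linalg.solve(
                 R[i] + B[i].T @ E[i] @ B[i], B[i].T @ E[i] @ A[i])
             for i in range(m)]
    # Feedback gains u_k = -K_i x_k, computed from the converged solutions
    E = [sum(P[i, j] * X[j] for j in range(m)) for i in range(m)]
    K = [np.linalg.solve(R[i] + B[i].T @ E[i] @ B[i], B[i].T @ E[i] @ A[i])
         for i in range(m)]
    return X, K

# Illustrative two-mode scalar example (numbers are invented):
A = [np.array([[1.1]]), np.array([[0.9]])]
B = [np.array([[1.0]]), np.array([[1.0]])]
Q = [np.array([[1.0]]), np.array([[1.0]])]
R = [np.array([[1.0]]), np.array([[1.0]])]
P = np.array([[0.7, 0.3], [0.4, 0.6]])
X, K = coupled_riccati(A, B, Q, R, P)
```

The coupling term E_i(X) averages the next-step cost-to-go matrices over the modes reachable from mode i, which is what distinguishes the CARE from m independent single-mode Riccati equations.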
Keywords :
Markov processes; Riccati equations; discrete time systems; linear quadratic control; stochastic systems; LQ control problem; Markov chain; algebraic Riccati equations; discrete-time jump Markov linear system; linear feedback gain; stochastic optimal control;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
American Control Conference, 2004. Proceedings of the 2004
Conference_Location :
Boston, MA, USA
ISSN :
0743-1619
Print_ISBN :
0-7803-8335-4
Type :
conf
Filename :
1383686