DocumentCode :
826153
Title :
Two-person zero-sum Markov games: receding horizon approach
Author :
Chang, Hyeong Soo ; Marcus, Steven I.
Author_Institution :
Dept. of Comput. Sci. & Eng., Sogang Univ., Seoul, South Korea
Volume :
48
Issue :
11
fYear :
2003
Firstpage :
1951
Lastpage :
1961
Abstract :
We consider a receding horizon approach as an approximate solution to two-person zero-sum Markov games with infinite-horizon discounted-cost and average-cost criteria. We first present error bounds relative to the optimal equilibrium value of the game when both players use "correlated" receding horizon policies that are based on exact or approximate solutions of receding finite-horizon subgames. Motivated by Altman's work on the worst-case optimal control of queueing systems, we then analyze error bounds when the minimizer plays the (approximate) receding horizon control and the maximizer plays the worst-case policy. Finally, we discuss some state-space-size-independent methods for approximately computing the value of the subgame used by the approximate receding horizon control, along with heuristic receding horizon policies for the minimizer.
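Sketch (illustrative only, not the authors' algorithm): the receding horizon idea described in the abstract can be illustrated as follows. At each decision epoch the minimizer solves a finite-horizon discounted subgame by backward induction over stage matrix games and commits only to the first-stage (mixed) strategy. All names below (n_states, A_MIN, A_MAX, cost, trans, GAMMA, HORIZON) and the small random game are hypothetical placeholders, assumed purely for the sketch.

```python
# Minimal sketch of a receding horizon policy for a two-person zero-sum Markov
# game with discounted cost.  Hypothetical model data; not the paper's method.
import numpy as np
from scipy.optimize import linprog

n_states, A_MIN, A_MAX = 4, 2, 2            # small random game for illustration
GAMMA, HORIZON = 0.9, 5                     # discount factor, receding horizon length
rng = np.random.default_rng(0)
cost = rng.random((n_states, A_MIN, A_MAX))             # c(s, a, b)
trans = rng.random((n_states, A_MIN, A_MAX, n_states))  # P(s' | s, a, b)
trans /= trans.sum(axis=-1, keepdims=True)

def solve_matrix_game(M):
    """Value and minimizer's mixed strategy of the zero-sum matrix game M
    (minimizer picks rows, maximizer picks columns)."""
    m, n = M.shape
    # LP variables: x (row mixture), v (game value); objective: minimize v
    c = np.r_[np.zeros(m), 1.0]
    A_ub = np.c_[M.T, -np.ones(n)]          # for every column b: (M^T x)_b - v <= 0
    b_ub = np.zeros(n)
    A_eq = np.r_[np.ones(m), 0.0].reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * m + [(None, None)])
    return res.x[-1], res.x[:m]

def finite_horizon_value(H):
    """Backward induction: value of the H-stage discounted subgame from each state."""
    V = np.zeros(n_states)
    for _ in range(H):
        V_next = np.empty(n_states)
        for s in range(n_states):
            # stage game matrix: immediate cost plus discounted continuation value
            M = cost[s] + GAMMA * trans[s] @ V
            V_next[s], _ = solve_matrix_game(M)
        V = V_next
    return V

def receding_horizon_policy(s):
    """Minimizer's receding horizon action distribution at state s: solve the
    HORIZON-stage subgame and play only its first-stage mixed strategy."""
    V = finite_horizon_value(HORIZON - 1)   # value over the HORIZON-1 remaining stages
    M = cost[s] + GAMMA * trans[s] @ V
    _, x = solve_matrix_game(M)
    return x

print(receding_horizon_policy(0))
```

In this sketch the error relative to the infinite-horizon equilibrium value shrinks as HORIZON grows, which is the quantity the paper's bounds control; the paper additionally treats approximate subgame solutions and the average-cost criterion, which the sketch omits.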
Keywords :
Markov processes; infinite horizon; optimal control; optimisation; state-space methods; stochastic games; approximate solutions; average cost criteria; correlated receding horizon policies; error bounds; exact solutions; heuristic receding horizon policies; hindsight optimization; infinite horizon discounted cost; maximizer; minimizer; optimal equilibrium value; queueing systems; receding finite horizon subgames; receding horizon approach; state-space size independent methods; two-person zero-sum Markov games; worst-case optimal control; Communication system control; Control systems; Cost function; Error analysis; Error correction; Infinite horizon; Optimal control; Process planning; Queueing analysis; Size control;
fLanguage :
English
Journal_Title :
IEEE Transactions on Automatic Control
Publisher :
IEEE
ISSN :
0018-9286
Type :
jour
DOI :
10.1109/TAC.2003.819077
Filename :
1245183