DocumentCode :
1089713
Title :
Flow control using the theory of zero sum Markov games
Author :
Altman, Eitan
Author_Institution :
Centre Sophia Antipolis, INRIA, Valbonne, France
Volume :
39
Issue :
4
fYear :
1994
fDate :
4/1/1994
Firstpage :
814
Lastpage :
818
Abstract :
The author considers the problem of dynamic flow control of arriving packets into an infinite buffer. The service rate may depend on the state of the system, may change in time, and is unknown to the controller. The goal of the controller is to design an efficient policy that guarantees the best performance under the worst service conditions. The cost is composed of a holding cost, a cost for rejecting customers (packets), and a cost that depends on the quality of the service. The problem is studied in the framework of zero-sum Markov games, and a value iteration algorithm is used to solve it. It is shown that there exists an optimal stationary policy (in which the decisions depend only on the current number of customers in the queue); it is of a threshold type and uses randomization in at most one state.
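The abstract describes value iteration for a zero-sum Markov game whose solution is a threshold-type, possibly randomized, admission policy. The sketch below is a minimal illustration of that idea, not the paper's algorithm: it truncates the infinite buffer at N, assumes a simple discrete-time queue with one arrival per slot and hypothetical holding, rejection, and service-quality costs, and approximates each state's matrix-game value by a grid search over the controller's accept probability.

```python
# Illustrative sketch only: discounted value iteration for a zero-sum Markov
# game between a flow controller (minimizer, accepts or rejects arrivals) and
# "nature" choosing the service rate (maximizer). The buffer is truncated at
# N for computation; all costs, rates, and the discount factor are assumed,
# not taken from the paper. A packet arrives in every slot in this toy model.

import numpy as np

N = 50                                 # buffer truncation (paper: infinite buffer)
beta = 0.9                             # discount factor (assumed)
h, r, s = 1.0, 5.0, 2.0                # holding, rejection, service-quality costs (assumed)
service_rates = [0.3, 0.8]             # nature's choices of service probability (assumed)
p_grid = np.linspace(0.0, 1.0, 101)    # controller's randomization over {reject, accept}

def stage_matrix(x, V):
    """Matrix game at state x: rows = controller actions (0=reject, 1=accept),
    columns = nature's service rate; entries are one-step cost-to-go estimates."""
    A = np.empty((2, len(service_rates)))
    for a in (0, 1):
        y = min(N, x + a)                         # queue after the admission decision
        for j, mu in enumerate(service_rates):
            cost = h * x + r * (1 - a) - s * mu   # holding + rejection - service quality
            ev = mu * V[max(0, y - 1)] + (1 - mu) * V[y]
            A[a, j] = cost + beta * ev
    return A

def game_value(A):
    """Value of the 2-row zero-sum matrix game (controller minimizes),
    approximated by a grid search over the controller's mixing probability."""
    mixed = np.outer(1 - p_grid, A[0]) + np.outer(p_grid, A[1])  # rows: p, cols: nature
    worst = mixed.max(axis=1)          # nature plays a best (maximizing) response
    k = worst.argmin()                 # controller picks the best mixture
    return worst[k], p_grid[k]

V = np.zeros(N + 1)
for _ in range(500):                   # value iteration until (approximately) converged
    V_new = np.array([game_value(stage_matrix(x, V))[0] for x in range(N + 1)])
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = [game_value(stage_matrix(x, V))[1] for x in range(N + 1)]
print("P(accept) by queue length:", np.round(policy, 2))
```

Under such illustrative parameters the accept probability typically comes out as 1 below some queue length and 0 above it, with at most one intermediate value, mirroring the threshold-with-randomization structure stated in the abstract.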
Keywords :
Markov processes; game theory; dynamic flow control; holding cost; infinite buffer; optimal stationary policy; randomization; threshold type; value iteration algorithm; worst service conditions; zero sum Markov games; Asymptotic stability; Automatic control; Control systems; Costs; Delay; Game theory; Optimal control; Polynomials; Stability criteria; Telecommunication control;
fLanguage :
English
Journal_Title :
IEEE Transactions on Automatic Control
Publisher :
IEEE
ISSN :
0018-9286
Type :
jour
DOI :
10.1109/9.286259
Filename :
286259