Title :
Restless bandits with switching costs: linear programming relaxations, performance bounds and limited lookahead policies
Author :
Le Ny, Jérôme; Feron, Eric
Author_Institution :
Lab. for Inf. & Decision Syst., Massachusetts Inst. of Technol., Cambridge, MA
Abstract :
The multi-armed bandit problem and one of its most interesting extensions, the restless bandits problem, are frequently encountered in various stochastic control problems. We present a linear programming relaxation for the restless bandits problem with discounted rewards, where only one project can be activated at each period but additional costs penalize switching between projects. The relaxation can be computed efficiently and provides a bound on the achievable performance. We describe several heuristic policies; in particular, we show that a policy adapted from the primal-dual heuristic of Bertsimas and Niño-Mora (2000) for the classical restless bandits problem is in fact equivalent to a one-step lookahead policy; thus, the linear programming relaxation provides a means to compute an approximation of the cost-to-go. Moreover, the approximate cost-to-go is decomposable by project, which allows the one-step lookahead policy to take the form of an index policy that can be computed online very efficiently. We present numerical experiments in which we assess the quality of the heuristics using the performance bound.
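To illustrate the index form of the one-step lookahead policy described in the abstract, the following Python sketch shows how, once a project-decomposable approximate cost-to-go is available (in the paper it comes from the linear programming relaxation, which is not reproduced here), the greedy choice reduces to comparing per-project indices. All names (P0, P1, r0, r1, V_tilde, c_switch, beta) are illustrative placeholders, not the paper's notation, and the construction of V_tilde itself is assumed given.

import numpy as np

def lookahead_index_policy(x, last_active, P0, P1, r0, r1, V_tilde, c_switch, beta):
    """One-step lookahead with a project-decomposable value approximation.

    x            : list of current states; x[i] is the state of project i
    last_active  : index of the project engaged at the previous period
    P0[i], P1[i] : passive / active transition matrices of project i
    r0[i], r1[i] : passive / active reward vectors of project i
    V_tilde[i]   : approximate value vector for project i (assumed given,
                   e.g. recovered from the dual of an LP relaxation)
    c_switch     : c_switch[l][k] = cost of switching from project l to k
    beta         : discount factor in (0, 1)
    """
    n = len(x)
    indices = np.empty(n)
    for k in range(n):
        s = x[k]
        # reward advantage of activating project k instead of leaving it passive
        reward_gain = r1[k][s] - r0[k][s]
        # change in project k's discounted approximate continuation value
        value_gain = beta * (P1[k][s] - P0[k][s]) @ V_tilde[k]
        # switching penalty only if k differs from the previously active project
        switch_cost = c_switch[last_active][k] if k != last_active else 0.0
        indices[k] = reward_gain + value_gain - switch_cost
    return int(np.argmax(indices))

Because the passive rewards and passive continuation values of the projects that are not selected are common to every candidate choice, they cancel in the comparison; this is why the lookahead decision depends only on a per-project index and can be evaluated online at low cost.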
Keywords :
Markov processes; linear programming; stochastic systems; Markov decision processes; index policy; linear programming relaxation; lookahead policy; multiarmed bandit problem; performance bounds; primal-dual heuristic; stochastic control; Aerospace engineering; Control systems; Costs; Infinite horizon; Laboratories; Linear programming; Search problems; Stochastic processes; Stochastic systems; Surveillance
Conference_Titel :
American Control Conference, 2006
Conference_Location :
Minneapolis, MN
Print_ISBN :
1-4244-0209-3
Electronic_ISBN :
1-4244-0209-3
DOI :
10.1109/ACC.2006.1656445