• DocumentCode
    2391933
  • Title
    Multi-UAV dynamic routing with partial observations using restless bandit allocation indices

  • Author
    Le Ny, Jerome; Dahleh, Munther; Feron, Eric

  • Author_Institution
    Laboratory for Information and Decision Systems, Massachusetts Institute of Technology, Cambridge, MA
  • fYear
    2008
  • fDate
    11-13 June 2008
  • Firstpage
    4220
  • Lastpage
    4225
  • Abstract
    Motivated by the type of missions currently performed by unmanned aerial vehicles, we investigate a discrete dynamic vehicle routing problem with a potentially large number of targets and vehicles. Each target is modeled as an independent two-state Markov chain whose state is observed only when some vehicle visits it. The goal for the vehicles is to collect the rewards obtained by visiting targets in a particular state. This problem can be seen as a type of restless bandit problem with partial information. We compute an upper bound on the achievable performance and obtain, in closed form, the index policy proposed by Whittle. Simulation results provide evidence for the outstanding performance of this index heuristic and for the quality of the upper bound. (An illustrative simulation sketch of the target model appears after the record below.)
  • Keywords
    Markov processes; military aircraft; remotely operated vehicles; Markov chain; discrete dynamic vehicle routing problem; partial observation; restless bandit allocation index; unmanned aerial vehicle; Automation; Computational modeling; Costs; Military computing; Monitoring; Routing; Stochastic processes; Unmanned aerial vehicles; Upper bound; Vehicle dynamics;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    American Control Conference, 2008
  • Conference_Location
    Seattle, WA
  • ISSN
    0743-1619
  • Print_ISBN
    978-1-4244-2078-0
  • Electronic_ISBN
    0743-1619
  • Type
    conf
  • DOI
    10.1109/ACC.2008.4587156
  • Filename
    4587156
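
The abstract models each target as an independent two-state Markov chain that is observed only when a vehicle visits it, with vehicles allocated by a Whittle-type index. The Python sketch below is a minimal simulation of that kind of model; it uses a simple belief-greedy allocation rule as a stand-in for the paper's closed-form Whittle index, and the transition probabilities, fleet size, and horizon are illustrative assumptions rather than values from the paper.

    import numpy as np


    def simulate(num_targets=20, num_vehicles=3, horizon=200,
                 p_stay_rewarding=0.8, p_become_rewarding=0.3, seed=0):
        """Toy rollout of a partially observed two-state target model.

        Each target is an independent two-state Markov chain (0 = no reward,
        1 = reward available). A vehicle that visits a target observes its
        state and collects a unit reward if that state is 1; unvisited
        targets are not observed, so the controller only carries a belief
        (probability of being in state 1) propagated through the chain.
        The allocation rule is belief-greedy, used purely for illustration
        in place of the paper's closed-form Whittle index.
        """
        rng = np.random.default_rng(seed)
        state = rng.integers(0, 2, size=num_targets)   # hidden true states
        belief = np.full(num_targets, 0.5)             # controller's beliefs
        total_reward = 0.0

        for _ in range(horizon):
            # Assign the vehicles to the targets with the highest beliefs.
            visited = np.argsort(belief)[-num_vehicles:]

            for i in visited:
                if state[i] == 1:
                    total_reward += 1.0
                belief[i] = float(state[i])            # a visit reveals the state

            # All targets evolve; every belief is pushed through the same
            # transition probabilities for the next decision epoch.
            u = rng.random(num_targets)
            state = np.where(state == 1,
                             (u < p_stay_rewarding).astype(int),
                             (u < p_become_rewarding).astype(int))
            belief = belief * p_stay_rewarding + (1.0 - belief) * p_become_rewarding

        return total_reward


    if __name__ == "__main__":
        print("reward collected by the belief-greedy policy:", simulate())

Only the selection of `visited` would change if the belief-greedy rule were replaced by an index computed from the paper's closed-form Whittle expression; the belief propagation and reward accounting stay the same.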