Title :
MDP-based mission planning for multi-UAV persistent surveillance
Author :
Byeong-Min Jeong ; Jung-Su Ha ; Han-Lim Choi
Author_Institution :
Korea Aerospace Industries, Sacheon, South Korea
Abstract :
This paper presents a methodology to generate task flows for conducting a surveillance mission with multiple UAVs, when the goal is to persistently keep the uncertainty level of the surveillance regions as low as possible. The mission planning problem is formulated as a Markov decision process (MDP), an infinite-horizon discrete stochastic optimal control formulation that often leads to periodic task flows suitable for persistent implementation. The method specifically focuses on reducing the size of the decision space without losing key features of the problem, in order to mitigate the curse of dimensionality of the MDP; integrating a task allocator to identify admissible actions is demonstrated to effectively reduce the decision space. Numerical simulations verify the applicability of the proposed decision scheme.
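The abstract describes solving an infinite-horizon MDP over a reduced decision space, where a task allocator supplies only the admissible joint actions at each state. The following sketch is only an illustration of that idea, not the authors' implementation: a standard discounted value iteration in which the action loop is restricted to an assumed admissible(s) interface (the task-allocator pruning), with user-supplied transition and cost models.

    # Illustrative sketch (not the paper's code): value iteration over an
    # admissible-action set supplied by a task allocator.
    def value_iteration(states, admissible, transition, cost,
                        gamma=0.95, tol=1e-6, max_iter=1000):
        """states: iterable of hashable states.
        admissible(s) -> subset of joint UAV task assignments (assumed interface).
        transition(s, a) -> list of (next_state, probability) pairs.
        cost(s, a) -> immediate cost, e.g. aggregate region uncertainty."""
        V = {s: 0.0 for s in states}
        for _ in range(max_iter):
            delta = 0.0
            for s in states:
                # Evaluate only task-allocator-admissible actions,
                # shrinking the decision space at each state.
                q = [cost(s, a) + gamma * sum(p * V[s2]
                     for s2, p in transition(s, a))
                     for a in admissible(s)]
                v_new = min(q)  # minimize expected discounted cost
                delta = max(delta, abs(v_new - V[s]))
                V[s] = v_new
            if delta < tol:
                break
        return V

Because the value function is stationary in the infinite-horizon setting, the greedy policy extracted from V tends to cycle through a fixed set of task assignments, which is consistent with the periodic task flows noted in the abstract.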
Keywords :
Markov processes; autonomous aerial vehicles; optimal control; path planning; surveillance; MDP-based mission planning; Markov decision process; infinite-horizon discrete stochastic optimal control formulation; multi-UAV persistent surveillance; numerical simulation; surveillance regions; Autonomous Multi-UAV Systems; Mission planning
Conference_Titel :
Control, Automation and Systems (ICCAS), 2014 14th International Conference on
Conference_Location :
Seoul
Print_ISBN :
978-8-9932-1506-9
DOI :
10.1109/ICCAS.2014.6987894