Title :
Planning and Acting in Uncertain Environments using Probabilistic Inference
Author :
Verma, Deepak ; Rao, Rajesh P N
Author_Institution :
Dept. of Computer Science & Engineering, University of Washington, Seattle, WA
Abstract :
An important problem in robotics is planning and selecting actions for goal-directed behavior in noisy, uncertain environments. The problem is typically addressed within the framework of partially observable Markov decision processes (POMDPs). Although efficient algorithms exist for learning policies for MDPs, these algorithms do not generalize easily to POMDPs. In this paper, we propose a framework for planning and action selection based on probabilistic inference in graphical models. Unlike previous approaches based on MAP inference, our approach utilizes the most probable explanation (MPE) of variables in a graphical model, allowing tractable and efficient inference of actions. It generalizes easily to complex partially observable environments. Furthermore, it allows rewards and costs to be incorporated in a straightforward manner as part of the inference process. We investigate the application of our approach to the problem of robot navigation by testing it on a suite of well-known POMDP benchmarks. Our results demonstrate that the proposed method can match or exceed the performance of recently proposed specialized POMDP solvers.
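The planning-as-inference idea summarized above can be illustrated with a minimal, hypothetical sketch: clamp the goal state in a simple probabilistic transition model and pick the action sequence whose most probable trajectory reaches the goal with the highest joint probability. The toy 1-D grid, the slip probability, and the brute-force enumeration below are illustrative assumptions for exposition only, not the paper's actual model or algorithm (which uses MPE inference in a graphical model on real POMDP benchmarks).

```python
import itertools

# Hypothetical toy world (not the paper's benchmarks): states 0..4 on a line,
# start at 0, goal at 4, noisy "left"/"right" actions that slip with prob. SLIP.
STATES = range(5)
ACTIONS = ("left", "right")
START, GOAL, HORIZON = 0, 4, 4
SLIP = 0.1  # probability that the intended move fails and the agent stays put

def trans_prob(s, a, s2):
    """P(s' | s, a): the intended move succeeds with prob. 1 - SLIP."""
    intended = min(4, s + 1) if a == "right" else max(0, s - 1)
    if s2 == intended:
        return 1.0 if intended == s else 1.0 - SLIP
    if s2 == s:
        return SLIP
    return 0.0

def mpe_plan(start, goal, horizon):
    """Brute-force planning-as-inference: for each action sequence, compute the
    probability of its most probable state trajectory ending at the (clamped)
    goal, and return the sequence maximizing that probability."""
    best_seq, best_p = None, 0.0
    for seq in itertools.product(ACTIONS, repeat=horizon):
        # Viterbi-style max-probability propagation over trajectories
        probs = {start: 1.0}
        for a in seq:
            new = {}
            for s, p in probs.items():
                for s2 in STATES:
                    tp = trans_prob(s, a, s2)
                    if tp > 0.0:
                        new[s2] = max(new.get(s2, 0.0), p * tp)
            probs = new
        p_goal = probs.get(goal, 0.0)
        if p_goal > best_p:
            best_seq, best_p = seq, p_goal
    return best_seq, best_p

plan, p = mpe_plan(START, GOAL, HORIZON)
# With a horizon of 4, only four consecutive "right" moves can reach the goal,
# so the MPE plan is ("right", "right", "right", "right") with p = 0.9 ** 4.
```

Brute-force enumeration is exponential in the horizon; the point of inference-based formulations like the one in this paper is that standard graphical-model machinery computes such maximizing assignments without explicit enumeration.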
Keywords :
Markov processes; mobile robots; path planning; Markov decision processes; laser range finder; most probable explanation; probabilistic inference; robot navigation; uncertain environments; Benchmark testing; Costs; Graphical models; Inference algorithms; Intelligent robots; Navigation; Postal services; Stochastic processes; Upper bound; Working environment noise
Conference_Title :
2006 IEEE/RSJ International Conference on Intelligent Robots and Systems
Conference_Location :
Beijing, China
Print_ISBN :
1-4244-0258-1
Electronic_ISBN :
1-4244-0259-X
DOI :
10.1109/IROS.2006.281675