DocumentCode :
1895676
Title :
Approximate planning with hierarchical partially observable Markov decision process models for robot navigation
Author :
Theocharous, Georgios ; Mahadevan, Sridhar
Author_Institution :
Dept. of Comput. Sci. & Eng., Michigan State Univ., East Lansing, MI, USA
Volume :
2
fYear :
2002
fDate :
2002
Firstpage :
1347
Lastpage :
1352
Abstract :
We propose and investigate a planning framework based on the hierarchical partially observable Markov decision process model (HPOMDP) and apply it to robot navigation. We show how this framework produces more robust plans than flat models such as partially observable Markov decision processes (POMDPs). In our approach the environment is modeled at different levels of resolution, where abstract states represent both spatial and temporal abstraction. We test our hierarchical POMDP approach in large simulated and real navigation environments. The results show that the robot navigates to goals more successfully when starting with no positional knowledge (a uniform initial belief state distribution) under the hierarchical POMDP framework than under the flat POMDP approach.
Keywords :
hidden Markov models; mobile robots; navigation; path planning; probability; Markov decision process models; abstract states; approximate planning; hierarchical hidden Markov models; mobile robots; navigation; probability; spatial abstraction; temporal abstraction; Animals; Computer science; Humans; Mobile robots; Navigation; Process planning; Robot sensing systems; Robustness; Spatial resolution; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Robotics and Automation, 2002. Proceedings. ICRA '02. IEEE International Conference on
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7272-7
Type :
conf
DOI :
10.1109/ROBOT.2002.1014730
Filename :
1014730
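The abstract's starting point of "no positional knowledge" corresponds to a uniform initial belief over states, refined by a Bayes filter as the robot acts and observes. The following is a minimal sketch of a flat POMDP belief update on a toy corridor world; the state space, transition model, and observation model are illustrative assumptions, not taken from the paper, and the paper's HPOMDP extension (abstract states at multiple resolutions) is not shown.

```python
import numpy as np

# Toy corridor with 4 positions; the robot observes "wall" (0) or "open" (1).
n_states = 4
n_obs = 2

# Transition model for a single action ("move right"): mostly succeed,
# sometimes stay in place; the last cell is absorbing.
T = np.zeros((n_states, n_states))
for s in range(n_states):
    s_next = min(s + 1, n_states - 1)
    T[s, s_next] += 0.8
    T[s, s] += 0.2

# Observation model O[s, o]: only the corridor's end usually looks like a wall.
O = np.full((n_states, n_obs), 0.5)
O[3] = [0.9, 0.1]

def belief_update(b, T, O, obs):
    """Standard Bayes filter step: predict with T, correct with O, renormalize."""
    predicted = b @ T                 # b'(s') = sum_s b(s) T(s, s')
    corrected = predicted * O[:, obs]
    return corrected / corrected.sum()

b = np.full(n_states, 1.0 / n_states)   # uniform initial belief: no positional knowledge
for obs in [1, 1, 0]:                   # observe open, open, then wall
    b = belief_update(b, T, O, obs)
print(b.argmax())                       # → 3: belief concentrates at the corridor's end
```

In a flat POMDP the belief is maintained over every primitive state; the paper's hierarchical model instead tracks belief over abstract states as well, which is what makes planning from a uniform belief tractable in large environments.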