DocumentCode :
3376439
Title :
Learning discrete Bayesian models for autonomous agent navigation
Author :
Nikovski, Daniel ; Nourbakhsh, Illah
Author_Institution :
Robotics Inst., Carnegie Mellon Univ., Pittsburgh, PA, USA
fYear :
1999
fDate :
1999
Firstpage :
137
Lastpage :
143
Abstract :
Partially observable Markov decision processes (POMDPs) are a convenient representation for reasoning and planning in mobile robot applications. We investigate two algorithms for learning POMDPs from series of observation/action pairs by comparing their performance in fourteen synthetic worlds in conjunction with four planning algorithms. Experimental results suggest that the traditional Baum-Welch algorithm better learns the structure of worlds specifically designed to impede the agent, while a best-first model merging algorithm originally due to Stolcke and Omohundro (1993) performs better in more benign worlds, including a model of typical real-world robot fetching tasks.
Keywords :
Bayes methods; Markov processes; decision theory; learning (artificial intelligence); mobile robots; navigation; path planning; Baum-Welch algorithm; Markov decision processes; autonomous agent; best-first model; discrete Bayesian models; mobile robots; navigation; path planning; Algorithm design and analysis; Autonomous agents; Bayesian methods; Impedance; Merging; Mobile robots; Navigation; Probability distribution; Robot sensing systems; Strips;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA '99)
Conference_Location :
Monterey, CA, USA
Print_ISBN :
0-7803-5806-6
Type :
conf
DOI :
10.1109/CIRA.1999.810011
Filename :
810011