Title of article :
Planning in Markov Stochastic Task Domains
Author/Authors :
Yong (Yates) Lin, Fillia Makedon
Issue Information :
Journal issue, serial year 2010
Pages :
11
From page :
54
To page :
64
Abstract :
In decision-theoretic planning, a challenge for Markov decision processes (MDPs) and partially observable Markov decision processes (POMDPs) is that many problem domains contain large state spaces and complex tasks, which results in poor solution performance. We develop a task analysis and modeling (TAM) approach, in which the (PO)MDP model is separated into a task view and an action view. In the task view, TAM models the problem domain using a task equivalence model, with task-dependent abstract states and observations. We provide a learning algorithm to obtain the parameter values of task equivalence models. We present three typical examples to explain the TAM approach. Experimental results indicate our approach can greatly improve the computational capacity of task planning in Markov stochastic domains.
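The scaling problem the abstract describes can be seen in standard MDP solution methods: each sweep of value iteration touches every state-action-successor triple, so cost grows quadratically in the number of states. Below is a minimal sketch of value iteration on an invented 3-state, 2-action toy MDP; the transition and reward numbers are illustrative assumptions, not taken from the paper's TAM models.

```python
# Toy MDP: P[a][s][t] is the probability of moving from state s to state t
# under action a; R[a][s] is the expected immediate reward for taking
# action a in state s. All values here are made up for illustration.
P = [
    [[0.9, 0.1, 0.0], [0.0, 0.8, 0.2], [0.0, 0.0, 1.0]],  # action 0
    [[0.1, 0.9, 0.0], [0.0, 0.1, 0.9], [0.0, 0.0, 1.0]],  # action 1
]
R = [
    [0.0, 0.0, 0.0],  # action 0 never pays
    [0.0, 1.0, 0.0],  # action 1 pays 1 from state 1
]
GAMMA = 0.95  # discount factor

def value_iteration(P, R, gamma, tol=1e-8):
    """Iterate the Bellman optimality backup until values stop changing."""
    n = len(P[0])
    V = [0.0] * n
    while True:
        V_new = [
            max(
                R[a][s] + gamma * sum(P[a][s][t] * V[t] for t in range(n))
                for a in range(len(P))
            )
            for s in range(n)
        ]
        if max(abs(V_new[s] - V[s]) for s in range(n)) < tol:
            return V_new
        V = V_new

V = value_iteration(P, R, GAMMA)
```

Each sweep costs O(|A|·|S|²) here, and the POMDP case is far worse because planning happens over a continuous belief space; this is the computational burden that motivates replacing the flat state space with a smaller, task-dependent abstraction.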
Keywords :
Uncertainty, decision-making, task planning, POMDP, Markov decision processes
Journal title :
International Journal of Artificial Intelligence and Expert Systems
Serial Year :
2010
Record number :
668742