DocumentCode :
1757785
Title :
The Principle of Maximum Causal Entropy for Estimating Interacting Processes
Author :
Ziebart, Brian D. ; Bagnell, J. Andrew ; Dey, Anind K.
Author_Institution :
Dept. of Comput. Sci., Univ. of Illinois at Chicago, Chicago, IL, USA
Volume :
59
Issue :
4
fYear :
2013
fDate :
1 April 2013
Firstpage :
1966
Lastpage :
1980
Abstract :
The principle of maximum entropy provides a powerful framework for estimating joint, conditional, and marginal probability distributions. However, there are many important distributions with elements of interaction and feedback where its applicability has not been established. This paper presents the principle of maximum causal entropy, an approach based on directed information theory for estimating an unknown process based on its interactions with a known process. We demonstrate the breadth of the approach using two applications: a predictive solution for inverse optimal control in decision processes and computing equilibrium strategies in sequential games.
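For context, the central quantity named in the title can be sketched as follows; the notation (a known side process S_{1:T}, an estimated process A_{1:T}, and a feature function f used for moment matching) is illustrative and may not match the paper's exact symbols. The causally conditioned entropy of directed information theory conditions each A_t only on past and present S and strictly past A, and the principle selects the conditional distributions maximizing it subject to (assumed here) feature-matching constraints:

% Causally conditioned entropy (illustrative notation, a sketch rather than the paper's exact statement)
H(A^T \,\|\, S^T) \;=\; \sum_{t=1}^{T} H\big(A_t \mid S_{1:t},\, A_{1:t-1}\big)

% Maximum causal entropy estimation under hypothetical feature-matching constraints
\max_{\{P(A_t \mid S_{1:t},\, A_{1:t-1})\}} \; H(A^T \,\|\, S^T)
\quad \text{subject to} \quad
\mathbb{E}\Big[\textstyle\sum_t f(S_t, A_t)\Big] \;=\; \tilde{\mathbb{E}}\Big[\textstyle\sum_t f(S_t, A_t)\Big],

where \tilde{\mathbb{E}} denotes an empirical expectation over observed interactions. In the inverse optimal control application mentioned in the abstract, the resulting policy takes a softmax (log-sum-exp) form over a recursively defined soft value function.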
Keywords :
game theory; maximum entropy methods; optimal control; statistical distributions; computing equilibrium strategy; conditional probability distribution; decision process; directed information theory; interacting process; inverse optimal control; joint probability distribution; marginal probability distribution; maximum causal entropy; predictive solution; process estimation; sequential game; Entropy; Estimation; Joints; Optimization; Probability distribution; Process control; Random variables; Causal entropy; correlated equilibrium (CE); directed information; inverse optimal control; inverse reinforcement learning; maximum entropy; statistical estimation;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2012.2234824
Filename :
6479340