DocumentCode :
2857424
Title :
Sequential normalized maximum likelihood in log-loss prediction
Author :
Kotlowski, Wojciech; Grunwald, P.
Author_Institution :
Inst. of Comput. Sci., Poznan Univ. of Technol., Poznań, Poland
fYear :
2012
fDate :
3-7 Sept. 2012
Firstpage :
547
Lastpage :
551
Abstract :
The paper considers sequential prediction of individual sequences with log loss using an exponential family of distributions. We first show that the commonly used maximum likelihood strategy is suboptimal and requires an additional assumption about boundedness of the data sequence. We then show that both problems can be addressed by adding the currently predicted outcome to the calculation of the maximum likelihood, followed by normalization of the distribution. The strategy obtained in this way is known in the literature as the sequential normalized maximum likelihood (SNML) strategy. We show that for general exponential families, the regret is bounded by the familiar (k/2) log n and is thus optimal up to O(1). We also introduce an approximation to SNML, flattened maximum likelihood, which is much easier to compute than SNML itself while retaining the optimal regret under some additional assumptions. We finally discuss the relationship to the Bayes strategy with Jeffreys' prior.
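The construction described in the abstract (append each candidate outcome to the observed data, maximize the likelihood over the extended sequence, then normalize over candidates) can be illustrated with a minimal sketch. The snippet below is not taken from the paper; it assumes the Bernoulli family as a concrete exponential family, purely to show the mechanics of SNML prediction under log loss.

```python
import numpy as np

def bernoulli_ml(seq):
    """Maximized Bernoulli likelihood of a binary sequence (with 0**0 := 1)."""
    n, k = len(seq), sum(seq)
    p = k / n
    return (p ** k) * ((1 - p) ** (n - k))

def snml_predict(past, outcomes=(0, 1)):
    """SNML probabilities for the next outcome: include each candidate
    outcome in the maximum-likelihood computation, then normalize."""
    scores = np.array([bernoulli_ml(list(past) + [x]) for x in outcomes])
    return scores / scores.sum()

# Example: predict the next bit after observing 1, 0, 1; the log loss
# incurred is -log of the probability assigned to the realized outcome.
probs = snml_predict([1, 0, 1])
print(probs)            # SNML distribution over {0, 1}
print(-np.log(probs[1]))  # log loss if the next outcome is 1
```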
Keywords :
Bayes methods; exponential distribution; maximum likelihood sequence estimation; Bayes strategy; Jeffreys prior; SNML; data sequence; exponential distributions family; flattened maximum likelihood; log-loss prediction; sequential normalized maximum likelihood; Approximation methods; Conferences; Games; Information theory; Maximum likelihood estimation; Predictive models;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Information Theory Workshop (ITW), 2012 IEEE
Conference_Location :
Lausanne
Print_ISBN :
978-1-4673-0224-1
Electronic_ISBN :
978-1-4673-0222-7
Type :
conf
DOI :
10.1109/ITW.2012.6404734
Filename :
6404734