DocumentCode :
2480266
Title :
MDL induction, Bayesianism, and Kolmogorov complexity
Author :
Vitanyi, Paul ; Li, Ming
Author_Institution :
CWI, Amsterdam, Netherlands
fYear :
1998
fDate :
16-21 Aug 1998
Firstpage :
346
Abstract :
The relationship between the Bayesian approach and the minimum description length approach is established. We sharpen and clarify the general modeling principles MDL and MML, abstracted as the ideal MDL principle and defined from Bayes's rule by means of Kolmogorov complexity. The basic condition under which the ideal principle should be applied is encapsulated as the fundamental inequality, which in broad terms states that the principle is valid when the data are random relative to every contemplated hypothesis, and these hypotheses are in turn random relative to the (universal) prior. Basically, the prior probability associated with the hypothesis should be given by the algorithmic universal probability, and the sum of the negative log universal probability of the model and the negative log probability of the data given the model should be minimized. This shows that data compression is almost always the best strategy, both in hypothesis identification and in prediction.
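The minimization described in the abstract can be illustrated with a small, computable sketch. Kolmogorov complexity and the universal prior are uncomputable, so the example below substitutes a hypothetical prior that charges k bits for a Bernoulli parameter specified to k bits of precision, and then minimizes the two-part code length L(H) + L(D|H); the model class, prior, and data are illustrative assumptions, not taken from the paper.

```python
import math

def neg_log2_likelihood(data, p):
    # L(D|H): code length of binary data under Bernoulli(p), in bits
    if p <= 0.0 or p >= 1.0:
        return float("inf")
    ones = sum(data)
    zeros = len(data) - ones
    return -(ones * math.log2(p) + zeros * math.log2(1.0 - p))

def mdl_select(data, max_precision=8):
    # Two-part MDL: minimize L(H) + L(D|H) over hypotheses H.
    # As a computable stand-in for the universal prior, a parameter
    # p = m / 2**k costs L(H) = k bits (an illustrative assumption).
    best = None
    for k in range(1, max_precision + 1):
        for m in range(1, 2 ** k):
            p = m / 2 ** k
            total = k + neg_log2_likelihood(data, p)
            if best is None or total < best[0]:
                best = (total, p, k)
    return best  # (total code length in bits, parameter, precision)

data = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]  # 7 ones in 10 trials
total, p, k = mdl_select(data)
```

With this data the coarse hypothesis p = 3/4 at 2 bits of precision beats both the maximally coarse p = 1/2 and any finer-grained parameter: the extra bits spent describing a more precise model are not repaid by a shorter description of the data. This trade-off is exactly the compression argument the abstract makes.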
Keywords :
Bayes methods; computational complexity; data compression; probability; Bayesian approach; Bayesianism; Kolmogorov complexity; MDL induction; MML; algorithmic universal probability; fundamental inequality; hypothesis identification; log universal probability; minimum description length approach; prediction; prior probability; Automatic testing; Bayesian methods; Computational modeling; Computer science; Data compression; Extrapolation; Turing machines
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1998 IEEE International Symposium on Information Theory
Conference_Location :
Cambridge, MA
Print_ISBN :
0-7803-5000-6
Type :
conf
DOI :
10.1109/ISIT.1998.708951
Filename :
708951