Title of article :
Invariant Bayesian inference in regression models that is robust against the Jeffreys–Lindley's paradox
Author/Authors :
Frank Kleibergen
Issue Information :
Biannual, with consecutive issue numbering, 2004
Pages :
32
From page :
227
To page :
258
Abstract :
We obtain the prior and posterior probability of a nested regression model as the Hausdorff-integral of the prior and posterior on the parameters of an encompassing linear regression model over a lower-dimensional set that represents the nested model. The Hausdorff-integral is invariant and therefore avoids the Borel–Kolmogorov paradox. Basing the priors and prior probabilities of nested regression models on the prior on the parameters of an encompassing linear regression model reduces the discrepancies between classical and Bayesian inference, such as the Jeffreys–Lindley's paradox. We illustrate the analysis with examples of linear restrictions, i.e. a linear regression model, and of non-linear restrictions, i.e. a cointegration model and an autoregressive moving average model, on the parameters of an encompassing linear regression model.
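As a rough illustration of the construction summarized above (a sketch in assumed notation, not taken from the paper): if the nested model is represented as theta = g(alpha) with alpha ranging over a k-dimensional set A inside the n-dimensional parameter space of the encompassing model, the k-dimensional Hausdorff integral of the encompassing prior p(theta) over that set can be evaluated with the standard area formula,
\[
  \Pr(\text{nested model})
  \;\propto\;
  \int_{\{g(\alpha)\,:\,\alpha\in A\}} p(\theta)\,\mathrm{d}\mathcal{H}^{k}(\theta)
  \;=\;
  \int_{A} p\bigl(g(\alpha)\bigr)\,
  \sqrt{\det\!\left(
    \left(\frac{\partial g(\alpha)}{\partial \alpha'}\right)'
    \frac{\partial g(\alpha)}{\partial \alpha'}
  \right)}\,\mathrm{d}\alpha ,
\]
where g is a smooth injective map from A into R^n and H^k denotes k-dimensional Hausdorff measure; applying the same construction to the posterior gives the posterior probability of the nested model. The symbols g, alpha, A, and p are illustrative and need not match the paper's own notation.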
Keywords :
Cointegration , ARMA , Conditional densities , Prior and posterior (odds ratios) , Borel–Kolmogorov paradox
Journal title :
Journal of Econometrics
Serial Year :
2004
Record number :
1558634