Title of article :
On Bayesian learning from Bernoulli observations
Author/Authors :
Bissiri, Pier Giovanni and Walker, Stephen G.
Issue Information :
Journal issue, year 2010
Abstract :
We provide a justification for Bayesian updating in the Bernoulli case, even when the observations are assumed to be independent and identically distributed with a fixed but unknown parameter θ0. The motivation relies on loss functions and asymptotics. Such a justification is important given the recent interest in Bayesian consistency, which assumes that the observations are independent and identically distributed rather than conditionally independent with a joint distribution depending on the choice of prior.
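For illustration only (this sketch is not part of the published abstract), the standard conjugate Bayesian update in the Bernoulli setting the abstract refers to: with i.i.d. Bernoulli(θ0) observations and a Beta(a, b) prior, the posterior after n observations with s successes is Beta(a + s, b + n − s). The hyperparameters a, b, the true value θ0, and the sample size below are assumptions chosen for the demonstration.

# Minimal sketch (assumed example, not the paper's code): conjugate Bayesian
# updating for i.i.d. Bernoulli(theta_0) data under a Beta(a, b) prior.
import numpy as np

rng = np.random.default_rng(0)

theta_0 = 0.3            # fixed but unknown "true" parameter (assumed for the demo)
a, b = 1.0, 1.0          # Beta prior hyperparameters (uniform prior; an assumption)
n = 200                  # number of Bernoulli observations (assumed)

x = rng.binomial(1, theta_0, size=n)    # i.i.d. Bernoulli(theta_0) sample
s = x.sum()                             # number of successes

a_post, b_post = a + s, b + n - s       # conjugate posterior: Beta(a + s, b + n - s)
posterior_mean = a_post / (a_post + b_post)

print(f"successes: {int(s)}/{n}, posterior Beta({a_post:.0f}, {b_post:.0f}), "
      f"posterior mean = {posterior_mean:.3f}")

As n grows, the posterior mean concentrates at θ0, which is the kind of asymptotic behaviour invoked in the abstract's consistency discussion.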
Keywords :
Kullback–Leibler divergence , Loss function , Asymptotics
Journal title :
Journal of Statistical Planning and Inference