Title of article :
Decomposition of Kullback–Leibler risk and unbiasedness for parameter-free estimators
Author/Authors :
Wu, Qiang and Vos, Paul
Issue Information :
Journal issue, serial year 2012
Pages :
12
From page :
1525
To page :
1536
Abstract :
The bias and variance of traditional parameter estimators are parameter-dependent quantities. The maximum likelihood estimator (MLE) can be defined directly on a family of distributions P and so is parameter-free. The parameter-invariance property of the MLE amounts to the fact that the MLE for the original parameter and the MLE for any reparametrization name the same distribution. We define parameter-free estimators to be P-valued random variables rather than parameter-valued random variables. The Kullback–Leibler (KL) risk is decomposed into two parameter-free quantities that describe the variance and squared bias of the estimator. We show that for exponential families the P-valued MLE is unbiased. We define the KL mean K of a P-valued random variable and show how K describes the long-run properties of this random distribution. For most families P, the KL mean of a P-valued random variable will not lie in P, so we define another mean M, called the distribution mean, that is related to K and is an element of P. By allowing the distribution estimator to take values outside of P, the KL mean can be made to lie in P. We compare the MLE to non-P-valued estimators that have been suggested for the Hardy–Weinberg model. Results for the dual KL risk are also given.
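As an illustrative sketch (not taken from the paper), the KL risk of a distribution-valued estimator can be approximated by Monte Carlo simulation. The example below uses the Bernoulli family, for which the MLE's KL risk is roughly 1/(2n) for moderate sample sizes; the clipping constant and simulation settings are assumptions made only for this sketch.

```python
import math
import random

def kl_bernoulli(p, q):
    """KL divergence D(p || q) between Bernoulli(p) and Bernoulli(q)."""
    def term(a, b):
        return 0.0 if a == 0 else a * math.log(a / b)
    return term(p, q) + term(1 - p, 1 - q)

# Monte Carlo approximation of the KL risk E[D(p || p_hat)] of the
# Bernoulli MLE p_hat = x/n (illustrative settings, not from the paper).
random.seed(0)
p, n, reps = 0.3, 50, 20000
risks = []
for _ in range(reps):
    x = sum(random.random() < p for _ in range(n))
    # Clip the MLE away from 0 and 1 so the divergence stays finite;
    # this truncation is an assumption for the sketch only.
    p_hat = min(max(x / n, 1e-6), 1 - 1e-6)
    risks.append(kl_bernoulli(p, p_hat))
risk = sum(risks) / reps  # close to 1/(2n) = 0.01 for these settings
```

Because the estimator is compared to the truth as a distribution rather than through a particular parametrization, the same simulation applies unchanged under any reparametrization of the Bernoulli family.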
Keywords :
Distribution unbiased , KL risk , KL bias , P-variance , KL variance , P-bias , KL mean , Dual KL risk
Journal title :
Journal of Statistical Planning and Inference
Serial Year :
2012
Record number :
2221924