Title of article :
A jackknife type approach to statistical model selection
Author/Authors :
Lee, Hyunsook and Jogesh Babu, G. and Rao, C.R.
Issue Information :
Journal with consecutive issue numbering, year 2012
Pages :
11
From page :
301
To page :
311
Abstract :
Procedures such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), minimum description length (MDL), and the bootstrap information criterion have been developed in the statistical literature for model selection. Most of these methods rely on bias estimation. This bias, which is inevitable in model selection problems, arises from estimating the distance between an unknown true model and an estimated model. Instead of estimating the bias, this paper develops a jackknife-type bias-reduction procedure. The jackknife method selects the model of minimum Kullback–Leibler divergence through bias reduction. It is shown that (a) the jackknife maximum likelihood estimator is consistent, (b) the jackknife estimate of the log likelihood is asymptotically unbiased, and (c) the stochastic order of the jackknife log likelihood estimate is O(log log n). Because of these properties, the jackknife information criterion is applicable to the problem of choosing a model from separated families, especially when the true model is unknown. Compared to popular information criteria that apply only to nested models, such as regression and time series settings, the jackknife information criterion is more robust in filtering various types of candidate models when choosing the best approximating model.
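The abstract's idea of bias reduction via the jackknife can be illustrated with a minimal sketch. This is not the paper's exact criterion; it only shows the standard jackknife correction, n·ℓ(θ̂) − (n−1)·mean of leave-one-out log likelihoods, applied to two separated (non-nested) candidate families. All function names (`jackknife_loglik`, `fit_exp`, etc.) are hypothetical helpers introduced here for illustration.

```python
import math
import random

def jackknife_loglik(data, fit, loglik):
    """Jackknife bias-corrected log likelihood for one candidate family.

    fit(data) returns the MLE; loglik(theta, data) returns the total
    log likelihood. The correction is n*ll_full - (n-1)*mean(ll_loo),
    where each leave-one-out estimate is evaluated on the full sample.
    """
    n = len(data)
    ll_full = loglik(fit(data), data)
    loo = []
    for i in range(n):
        sub = data[:i] + data[i + 1:]          # drop observation i
        loo.append(loglik(fit(sub), data))     # refit, score full data
    return n * ll_full - (n - 1) * sum(loo) / n

# Candidate family 1: Exponential(lambda), MLE lambda = n / sum(x).
def fit_exp(x):
    return len(x) / sum(x)

def ll_exp(lam, x):
    return sum(math.log(lam) - lam * xi for xi in x)

# Candidate family 2: Normal(mu, sigma^2), MLEs are sample mean/variance.
def fit_norm(x):
    m = sum(x) / len(x)
    v = sum((xi - m) ** 2 for xi in x) / len(x)
    return (m, v)

def ll_norm(theta, x):
    m, v = theta
    return sum(-0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
               for xi in x)

random.seed(0)
data = [random.expovariate(1.0) for _ in range(200)]  # true model: Exp(1)
scores = {
    "exponential": jackknife_loglik(data, fit_exp, ll_exp),
    "normal": jackknife_loglik(data, fit_norm, ll_norm),
}
best = max(scores, key=scores.get)  # family with largest corrected log likelihood
print(best)
```

Because the exponential and normal families are separated rather than nested, nested-model criteria do not directly apply here, which is the setting the abstract highlights for the jackknife information criterion.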
Keywords :
Jackknife , Unbiased estimation , Model selection , Information criterion , Maximum likelihood estimation , Kullback–Leibler divergence
Journal title :
Journal of Statistical Planning and Inference
Serial Year :
2012
Record number :
2221727