Title :
Autoregressive model order selection by a finite sample estimator for the Kullback-Leibler discrepancy
Author :
Broersen, P.M.T. ; Wensink, H.E.
Author_Institution :
Dept. of Appl. Phys., Delft Univ. of Technol., Netherlands
fDate :
7/1/1998
Abstract :
The finite sample information criterion (FSIC) is introduced as an estimator for the Kullback-Leibler discrepancy of an autoregressive time series. It is derived specifically for order selection in finite samples, where model orders are greater than one tenth of the sample size. It uses a theoretical expression for the ratio between the squared prediction error and the residual variance as the penalty factor for additional parameters in a model. This ratio can be found with the finite sample theory for autoregressive estimation, which is based on empirical approximations for the variance of the estimated parameters. It takes into account the different numbers of degrees of freedom that are effectively available in the various algorithms for autoregressive parameter estimation. The performance of the FSIC has been compared with existing order selection criteria in simulation experiments using four different estimation methods. In finite samples, the FSIC selects model orders of better objective quality for all estimation methods.
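The sketch below illustrates, under stated assumptions, how a finite-sample criterion of this kind can be evaluated: residual variances are obtained with a Burg recursion, each order receives a penalty built from order-dependent finite-sample variance coefficients v_i (here the commonly quoted Burg value 1/(N+1-i), with v_0 = 1/N), and the order minimizing the criterion is selected. The penalty form, the coefficient formulas, and the function names (burg_residual_variances, fsic) are illustrative assumptions, not the paper's exact expressions.

```python
import numpy as np

def burg_residual_variances(x, max_order):
    """Burg recursion: residual variance s2[p] of the AR(p) model, p = 0..max_order."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    f = x[1:].copy()            # forward prediction errors
    b = x[:-1].copy()           # backward prediction errors
    s2 = np.empty(max_order + 1)
    s2[0] = np.dot(x, x) / N
    for p in range(1, max_order + 1):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))  # reflection coefficient
        f, b = f + k * b, b + k * f
        s2[p] = s2[p - 1] * (1.0 - k * k)
        f, b = f[1:], b[:-1]    # shrink the error windows for the next order
    return s2

def fsic(s2, N):
    """Finite-sample style criterion: log residual variance plus a product penalty
    built from assumed Burg variance coefficients v_i = 1/(N+1-i), v_0 = 1/N."""
    max_order = len(s2) - 1
    v = np.array([1.0 / N] + [1.0 / (N + 1 - i) for i in range(1, max_order + 1)])
    crit = np.empty(max_order + 1)
    for p in range(max_order + 1):
        penalty = np.prod((1.0 + v[:p + 1]) / (1.0 - v[:p + 1])) - 1.0
        crit[p] = np.log(s2[p]) + penalty
    return crit

# Usage on a simulated AR(2) process, with candidate orders up to N/2
rng = np.random.default_rng(0)
N = 100
e = rng.standard_normal(N + 200)
x = np.zeros_like(e)
for t in range(2, len(e)):
    x[t] = 1.3 * x[t - 1] - 0.7 * x[t - 2] + e[t]
x = x[200:]

s2 = burg_residual_variances(x, max_order=N // 2)
crit = fsic(s2, N)
print("selected AR order:", int(np.argmin(crit)))
```

Because the penalty grows as a product over the per-order coefficients, it increases much faster than the fixed penalty of classical criteria once the candidate order approaches a sizeable fraction of the sample size, which is the regime the abstract targets.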
Keywords :
autoregressive processes; information theory; parameter estimation; prediction theory; statistical analysis; time series; Kullback-Leibler discrepancy; autoregressive estimation; autoregressive model order selection; autoregressive time series; finite sample estimator; finite sample information criterion; finite sample theory; parameter estimation; penalty factor; residual variance; squared prediction error; Adaptive control; Adaptive filters; Adaptive signal processing; Circuit stability; Nonlinear equations; Nonlinear systems; Parameter estimation; Signal processing; Silicon compounds; Yarn;
Journal_Title :
Signal Processing, IEEE Transactions on