Title :
Minimum complexity regression estimation with weakly dependent observations
Author :
Modha, Dharmendra S. ; Masry, Elias
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of California, San Diego, La Jolla, CA, USA
Date :
11/1/1996
Abstract :
The minimum complexity regression estimation framework (Barron, 1991; Barron and Cover, 1991; Rissanen, 1989) is a general data-driven methodology for estimating a regression function from a given list of parametric models using independent and identically distributed (i.i.d.) observations. We extend Barron's regression estimation framework to m-dependent observations and to strongly mixing observations. In particular, we propose abstract minimum complexity regression estimators for dependent observations, which may be adapted to a particular list of parametric models, and establish upper bounds on the statistical risks of the proposed estimators in terms of certain deterministic indices of resolvability. Assuming that the regression function satisfies a certain Fourier-transform-type representation, we examine minimum complexity regression estimators adapted to a list of parametric models based on neural networks; using the upper bounds for the abstract estimators, we establish rates of convergence for the statistical risks of these estimators. As a key tool, we extend the classical Bernstein inequality from i.i.d. random variables to m-dependent processes and to strongly mixing processes.
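As context for the abstract, the two ingredients it invokes can be sketched in the i.i.d. prototype setting of Barron (1991); the penalty form, norms, and constants below are illustrative assumptions, not the paper's exact statements for dependent data. Given a list of parametric models $\Gamma_n$ with complexities (description lengths) $L_n(f)$, a minimum complexity estimator and its index of resolvability take roughly the form

\[
  \hat{f}_n \in \arg\min_{f \in \Gamma_n}
    \Big\{ \frac{1}{n} \sum_{i=1}^{n} \big( Y_i - f(X_i) \big)^2
           + \frac{\lambda\, L_n(f)}{n} \Big\},
  \qquad
  R_n(f^{\ast}) = \min_{f \in \Gamma_n}
    \Big\{ \lVert f - f^{\ast} \rVert^2 + \frac{\lambda\, L_n(f)}{n} \Big\},
\]

and the risk bounds referenced in the abstract are of the type $\mathbb{E}\,\lVert \hat{f}_n - f^{\ast} \rVert^2 \le C\, R_n(f^{\ast})$: the statistical risk is controlled by how well some model in the list resolves the true regression function $f^{\ast}$ at low complexity. The classical Bernstein inequality, whose extension to m-dependent and strongly mixing processes the paper uses as a key tool, states that for i.i.d. zero-mean $Z_1, \dots, Z_n$ with $|Z_i| \le M$ and $\operatorname{Var}(Z_i) \le \sigma^2$,

\[
  \Pr\!\Big( \Big| \frac{1}{n} \sum_{i=1}^{n} Z_i \Big| \ge \epsilon \Big)
  \le 2 \exp\!\Big( - \frac{n \epsilon^2}{2 \sigma^2 + \tfrac{2}{3} M \epsilon} \Big).
\]

For m-dependent data, one standard route (assumed here; not necessarily the paper's argument) is a blocking device that splits the sample into $m+1$ subsamples whose elements are mutually independent, weakening the exponent by a factor of order $1/m$; strongly mixing data typically incur an additional term depending on the mixing coefficients.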
Keywords :
Fourier transforms; computational complexity; convergence; estimation theory; information theory; minimisation; neural nets; random processes; statistical analysis; Fourier-transform-type representation; abstract minimum complexity regression estimators; classical Bernstein inequality; dependent observations; deterministic indices of resolvability; general data-driven methodology; independent and identically distributed (i.i.d.) observations; m-dependent observations; m-dependent processes; minimum complexity regression estimation; minimum complexity regression estimators; neural networks; parametric models; statistical risks; strongly mixing observations; strongly mixing processes; upper bounds; weakly dependent observations; approximation error; estimation error; parametric statistics; random variables; risk management
Journal_Title :
IEEE Transactions on Information Theory