Title :
Piecewise Polynomial Estimation of a Regression Function
Author_Institution :
Lab. de Math., Univ. Paris Sud, Orsay, France
Abstract :
We deal with the problem of choosing a piecewise polynomial estimator of a regression function s mapping [0,1]^p into R. In the first part of this paper, we consider a collection of piecewise polynomial models, each defined by a partition M of [0,1]^p and a family of degrees d = (d_J)_{J∈M} ∈ N^M. We propose a penalized least squares criterion that selects a model whose associated piecewise polynomial estimator performs approximately as well as the best one, in the sense that its quadratic risk is close to the infimum of the risks. The risk bound we provide is nonasymptotic. In the second part, we apply this result to tree-structured collections of partitions, which resemble the one constructed in the first step of the CART algorithm, and we propose an extension of the CART algorithm to build a piecewise polynomial estimator of a regression function.
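The abstract describes selecting, by penalized least squares, a piecewise polynomial estimator indexed by a partition and per-cell degrees. The following is a minimal sketch of that idea in dimension p = 1, restricted to regular (rather than tree-structured) partitions and using a Mallows-type penalty proportional to the model dimension; the candidate grids, function names, and the penalty constant c are illustrative assumptions and do not reproduce the paper's actual criterion or constants.

import numpy as np

def fit_piecewise(x, y, n_bins, degree):
    """Least squares piecewise polynomial fit on a regular partition of [0,1]."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    coefs, rss = [], 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x <= hi) if hi == 1.0 else (x >= lo) & (x < hi)
        if mask.sum() <= degree:
            # too few points in the cell: fall back to a constant fit
            c = np.array([y[mask].mean() if mask.any() else 0.0])
        else:
            c = np.polyfit(x[mask], y[mask], degree)
        coefs.append((lo, hi, c))
        if mask.any():
            rss += np.sum((y[mask] - np.polyval(c, x[mask])) ** 2)
    return coefs, rss

def select_model(x, y, sigma2, c=2.0, max_degree=3):
    """Pick (n_bins, degree) minimizing RSS/n + c*sigma2*dim/n (illustrative penalty)."""
    n, best = len(x), None
    for n_bins in (1, 2, 4, 8, 16):
        for degree in range(max_degree + 1):
            dim = n_bins * (degree + 1)            # dimension of the model
            _, rss = fit_piecewise(x, y, n_bins, degree)
            crit = rss / n + c * sigma2 * dim / n  # penalized least squares criterion
            if best is None or crit < best[0]:
                best = (crit, n_bins, degree)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 1, 500)
    s = np.where(x < 0.5, np.sin(6 * x), 1.0 - x)  # true regression function (toy example)
    y = s + 0.1 * rng.normal(size=x.size)
    print(select_model(x, y, sigma2=0.01))

The paper's procedure instead searches over tree-structured partitions grown as in the first step of CART and over per-cell degrees, with a penalty calibrated to yield a nonasymptotic oracle inequality; the sketch above only illustrates the general trade-off between residual sum of squares and model dimension.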
Keywords :
least squares approximations; piecewise polynomial techniques; regression analysis; trees (mathematics); CART algorithm; nonasymptotic risk bound; penalized least squares criterion; piecewise polynomial estimation; piecewise polynomial model; quadratic risk; regression function; Least squares approximation; Least squares methods; Partitioning algorithms; Pattern recognition; Polynomials; Statistical learning; CART; concentration inequalities; model selection; oracle inequalities; polynomial estimation; regression;
Journal_Title :
Information Theory, IEEE Transactions on
DOI :
10.1109/TIT.2009.2027481