  • DocumentCode
    1365541
  • Title
    Piecewise Polynomial Estimation of a Regression Function
  • Author
    Sauvé, Marie
  • Author_Institution
    Lab. de Math., Univ. Paris Sud, Orsay, France
  • Volume
    56
  • Issue
    1
  • fYear
    2010
  • Firstpage
    597
  • Lastpage
    613
  • Abstract
We deal with the problem of choosing a piecewise polynomial estimator of a regression function s mapping [0,1]^p into R. In the first part of this paper, we consider collections of piecewise polynomial models, where each model is defined by a partition M of [0,1]^p and a family of degrees d = (d_J)_{J∈M} ∈ N^M. We propose a penalized least squares criterion which selects a model whose associated piecewise polynomial estimator performs approximately as well as the best one, in the sense that its quadratic risk is close to the infimum of the risks. The risk bound we provide is nonasymptotic. In the second part, we apply this result to tree-structured collections of partitions, which look like those constructed in the first step of the CART algorithm, and we propose an extension of the CART algorithm that builds a piecewise polynomial estimator of a regression function.
  • Keywords
    least squares approximations; piecewise polynomial techniques; regression analysis; trees (mathematics); CART algorithm; nonasymptotic risk bound; penalized least squares criterion; piecewise polynomial estimation; piecewise polynomial model; quadratic risk; regression function; Least squares approximation; Least squares methods; Partitioning algorithms; Pattern recognition; Polynomials; Statistical learning; CART; concentration inequalities; model selection; oracle inequalities; polynomial estimation; regression;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2009.2027481
  • Filename
    5361480
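
The penalized least squares selection described in the abstract can be illustrated on a toy case. The sketch below is not the paper's procedure: it restricts the model collection to dyadic partitions of [0,1] with a single polynomial degree shared by all cells, and it uses a simple dimension-based penalty c·σ²·D_m/n with the constant c and the noise variance σ² assumed known; the helper names fit_piecewise_poly and select_model are hypothetical.

```python
# Illustrative sketch only: penalized least squares selection of a
# piecewise polynomial estimator on [0,1], under simplifying assumptions
# (dyadic partitions, one degree per model, known sigma^2 and constant c).
import numpy as np

def fit_piecewise_poly(x, y, k, d):
    """Least squares fit of degree-d polynomials on the 2^k dyadic cells of [0,1].
    Returns the fitted values and the model dimension D_m = 2^k * (d + 1)."""
    yhat = np.empty_like(y, dtype=float)
    edges = np.linspace(0.0, 1.0, 2**k + 1)
    for j in range(2**k):
        # Half-open cells, except the last one which also contains x = 1.
        mask = (x >= edges[j]) & (x < edges[j + 1]) if j < 2**k - 1 else (x >= edges[j])
        if mask.sum() > d:                      # enough points to fit the cell
            coef = np.polyfit(x[mask], y[mask], d)
            yhat[mask] = np.polyval(coef, x[mask])
        elif mask.any():                        # fall back to the cell mean
            yhat[mask] = y[mask].mean()
    return yhat, 2**k * (d + 1)

def select_model(x, y, max_k=4, max_d=3, sigma2=1.0, c=2.0):
    """Pick (k, d) minimizing empirical risk + penalty c * sigma^2 * D_m / n."""
    n = len(y)
    best = None
    for k in range(max_k + 1):
        for d in range(max_d + 1):
            yhat, D = fit_piecewise_poly(x, y, k, d)
            crit = np.mean((y - yhat) ** 2) + c * sigma2 * D / n
            if best is None or crit < best[0]:
                best = (crit, k, d, yhat)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 1, 500)
    s = np.where(x < 0.5, np.sin(6 * x), 1.0 - x**2)   # toy regression function
    y = s + 0.1 * rng.normal(size=x.size)
    crit, k, d, _ = select_model(x, y, sigma2=0.01)
    print(f"selected 2^{k} cells, degree {d}, criterion {crit:.4f}")
```

The paper's criterion selects over much richer collections (arbitrary partitions M with per-cell degrees d_J, including CART-like tree-structured partitions) and its penalty is calibrated to yield a nonasymptotic oracle inequality; the sketch only shows the general shape of "empirical risk plus penalty" minimization over a family of piecewise polynomial models.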