• DocumentCode
    73281
  • Title
    Convergence and Consistency of Regularized Boosting With Weakly Dependent Observations
  • Author
    Lozano, Aurelie C.; Kulkarni, Sanjeev R.; Schapire, Robert E.
  • Author_Institution
    IBM T. J. Watson Res. Center, Yorktown Heights, NY, USA
  • Volume
    60
  • Issue
    1
  • fYear
    2014
  • fDate
    Jan. 2014
  • Firstpage
    651
  • Lastpage
    660
  • Abstract
This paper studies the statistical convergence and consistency of regularized boosting methods, where the samples need not be independent and identically distributed but can come from stationary weakly dependent sequences. Consistency is proven for the composite classifiers that result from a regularization achieved by restricting the 1-norm of the base classifiers' weights. The less restrictive nature of sampling considered here is manifested in the consistency result through a generalized condition on the growth of the regularization parameter. The weaker the sample dependence, the faster the regularization parameter is allowed to grow with increasing sample size. A consistency result is also provided for data-dependent choices of the regularization parameter.
  • Keywords
data handling; learning (artificial intelligence); pattern classification; statistical analysis; composite classifiers; machine learning; regularization parameter; regularized boosting methods; statistical convergence; weakly dependent observations; Boosting; Convergence; Cost function; Minimization; Prediction algorithms; Random variables; Training data; Bayes-risk consistency; beta-mixing; boosting; classification; dependent data; empirical processes; memory; non-iid; penalized model selection; regularization
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2013.2287726
  • Filename
    6650087
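  • Illustrative Sketch

    The abstract describes regularization achieved by restricting the 1-norm of the base classifiers' weights, with the bound allowed to grow with the sample size. The following is a minimal Python sketch of that idea, assuming decision stumps as base classifiers and an AdaBoost-style greedy fit; the paper's exact algorithm and its growth schedule for the bound lambda_n are not reproduced here, so both are illustrative assumptions.

    import numpy as np

    def stump_predict(X, feature, threshold, sign):
        # Base classifier h(x) in {-1, +1}: threshold test on a single feature.
        return sign * np.where(X[:, feature] > threshold, 1.0, -1.0)

    def l1_regularized_boost(X, y, lambda_n, n_rounds=50):
        # Greedy AdaBoost-style fit of decision stumps; the combined weights are
        # rescaled at the end so their 1-norm does not exceed lambda_n.
        n, d = X.shape
        stumps, alphas = [], []
        margin = np.zeros(n)                       # current combined score f(x_i)
        for _ in range(n_rounds):
            w = np.exp(-y * margin)                # exponential-loss sample weights
            w /= w.sum()
            best = None
            for j in range(d):                     # exhaustive stump search
                for t in np.unique(X[:, j]):
                    for s in (-1.0, 1.0):
                        pred = stump_predict(X, j, t, s)
                        err = np.sum(w[pred != y])
                        if best is None or err < best[0]:
                            best = (err, j, t, s, pred)
            err, j, t, s, pred = best
            err = np.clip(err, 1e-12, 1 - 1e-12)
            alpha = 0.5 * np.log((1 - err) / err)  # AdaBoost step size
            stumps.append((j, t, s))
            alphas.append(alpha)
            margin += alpha * pred
        alphas = np.array(alphas)
        norm = np.abs(alphas).sum()
        if norm > lambda_n:                        # enforce the 1-norm constraint
            alphas *= lambda_n / norm
        return stumps, alphas

    def predict(X, stumps, alphas):
        # Composite classifier: sign of the weighted vote of the selected stumps.
        score = sum(a * stump_predict(X, j, t, s)
                    for (j, t, s), a in zip(stumps, alphas))
        return np.sign(score)

    # Hypothetical growth schedule for the bound: lambda_n increases slowly with n;
    # under stronger sample dependence a slower rate would be chosen (illustrative only).
    # X, y = ...   (labels y in {-1, +1})
    # lambda_n = np.sqrt(np.log(len(y)))
    # stumps, alphas = l1_regularized_boost(X, y, lambda_n)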