• DocumentCode
    2189891
  • Title
    Stochastic Mirror Descent Algorithm for L1-Regularized Risk Minimizations
  • Author
    Ouyang, Hua; Gray, Alexander

  • Author_Institution
    Coll. of Comput., Georgia Inst. of Technol., Atlanta, GA, USA
  • fYear
    2010
  • fDate
    June 29 - July 1, 2010
  • Firstpage
    1241
  • Lastpage
    1245
  • Abstract
    L1-regularized empirical risk minimization is a popular method for feature selection. In this paper we propose a fast online algorithm for solving large-scale L1-regularized problems. The proposed stochastic mirror descent (SMD) algorithm is a stochastic version of the mirror-prox method. We show that mirror descent provides a unified framework for several recently proposed algorithms. Experiments on large-scale datasets demonstrate that the proposed SMD algorithm is much faster than the recently proposed truncated gradient (TG) algorithm. At the same testing accuracy, SMD yields sparser solutions than TG, while at the same sparsity, SMD has higher testing accuracy.
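    The setting the abstract describes can be illustrated with a minimal sketch of a stochastic proximal/mirror-descent step for an L1-regularized least-squares risk. This is a generic Euclidean-distance illustration, not the paper's exact SMD/mirror-prox algorithm; the step-size schedule, function names, and toy objective below are assumptions for exposition.

    ```python
    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1: shrink each coordinate toward zero.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def smd_l1(X, y, lam=0.05, T=5000, eta0=0.1, seed=0):
        """Sketch of stochastic mirror descent (Euclidean case) for
        min_w (1/n) sum_i (x_i . w - y_i)^2 / 2 + lam * ||w||_1.
        With the Euclidean distance-generating function, the mirror step
        reduces to a gradient step followed by L1 soft-thresholding."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        w_avg = np.zeros(d)
        for t in range(1, T + 1):
            i = rng.integers(n)                          # sample one example
            g = (X[i] @ w - y[i]) * X[i]                 # stochastic loss gradient
            eta = eta0 / np.sqrt(t)                      # decaying step size (assumed schedule)
            w = soft_threshold(w - eta * g, eta * lam)   # gradient step + L1 prox
            w_avg += (w - w_avg) / t                     # running average of iterates
        return w_avg
    ```

    On a toy sparse regression problem, coordinates carrying no signal are driven toward exactly zero by the per-step soft-thresholding, which is the mechanism behind the sparsity comparison with TG discussed in the abstract.
    
    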
  • Keywords
    data analysis; feature extraction; gradient methods; risk analysis; stochastic processes; L1-regularized risk minimization; SMD algorithm; fast online algorithm; feature selection; large-scale dataset; mirror-prox method; stochastic mirror descent algorithm; Accuracy; Approximation algorithms; Entropy; Least squares approximation; Mirrors; Testing; Empirical Risk Minimization; L1 regularization; Machine Learning; Sparsity;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Computer and Information Technology (CIT), 2010 IEEE 10th International Conference on
  • Conference_Location
    Bradford
  • Print_ISBN
    978-1-4244-7547-6
  • Type
    conf
  • DOI
    10.1109/CIT.2010.224
  • Filename
    5577880