• DocumentCode
    1797307
  • Title
    A decomposition method for large-scale sparse coding in representation learning
  • Author
    Li, Yifeng; Caron, Richard J.; Ngom, Alioune
  • Author_Institution
    Child & Family Research Institute, University of British Columbia, Vancouver, BC, Canada
  • fYear
    2014
  • fDate
    6-11 July 2014
  • Firstpage
    3732
  • Lastpage
    3738
  • Abstract
    In representation learning, sparse representation is the parsimonious principle that a sample can be approximated by a sparse superposition of dictionary atoms. Sparse coding is the core of this technique. Since the dictionary is often redundant, the number of atoms can be very large. Many optimization methods have been proposed in the literature for sparse coding; however, their efficiency on dictionaries with a very large number of atoms remains a bottleneck. In this paper, we propose a decomposition method for large-scale sparse coding models. Our experimental results show that the method is very efficient.
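    To make the setting concrete: sparse coding seeks a coefficient vector a with x ≈ Da and most entries of a equal to zero, commonly via the l1-regularized least-squares (LASSO) objective min_a 0.5||x − Da||² + λ||a||₁. The sketch below is illustrative only, not the paper's algorithm: it solves this objective by block-coordinate descent, a simple decomposition strategy that updates one block of coefficients at a time; the function name, block size, and hyperparameters are assumptions made for the example.

    ```python
    import numpy as np

    def sparse_code_block_cd(D, x, lam=0.1, block_size=64, n_iter=50):
        # Solve min_a 0.5*||x - D a||^2 + lam*||a||_1 by cycling over
        # blocks of coordinates (a simple decomposition strategy).
        n_atoms = D.shape[1]
        a = np.zeros(n_atoms)
        r = x.astype(float).copy()          # residual x - D a (a starts at 0)
        col_norms = np.sum(D * D, axis=0)   # ||d_j||^2 per atom
        for _ in range(n_iter):
            for start in range(0, n_atoms, block_size):
                for j in range(start, min(start + block_size, n_atoms)):
                    r += D[:, j] * a[j]     # residual without atom j
                    rho = D[:, j] @ r       # correlation with that residual
                    # soft-thresholding update for the l1 penalty
                    a[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / max(col_norms[j], 1e-12)
                    r -= D[:, j] * a[j]     # fold updated coefficient back in
        return a

    # Small usage example: recover a 3-sparse code on a random dictionary.
    rng = np.random.default_rng(0)
    D = rng.standard_normal((20, 200))
    D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
    x = D[:, :3] @ np.array([1.0, -0.5, 0.8])
    a = sparse_code_block_cd(D, x, lam=0.05)
    print("nonzero coefficients:", np.count_nonzero(np.abs(a) > 1e-6))
    ```

    Because each pass touches one block of atoms at a time, the working set stays small even when the dictionary is very large, which is the general motivation for decomposition approaches to large-scale sparse coding.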
  • Keywords
    encoding; learning (artificial intelligence); quadratic programming; decomposition method; large-scale sparse coding models; parsimonious principle; representation learning; sparse dictionary atoms superposition; sparse representation; Computational modeling; Dictionaries; Encoding; Equations; Mathematical model; Optimization; Vectors
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    2014 International Joint Conference on Neural Networks (IJCNN)
  • Conference_Location
    Beijing, China
  • Print_ISBN
    978-1-4799-6627-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.2014.6889394
  • Filename
    6889394