  • DocumentCode
    1323659
  • Title
    Efficient Learning of Sparse Conditional Random Fields for Supervised Sequence Labeling
  • Author
    Sokolovska, Nataliya; Lavergne, Thomas; Cappé, Olivier; Yvon, François
  • Author_Institution
    LTCI, Telecom ParisTech, Paris, France
  • Volume
    4
  • Issue
    6
  • fYear
    2010
  • Firstpage
    953
  • Lastpage
    964
  • Abstract
    Conditional random fields (CRFs) constitute a popular and efficient approach for supervised sequence labeling. CRFs can cope with large description spaces and can integrate some form of structural dependency between labels. In this paper, we address the issue of efficient feature selection for CRFs based on imposing sparsity through an ℓ1 penalty. We first show how sparsity of the parameter set can be exploited to significantly speed up training and labeling. We then introduce coordinate descent parameter update schemes for CRFs with ℓ1 regularization. We finally provide some empirical comparisons of the proposed approach with state-of-the-art CRF training strategies. In particular, it is shown that the proposed approach is able to take advantage of sparsity to speed up processing and handle higher dimensional models.
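    For orientation, the following is a minimal sketch of the standard ℓ1-penalized CRF training objective and the generic coordinate-wise soft-thresholding update that coordinate descent schemes of this kind typically use; the notation ($f$, $\rho$, $Z_\theta$, $u_k$, $h_k$) is generic and not necessarily the paper's own formulation.

    \[
      \min_{\theta}\; -\sum_{i=1}^{n} \log p_\theta\big(y^{(i)} \mid x^{(i)}\big) + \rho \lVert \theta \rVert_1,
      \qquad
      p_\theta(y \mid x) = \frac{1}{Z_\theta(x)} \exp\!\Big( \sum_{t} \theta^\top f(y_{t-1}, y_t, x, t) \Big),
    \]

    with a coordinate-wise update for component $\theta_k$ obtained by minimizing a local quadratic approximation $\tfrac{1}{2} h_k \theta_k^2 - u_k \theta_k + \rho\,\lvert\theta_k\rvert$, which yields the soft-thresholding rule

    \[
      \theta_k \leftarrow \frac{\operatorname{sign}(u_k)\,\max\big(\lvert u_k\rvert - \rho,\, 0\big)}{h_k},
    \]

    where $u_k$ is the unpenalized gradient-based numerator and $h_k$ the corresponding curvature term. Updates of this form set many coordinates exactly to zero, which is what allows sparsity to be exploited during training and labeling.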
  • Keywords
    learning (artificial intelligence); probability; speech recognition; ℓ1 penalty; ℓ1 regularization; coordinate descent parameter update; feature selection; sparse conditional random fields; structural dependency; supervised sequence labeling; Error analysis; Labeling; Machine learning; Predictive models; Stochastic processes; Supervised learning
  • fLanguage
    English
  • Journal_Title
    IEEE Journal of Selected Topics in Signal Processing
  • Publisher
    IEEE
  • ISSN
    1932-4553
  • Type
    jour
  • DOI
    10.1109/JSTSP.2010.2076150
  • Filename
    5570917