  • DocumentCode
    928845
  • Title
    Parameter tuning for induction-algorithm-oriented feature elimination
  • Author
    Yang, Ying; Wu, Xindong
  • Author_Institution
    Dept. of Comput. Sci., Vermont Univ., Burlington, VT, USA
  • Volume
    19
  • Issue
    2
  • fYear
    2004
  • Firstpage
    40
  • Lastpage
    49
  • Abstract
    Feature selection has long been an active research topic in machine learning. It begins with an empty set of features and adds those most necessary for learning a target concept. Feature elimination, a newer technique, starts with the full set of features and removes those most unnecessary for learning the target concept. Feature elimination tends to be more effective than feature selection: it captures interacting features more easily and suffers less from feature interaction, and because the most unnecessary features are eliminated from the start, they cannot mislead the induction process in terms of efficiency or accuracy. Induction-algorithm-oriented feature elimination (IAOFE), under particular parameter configurations, can achieve higher predictive accuracy than existing popular feature selection approaches. To understand how to obtain the best possible performance from IAOFE, we conducted a comprehensive analysis of IAOFE parameter tuning and, based on this empirical analysis, propose two sets of well-tuned parameters.
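    The abstract's contrast between forward selection and backward elimination can be made concrete with a minimal wrapper-style sketch: start from the full feature set and greedily drop whichever feature costs the least cross-validated accuracy under a chosen induction algorithm. The scikit-learn estimator, the cross-validation setup, and the tolerance stopping rule below are illustrative assumptions, not the IAOFE parameter configurations studied in the article.

    # Wrapper-style backward feature elimination (illustrative sketch).
    # The decision-tree inducer, cv folds, and `tolerance` threshold are
    # assumptions for illustration, not the article's tuned IAOFE parameters.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier


    def backward_feature_elimination(X, y, estimator, tolerance=0.0, cv=5):
        """Greedy backward elimination: repeatedly drop the feature whose
        removal hurts cross-validated accuracy the least, stopping once any
        further removal would cost more than `tolerance` accuracy."""
        remaining = list(range(X.shape[1]))
        best_score = cross_val_score(estimator, X[:, remaining], y, cv=cv).mean()

        while len(remaining) > 1:
            # Score every candidate single-feature removal.
            candidates = []
            for f in remaining:
                subset = [g for g in remaining if g != f]
                score = cross_val_score(estimator, X[:, subset], y, cv=cv).mean()
                candidates.append((score, f))
            score, to_drop = max(candidates)  # cheapest feature to remove
            if score < best_score - tolerance:
                break  # every remaining feature is now worth keeping
            remaining.remove(to_drop)
            best_score = max(best_score, score)

        return remaining, best_score


    if __name__ == "__main__":
        X, y = load_iris(return_X_y=True)
        kept, acc = backward_feature_elimination(
            X, y, DecisionTreeClassifier(random_state=0), tolerance=0.01
        )
        print(f"kept feature indices: {kept}, CV accuracy: {acc:.3f}")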
  • Keywords
    Bayes methods; learning by example; IAOFE parameter tuning; empirical analysis; feature interaction; feature selection; induction-algorithm-oriented feature elimination; machine learning; parameter configuration; Accuracy; Annealing; Blindness; Decision trees; Genetics; Machine learning; Performance analysis; Predictive models; Training data; Voting
  • fLanguage
    English
  • Journal_Title
    IEEE Intelligent Systems
  • Publisher
    IEEE
  • ISSN
    1541-1672
  • Type
    jour
  • DOI
    10.1109/MIS.2004.1274910
  • Filename
    1274910