• DocumentCode
    401684
  • Title
    Using mutual information for selecting continuous-valued attribute in decision tree learning
  • Author
    Li, Hua; Wang, Xi-zhao; Li, Yong
  • Author_Institution
    Fac. of Math. & Comput. Sci., Hebei Univ., China
  • Volume
    3
  • fYear
    2003
  • fDate
    2-5 Nov. 2003
  • Firstpage
    1496
  • Abstract
    In this paper, we propose a learning algorithm that uses an information entropy minimization heuristic together with a mutual information heuristic to select expanded attributes. For data sets whose condition attributes take continuous values, most current decision tree learning algorithms tend to select previously selected attributes again for branching. This repeated selection limits training and testing accuracy, and the structure of the resulting decision tree may become unnecessarily complex. Therefore, during attribute selection, the previously selected attributes, as well as other attributes highly correlated with them, should not be selected again. Here, we use mutual information to avoid selecting previously selected attributes during decision tree generation, and our test results show that this method achieves good performance.
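    The record does not include the paper's exact formulation, but the idea the abstract describes can be sketched as follows: estimate the mutual information between each candidate attribute and the attributes already chosen for branching, and skip candidates whose mutual information with any selected attribute is too high. The function names, the equal-width discretization, and the `threshold` value below are illustrative assumptions, not the authors' method.

    ```python
    import math
    from collections import Counter

    def mutual_information(xs, ys, bins=4):
        """Estimate I(X;Y) in bits for two continuous-valued attributes
        by equal-width discretization into `bins` intervals each.
        (Discretization scheme is an illustrative choice.)"""
        def discretize(vals):
            lo, hi = min(vals), max(vals)
            width = (hi - lo) / bins or 1.0  # guard against zero range
            return [min(int((v - lo) / width), bins - 1) for v in vals]

        dx, dy = discretize(xs), discretize(ys)
        n = len(xs)
        px, py = Counter(dx), Counter(dy)
        pxy = Counter(zip(dx, dy))
        mi = 0.0
        for (a, b), c in pxy.items():
            p_ab = c / n
            mi += p_ab * math.log2(p_ab / ((px[a] / n) * (py[b] / n)))
        return mi

    def filter_candidates(candidates, selected, data, threshold=0.5):
        """Drop candidates that were already selected, or whose mutual
        information with any selected attribute exceeds `threshold`
        (the threshold is a hypothetical tuning parameter)."""
        kept = []
        for c in candidates:
            if c in selected:
                continue
            if any(mutual_information(data[c], data[s]) > threshold
                   for s in selected):
                continue
            kept.append(c)
        return kept
    ```

    In a decision tree learner, `filter_candidates` would be applied at each node before the usual entropy-minimization split search, so that branching never reuses an attribute that is redundant with one already on the path.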
  • Keywords
    decision trees; entropy; heuristic programming; learning (artificial intelligence); minimisation; continuous-valued attribute; decision tree learning; information entropy minimization heuristic; mutual information entropy heuristic; Computer science; Decision trees; Information entropy; Information theory; Machine learning; Machine learning algorithms; Mathematics; Mutual information; Stochastic processes; Testing;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Machine Learning and Cybernetics, 2003 International Conference on
  • Print_ISBN
    0-7803-8131-9
  • Type
    conf
  • DOI
    10.1109/ICMLC.2003.1259731
  • Filename
    1259731