• DocumentCode
    301296
  • Title
    Fusing competing models to improve accuracy
  • Author
    Elder, John F.
  • Author_Institution
    Rice Univ., USA
  • Volume
    1
  • fYear
    1995
  • fDate
    22-25 Oct 1995
  • Firstpage
    149
  • Abstract
    Inductive modeling or “machine learning” algorithms are able to discover structure in high-dimensional data in a nearly automated fashion. These adaptive statistical methods, from decision trees and polynomial networks to projection pursuit models, additive networks, and cascade correlation neural networks, repeatedly search for, and add on, the model component judged best at that stage. Because of the huge space of possible model components, the choice is typically greedy. In fact, it is usual for the analyst and algorithm to be greedy at three levels, when choosing: 1) a term within a model, 2) a model within a family (class of method), and 3) a family within a collection of techniques. It is better at each stage, we argue, to “take a longer view”: 1) consider terms in larger sets, 2) merge competing models within a family, and 3) fuse information from disparate models, making the combination more robust. Example benefits of fusion are demonstrated on a challenging classification dataset, where one must infer the species of a bat from its chirps.
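    The fusion idea in the abstract can be sketched minimally: rather than committing to the single (greedily chosen) best model, average the outputs of several competing models. The three toy "models" below are hypothetical stand-ins for the diverse learners the abstract names (trees, polynomial networks, neural networks), not the paper's actual method.

    ```python
    # Hypothetical competing classifiers, each returning an
    # estimated probability that input x belongs to class 1.
    def model_a(x):
        return 0.9 if x > 0.5 else 0.2

    def model_b(x):
        return 0.7 if x > 0.4 else 0.3

    def model_c(x):
        return 0.6 if x > 0.6 else 0.1

    def fused(x, models=(model_a, model_b, model_c)):
        # Fuse by simple averaging: the combination is less
        # sensitive to any one model's idiosyncratic errors
        # than trusting a single greedily selected model.
        probs = [m(x) for m in models]
        return sum(probs) / len(probs)

    print(fused(0.8))  # average of 0.9, 0.7, 0.6
    ```

    Averaging is only one fusion rule; weighted combinations or voting are common alternatives, and the paper argues such combining at every level of the search.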
  • Keywords
    Algorithm design and analysis; Chirp; Decision trees; Fuses; Machine learning algorithms; Neural networks; Polynomials; Protection; Robustness; Training data;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    IEEE International Conference on Systems, Man and Cybernetics, 1995: Intelligent Systems for the 21st Century
  • Conference_Location
    Vancouver, BC
  • Print_ISBN
    0-7803-2559-1
  • Type
    conf
  • DOI
    10.1109/ICSMC.1995.537749
  • Filename
    537749