• DocumentCode
    3545176
  • Title
    Self-organizing neural grove: effective multiple classifier system with pruned self-generating neural trees
  • Author
    Inoue, Hirotaka ; Narihisa, Hiroyuki
  • Author_Institution
    Dept. of Electr. Eng. & Inf. Sci., Kure Nat. Coll. of Technol., Hiroshima, Japan
  • fYear
    2005
  • fDate
    23-26 May 2005
  • Firstpage
    2502
  • Abstract
    Multiple classifier systems (MCS) have become popular during the last decade. The self-generating neural tree (SGNT) is a suitable base classifier for an MCS because of its simple setup and fast learning. However, the computation cost of the MCS increases in proportion to the number of SGNTs. In an earlier paper, we proposed a pruning method for the structure of the SGNT in the MCS to reduce the computation cost. In this paper, we propose a novel pruning method for effective processing and call this model the self-organizing neural grove (SONG). The pruning method combines an on-line pruning method and an off-line pruning method. We implement the SONG with two sampling methods. Experiments have been conducted to compare the SONG with an unpruned MCS based on the SGNT, an MCS based on C4.5, and the k-nearest neighbor method. The results show that the SONG improves classification accuracy while reducing the computation cost.
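    Illustrative sketch (not the authors' exact SGNT/SONG algorithm): under assumed simplifications, the Python code below shows the general idea the abstract describes, namely base classifiers grown incrementally as nearest-prototype structures, trained on bootstrap samples (bagging), pruned against training accuracy, and combined by majority vote. The names SimpleSGNT, prune, bagging_ensemble, and ensemble_predict are hypothetical, and labels are assumed to be non-negative integers.

    import numpy as np

    class SimpleSGNT:
        """Incrementally grown nearest-prototype classifier (SGNT-like sketch)."""
        def __init__(self, merge_radius=0.5):
            self.merge_radius = merge_radius
            self.prototypes = []  # list of [vector, label, count]

        def fit(self, X, y):
            for x, label in zip(X, y):
                if not self.prototypes:
                    self.prototypes.append([np.array(x, dtype=float), label, 1])
                    continue
                dists = [np.linalg.norm(x - p[0]) for p in self.prototypes]
                j = int(np.argmin(dists))
                if dists[j] < self.merge_radius and self.prototypes[j][1] == label:
                    # merge: move the winning prototype toward the new example
                    p = self.prototypes[j]
                    p[0] += (x - p[0]) / (p[2] + 1)
                    p[2] += 1
                else:
                    # generate a new node for this example
                    self.prototypes.append([np.array(x, dtype=float), label, 1])
            return self

        def predict(self, X):
            out = []
            for x in X:
                dists = [np.linalg.norm(x - p[0]) for p in self.prototypes]
                out.append(self.prototypes[int(np.argmin(dists))][1])
            return np.array(out)

        def prune(self, X, y):
            """Off-line-style pruning: drop prototypes whose removal keeps training accuracy."""
            base_acc = np.mean(self.predict(X) == y)
            for i in range(len(self.prototypes) - 1, -1, -1):
                if len(self.prototypes) == 1:
                    break
                removed = self.prototypes.pop(i)
                if np.mean(self.predict(X) == y) < base_acc:
                    self.prototypes.insert(i, removed)  # removal hurt; restore
            return self

    def bagging_ensemble(X, y, n_trees=10, seed=None):
        """Train pruned SGNT-like classifiers on bootstrap samples of (X, y)."""
        rng = np.random.default_rng(seed)
        trees = []
        for _ in range(n_trees):
            idx = rng.integers(0, len(X), size=len(X))
            trees.append(SimpleSGNT().fit(X[idx], y[idx]).prune(X[idx], y[idx]))
        return trees

    def ensemble_predict(trees, X):
        """Majority vote over the base classifiers (labels must be non-negative ints)."""
        votes = np.stack([t.predict(X) for t in trees])
        return np.array([np.bincount(col).argmax() for col in votes.T])

    Usage: with X as a NumPy feature matrix and y as integer class labels, trees = bagging_ensemble(X, y, n_trees=25) followed by ensemble_predict(trees, X_test) gives a bagged, pruned prediction; the pruning step stands in for the computation-cost reduction the abstract reports.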
  • Keywords
    pattern classification; self-organising feature maps; unsupervised learning; MCS; SGNT structure pruning method; SONG classification accuracy; competitive learning; multiple classifier system; off-line pruning; on-line pruning; pruned self-generating neural trees; self-organizing maps; self-organizing neural grove; Bagging; Boosting; Classification tree analysis; Computational efficiency; Data mining; Educational institutions; Information science; Neural networks; Sampling methods; Training data;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Circuits and Systems, 2005. ISCAS 2005. IEEE International Symposium on
  • Print_ISBN
    0-7803-8834-8
  • Type
    conf
  • DOI
    10.1109/ISCAS.2005.1465134
  • Filename
    1465134