Title : 
Incremental learning using self-organizing neural grove
            Author : 
Inoue, H.; Narihisa, H.
            Author_Institution : 
Kure Nat. Coll. of Technol., Japan
            Abstract : 
Summary form only given. Multiple classifier systems (MCS) have become popular during the last decade. The self-generating neural tree (SGNT) is a suitable base classifier for MCS because of its simple configuration and fast learning. In an earlier paper, we proposed a pruning method for the SGNT structure in the MCS to reduce the computational cost, and we called the resulting model the self-organizing neural grove (SONG). In this paper, we investigate the performance of incremental learning with SONG on a large-scale classification problem. The results show that SONG improves classification accuracy while also reducing the computational cost in incremental learning.
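The SGNT family grows a tree of prototype nodes by routing each training example down to its nearest node and attaching a new leaf there, which is what makes incremental learning cheap. A minimal sketch of this style of tree growth follows; the node structure, update rule, and function names are our illustration of the general idea, not the authors' exact algorithm:

```python
import math

class Node:
    """A prototype node in a self-generating tree (illustrative)."""
    def __init__(self, w, label=None):
        self.w = list(w)       # prototype (weight) vector
        self.label = label     # class label carried by the example
        self.children = []
        self.n = 1             # examples absorbed, for running average

def insert(root, x, label):
    """Greedy descent: follow the nearest child while it is closer than
    the current node, update prototypes along the path by running
    average, then attach x as a new leaf (simplified SGNT-style growth)."""
    node = root
    while True:
        node.n += 1
        node.w = [wi + (xi - wi) / node.n for wi, xi in zip(node.w, x)]
        if not node.children:
            break
        best = min(node.children, key=lambda c: math.dist(c.w, x))
        if math.dist(best.w, x) < math.dist(node.w, x):
            node = best
        else:
            break
    node.children.append(Node(x, label))

def classify(root, x):
    """Descend to the nearest leaf and return its label."""
    node = root
    while node.children:
        node = min(node.children, key=lambda c: math.dist(c.w, x))
    return node.label
```

Because each new example only touches one root-to-leaf path, adding data does not require retraining from scratch; the SONG pruning step described in the abstract would then remove redundant nodes from such trees to cut memory and classification cost.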
            Keywords : 
classification; learning (artificial intelligence); neural nets; self-adjusting systems; MCS; SGNT structure pruning method; SONG; classification accuracy; incremental learning; large-scale classification; multiple classifier systems; self-generating neural tree; self-organizing neural grove; computational efficiency; large-scale systems
            Conference_Title : 
Nonlinear Signal and Image Processing, 2005. NSIP 2005. Abstracts. IEEE-Eurasip
            Conference_Location : 
Sapporo
            Print_ISBN : 
0-7803-9064-4
            DOI : 
10.1109/NSIP.2005.1502290