Title : 
Parallel and distributed mining with ensemble self-generating neural networks
         
        
            Author : 
Inoue, Hirotaka ; Narihisa, Hiroyuki
         
        
            Author_Institution : 
Graduate Sch. of Eng., Okayama Univ. of Sci., Japan
         
        
        
        
        
        
            Abstract : 
In this paper, we examine the improvement in classification accuracy and the parallel efficiency of ensemble self-generating neural networks (ESGNNs) on a MIMD parallel computer. Self-generating neural networks (SGNNs) were originally proposed for classification and clustering; they automatically construct a self-generating neural tree (SGNT) from the given training data. An ESGNN is composed of multiple SGNTs, each generated independently by shuffling the order of the given training data, and its output is the average of the outputs of the individual SGNTs. We allocate each SGNT to a separate processor of the MIMD parallel computer. Experimental results show that, for all problems, the misclassification rate decreases as the number of processors increases.
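
The ensemble construction described in the abstract (independently shuffled training orders, one member per processor, averaged member outputs) can be sketched as follows. This is a hypothetical illustration only: the SimplePrototypeLearner class, its learning-rate parameter, and the multiprocessing-based worker allocation are assumptions made for the sketch; they stand in for the authors' SGNT algorithm and their MIMD implementation rather than reproducing them.

# Hypothetical sketch of the ensemble idea: several order-sensitive base
# learners, each trained on an independently shuffled copy of the same
# training data in its own worker process, with predictions averaged.
# SimplePrototypeLearner is a stand-in and is NOT the authors' SGNT.
from multiprocessing import Pool
import random

class SimplePrototypeLearner:
    """Order-sensitive stand-in learner: one exponentially-updated prototype per class."""
    def __init__(self, lr=0.5):
        self.lr = lr
        self.protos = {}  # class label -> prototype vector

    def fit(self, examples):
        for x, y in examples:  # presentation order affects the final prototypes
            proto = self.protos.get(y)
            if proto is None:
                self.protos[y] = list(x)
            else:
                self.protos[y] = [p + self.lr * (xi - p) for p, xi in zip(proto, x)]
        return self

    def predict_scores(self, x):
        """Score each class by the negative squared distance to its prototype."""
        return {y: -sum((p - xi) ** 2 for p, xi in zip(proto, x))
                for y, proto in self.protos.items()}

def train_member(args):
    """Train one ensemble member on an independently shuffled copy of the data."""
    seed, examples = args
    rng = random.Random(seed)
    shuffled = list(examples)
    rng.shuffle(shuffled)  # each member sees a different training order
    return SimplePrototypeLearner().fit(shuffled)

def ensemble_predict(members, x):
    """Average the members' class scores and return the highest-scoring class."""
    totals = {}
    for m in members:
        for y, s in m.predict_scores(x).items():
            totals[y] = totals.get(y, 0.0) + s
    return max(totals, key=lambda y: totals[y] / len(members))

if __name__ == "__main__":
    # Toy two-class data; in the paper each member is an SGNT and each
    # worker process corresponds to one processor of the MIMD machine.
    data = [([0.1, 0.2], 0), ([0.0, 0.3], 0), ([0.9, 0.8], 1), ([1.0, 0.7], 1)]
    n_members = 4  # one ensemble member per worker process
    with Pool(processes=n_members) as pool:
        members = pool.map(train_member, [(seed, data) for seed in range(n_members)])
    print(ensemble_predict(members, [0.05, 0.25]))  # expected class: 0
    print(ensemble_predict(members, [0.95, 0.75]))  # expected class: 1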
         
        
            Keywords : 
data mining; parallel processing; self-organising feature maps; MIMD parallel computer; distributed mining; ensemble self-generating neural networks; parallel mining; self-generating neural tree; training data; Backpropagation algorithms; Bagging; Classification tree analysis; Computer networks; Concurrent computing; Convergence; Data engineering; Neural networks; Neurons; Training data;
         
        
        
        
            Conference_Title : 
Proceedings of the Eighth International Conference on Parallel and Distributed Systems (ICPADS 2001)
         
        
            Conference_Location : 
Kyongju City, South Korea
         
        
        
            Print_ISBN : 
0-7695-1153-8
         
        
        
            DOI : 
10.1109/ICPADS.2001.934849