DocumentCode
445941
Title
Self-organizing neural grove and its applications
Author
Inoue, Hirotaka; Narihisa, Hiroyuki
Author_Institution
Dept. of Electr. Eng. & Information Sci., Kure Coll. of Technol., Hiroshima, Japan
Volume
2
fYear
2005
fDate
31 July-4 Aug. 2005
Firstpage
1205
Abstract
Recently, multiple classifier systems (MCS) have been used in practical applications to improve classification accuracy. Self-generating neural networks (SGNN) are suitable base classifiers for MCS because of their simple parameter setting and fast learning. However, the computation cost of an MCS increases in proportion to the number of SGNNs. In this paper, we propose a novel pruning method for efficient classification and call the resulting model the self-organizing neural grove (SONG). Experiments were conducted to compare the pruned MCS with an unpruned MCS, an MCS based on C4.5, and the k-nearest neighbor method. The results show that the pruned MCS improves classification accuracy while reducing the computation cost.
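Illustrative sketch (not from the paper): the abstract describes an ensemble of self-generating neural networks that is pruned to cut computation cost, but it does not give the SGNN construction or the pruning rule. The Python below is a minimal, hedged sketch of that general idea only; it substitutes a hypothetical nearest-prototype classifier for the authors' SGNN base learner, and prunes ensemble members by validation accuracy as one simple stand-in criterion. All class and parameter names here are assumptions for illustration.

# Minimal sketch of a bagged multiple classifier system with pruning.
# The base learner is a placeholder nearest-prototype classifier, NOT the
# paper's SGNN; labels are assumed to be small nonnegative integers.
import numpy as np

class NearestPrototype:
    """Placeholder base classifier: one prototype (class mean) per class."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.protos_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # distance from each sample to each class prototype, pick nearest
        d = np.linalg.norm(X[:, None, :] - self.protos_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

class PrunedEnsemble:
    """Bagged ensemble; members below an accuracy threshold on a validation
    split are discarded, reducing prediction cost (the abstract's main point)."""
    def __init__(self, n_members=25, prune_threshold=0.6, seed=0):
        self.n_members = n_members
        self.prune_threshold = prune_threshold
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y, X_val, y_val):
        members = []
        for _ in range(self.n_members):
            idx = self.rng.integers(0, len(X), len(X))   # bootstrap sample
            members.append(NearestPrototype().fit(X[idx], y[idx]))
        # keep only members whose validation accuracy clears the threshold
        self.members_ = [m for m in members
                         if (m.predict(X_val) == y_val).mean() >= self.prune_threshold]
        if not self.members_:                            # fall back to the full ensemble
            self.members_ = members
        return self

    def predict(self, X):
        votes = np.stack([m.predict(X) for m in self.members_])
        # majority vote across the surviving members
        return np.array([np.bincount(col).argmax() for col in votes.T])

Validation-accuracy pruning is only a proxy here; the paper's actual SONG pruning criterion is not reproduced in the abstract, so this sketch shows the cost/accuracy trade-off mechanism rather than the authors' method.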
Keywords
neural nets; pattern classification; classification accuracy; multiple classifier systems; pruning method; self-generating neural networks; self-organizing neural grove; Backpropagation; Bagging; Boosting; Classification tree analysis; Computational efficiency; Data mining; Educational institutions; Electronic mail; Information science; Neural networks;
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN
0-7803-9048-2
Type
conf
DOI
10.1109/IJCNN.2005.1556025
Filename
1556025