DocumentCode :
2252191
Title :
A growing architecture selection for Multilayer Perceptron Neural Network by the L-GEM
Author :
Li, Jin-cheng ; Ng, Wing W Y ; Chan, Patrick P K ; Yeung, Daniel S.
Author_Institution :
Machine Learning & Cybern. Res. Center, South China Univ. of Technol., Guangzhou, China
Volume :
3
fYear :
2010
fDate :
11-14 July 2010
Firstpage :
1402
Lastpage :
1407
Abstract :
The number of hidden neurons has a great influence on the generalization capability of a Multilayer Perceptron Neural Network (MLPNN). The ultimate goal of building an MLPNN is to recognize (or generalize) future unseen samples correctly based on the training samples. Therefore, the Localized Generalization Error Model (L-GEM) is adopted in this work to select the architecture of an MLPNN. The L-GEM has been successfully applied to Radial Basis Function Neural Network (RBFNN) architecture selection, feature selection, and other applications. In this work, we propose a new L-GEM for MLPNNs and demonstrate its application in architecture selection. Experimental results show that the L-GEM based MLPNN architecture selection method outperforms several off-the-shelf methods.
Keywords :
multilayer perceptrons; radial basis function networks; L-GEM; architecture selection; hidden neurons; localized generalization error model; multilayer perceptron; radial basis function neural network; Accuracy; Complexity theory; Computer architecture; Neurons; Sensitivity; Testing; Training; Architecture Selection; L-GEM; MultiLayer Perceptrons Neural Network (MLPNN); Quasi-Monte Carlo;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Machine Learning and Cybernetics (ICMLC), 2010 International Conference on
Conference_Location :
Qingdao
Print_ISBN :
978-1-4244-6526-2
Type :
conf
DOI :
10.1109/ICMLC.2010.5580850
Filename :
5580850
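The selection scheme described in the abstract, picking the number of hidden neurons that minimizes a localized generalization error bound rather than the training error alone, can be sketched as follows. This is only an illustrative sketch, not the paper's method: the toy data, the one-hidden-layer MLP, and `lgem_proxy` (empirical error plus a Monte Carlo perturbation-sensitivity term, standing in for the actual L-GEM bound) are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (assumption: stands in for real benchmarks).
X = rng.normal(size=(200, 4))
y = (X[:, 0] * X[:, 1] + X[:, 2] > 0).astype(float)

def train_mlp(X, y, n_hidden, epochs=300, lr=0.5):
    """Train a one-hidden-layer sigmoid MLP with plain gradient descent."""
    n_in = X.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=n_hidden)
    b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        H = sig(X @ W1 + b1)          # hidden activations
        p = sig(H @ W2 + b2)          # network output
        # Backpropagated deltas for the squared-error loss.
        d2 = (p - y) * p * (1 - p) / len(y)
        d1 = np.outer(d2, W2) * H * (1 - H)
        W2 -= lr * (H.T @ d2)
        b2 -= lr * d2.sum()
        W1 -= lr * (X.T @ d1)
        b1 -= lr * d1.sum(axis=0)
    return lambda Z: sig(sig(Z @ W1 + b1) @ W2 + b2)

def lgem_proxy(predict, X, y, q=0.1, n_mc=50):
    """Crude stand-in for the L-GEM bound: empirical error plus a stochastic
    sensitivity term, estimated by Monte Carlo perturbations within a q-ball
    around each training sample (the Quasi-Monte Carlo idea in the keywords)."""
    p = predict(X)
    r_emp = np.mean((p - y) ** 2)
    sens = np.mean([
        np.mean((predict(X + rng.uniform(-q, q, size=X.shape)) - p) ** 2)
        for _ in range(n_mc)
    ])
    return (np.sqrt(r_emp) + np.sqrt(sens)) ** 2

# Architecture selection: train each candidate width, keep the one with the
# smallest generalization-error proxy (not the smallest training error).
best = min(
    ((h, lgem_proxy(train_mlp(X, y, h), X, y)) for h in (2, 4, 8, 16, 32)),
    key=lambda t: t[1],
)
print("selected hidden neurons:", best[0])
```

A plain training-error criterion would tend to favor the widest network; the sensitivity term penalizes architectures whose outputs change sharply under small input perturbations, which is the intuition behind trading accuracy against localized generalization error.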