DocumentCode :
2712959
Title :
Performance of generalized multi-layered perceptrons and layered arbitrarily connected networks trained using the Levenberg-Marquardt method
Author :
Russell, Steele A. ; Maida, Anthony S.
Author_Institution :
Dept. of Comput. Sci. & Ind. Technol., Southeastern Louisiana Univ., Hammond, LA, USA
fYear :
2009
fDate :
14-19 June 2009
Firstpage :
2725
Lastpage :
2731
Abstract :
The generalized multilayer perceptron (gMLP) augments the connections in the multilayer perceptron (MLP) architecture to include all possible non-recurrent connections. The layered arbitrarily connected network (lACN) has connections from input nodes to output nodes in addition to the connections included in an MLP. In this paper the performance of MLP, lACN, and gMLP networks trained using the Levenberg-Marquardt method is compared. A number of different function approximation tasks were examined. The effects of varying the number of hidden-layer neurons, the error termination condition, and the training set size were also evaluated. The results presented here represent preliminary findings; in particular, additional testing on benchmark real data sets is needed.
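The Levenberg-Marquardt method named in the abstract interpolates between Gauss-Newton and gradient descent via a damping factor. A minimal sketch of the update, delta = (J^T J + lambda I)^(-1) J^T r, on a one-parameter curve-fitting task (the model, data, and damping schedule below are illustrative assumptions; the paper applies LM to the full weight vectors of MLP, lACN, and gMLP networks):

```python
import math

def levenberg_marquardt(xs, ys, a0=0.0, lam=1e-2, iters=50):
    """Fit y = exp(a * x) to (xs, ys) with the scalar LM update
    delta = (J^T J + lambda)^(-1) J^T r."""
    a = a0
    for _ in range(iters):
        r = [y - math.exp(a * x) for x, y in zip(xs, ys)]  # residuals
        J = [x * math.exp(a * x) for x in xs]              # d f / d a
        jtj = sum(j * j for j in J)
        jtr = sum(j * e for j, e in zip(J, r))
        delta = jtr / (jtj + lam)                          # damped Gauss-Newton step
        new_a = a + delta
        old_err = sum(e * e for e in r)
        new_err = sum((y - math.exp(new_a * x)) ** 2 for x, y in zip(xs, ys))
        if new_err < old_err:   # accept: shrink damping (more Gauss-Newton-like)
            a, lam = new_a, lam * 0.5
        else:                   # reject: grow damping (more gradient-descent-like)
            lam *= 2.0
    return a

xs = [0.1 * i for i in range(10)]
ys = [math.exp(0.5 * x) for x in xs]  # ground truth a = 0.5
print(round(levenberg_marquardt(xs, ys), 3))  # converges to 0.5
```

The error-based accept/reject step is what distinguishes LM from plain Gauss-Newton: the damping factor lambda is adapted each iteration depending on whether the proposed step reduced the sum-of-squares error.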
Keywords :
function approximation; multilayer perceptrons; Levenberg-Marquardt method; error termination condition; generalized multilayered perceptron; hidden layer neuron; layered arbitrarily connected network; training set size; Application software; Benchmark testing; Computational modeling; Computer architecture; Function approximation; Hardware; Multi-layer neural network; Multilayer perceptrons; Neural networks; Neurons;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2009. IJCNN 2009. International Joint Conference on
Conference_Location :
Atlanta, GA
ISSN :
1098-7576
Print_ISBN :
978-1-4244-3548-7
Electronic_ISBN :
1098-7576
Type :
conf
DOI :
10.1109/IJCNN.2009.5178974
Filename :
5178974
Link To Document :