Title :
Performance of generalized multi-layered perceptrons and layered arbitrarily connected networks trained using the Levenberg-Marquardt method
Author :
Russell, Steele A. ; Maida, Anthony S.
Author_Institution :
Dept. of Comput. Sci. & Ind. Technol., Southeastern Louisiana Univ., Hammond, LA, USA
Abstract :
The generalized multilayer perceptron (gMLP) augments the connections of the multilayer perceptron (MLP) architecture to include all possible non-recurrent connections. The layered arbitrarily connected network (lACN) adds connections from input nodes to output nodes to those of an MLP. In this paper, the performance of MLP, lACN, and gMLP networks trained using the Levenberg-Marquardt method is compared on a number of function approximation tasks. The effects of varying the number of hidden-layer neurons, the error termination condition, and the training-set size were also evaluated. The results presented here are preliminary findings; in particular, additional testing on benchmark real data sets is needed.
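The three topologies named in the abstract, and the Levenberg-Marquardt update rule used to train them, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: layer sizes, the tanh activation, and the function names are assumptions, and the gMLP is modeled as a cascade network in which each layer sees the concatenation of the input and all earlier layer activations (one common reading of "all possible non-recurrent connections").

```python
import numpy as np

def mlp_forward(x, W1, W2):
    # Standard MLP: input -> hidden (tanh) -> linear output.
    h = np.tanh(W1 @ x)
    return W2 @ h

def lacn_forward(x, W1, W2, W_io):
    # lACN: an MLP plus direct input-to-output connections (W_io).
    h = np.tanh(W1 @ x)
    return W2 @ h + W_io @ x

def gmlp_forward(x, weights):
    # gMLP (cascade reading): each layer receives the concatenation of
    # the input and every earlier layer's activations, so all
    # non-recurrent (strictly forward) connections are present.
    acts = [x]
    for i, W in enumerate(weights):
        z = W @ np.concatenate(acts)
        # Hidden layers use tanh; the final layer is linear.
        acts.append(np.tanh(z) if i < len(weights) - 1 else z)
    return acts[-1]

def lm_step(J, e, w, mu):
    # One Levenberg-Marquardt update on the weight vector w:
    #   w <- w - (J^T J + mu I)^{-1} J^T e
    # where J is the Jacobian of the residuals e and mu is the
    # damping parameter (large mu ~ gradient descent, small mu ~
    # Gauss-Newton).
    A = J.T @ J + mu * np.eye(J.shape[1])
    return w - np.linalg.solve(A, J.T @ e)
```

Note that with the skip-connection weights `W_io` set to zero, the lACN reduces exactly to the MLP, which is why the paper can treat the MLP as a special case when comparing the three architectures.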
Keywords :
function approximation; multilayer perceptrons; Levenberg-Marquardt method; error termination condition; generalized multilayered perceptron; hidden layer neuron; layered arbitrarily connected network; training set size; Application software; Benchmark testing; Computational modeling; Computer architecture; Function approximation; Hardware; Multi-layer neural network; Multilayer perceptrons; Neural networks; Neurons;
Conference_Titel :
2009 International Joint Conference on Neural Networks (IJCNN 2009)
Conference_Location :
Atlanta, GA
Print_ISBN :
978-1-4244-3548-7
Electronic_ISBN :
1098-7576
DOI :
10.1109/IJCNN.2009.5178974