DocumentCode :
3250032
Title :
Classification with neural networks: a performance analysis
Author :
Hush, D.R.
Author_Institution :
Dept. of Electr. Eng. & Comput. Eng., New Mexico Univ., Albuquerque, NM, USA
fYear :
1989
fDate :
1989
Firstpage :
277
Lastpage :
280
Abstract :
A performance analysis is presented for the most popular neural network classifier, the multilayer perceptron (MLP). The analysis is performed for a specific class of pattern recognition problems called one-class classifier problems. The criteria used to measure performance are classification error, computational complexity (measured in terms of network size), sensitivity to network size selection, and the number of training samples required. With regard to network size, it is shown that networks with one hidden layer perform better than those with two hidden layers. Further, a lower bound on the number of nodes in the hidden layer is derived and found to be d+1, where d is the dimension of the data patterns. The optimal number of nodes is shown to be somewhat larger than this (approximately 3d). In addition, network performance is shown to be relatively insensitive to overspecification of the network size. Finally, it is shown that for near-optimal performance the number of training samples should be approximately 60d(d+1).
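The sizing results quoted in the abstract reduce to simple formulas in the pattern dimension d. A minimal sketch of those heuristics follows, assuming a single-hidden-layer MLP as the abstract recommends; the function name and the example dimension are illustrative and not from the paper.

    def mlp_sizing_heuristics(d):
        """Sizing guidance for a one-hidden-layer MLP one-class classifier
        on d-dimensional patterns, per the abstract's reported results."""
        min_hidden = d + 1                 # derived lower bound on hidden nodes
        optimal_hidden = 3 * d             # approximate optimum reported
        train_samples = 60 * d * (d + 1)   # samples needed for near-optimal performance
        return min_hidden, optimal_hidden, train_samples

    # Example: 8-dimensional patterns
    print(mlp_sizing_heuristics(8))  # -> (9, 24, 4320)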
Keywords :
computational complexity; learning systems; neural nets; pattern recognition; performance evaluation; classification error; computational complexity; hidden layers; multilayer perceptron; network size; neural network classifier; pattern recognition; performance analysis; sensitivity; training samples; Complexity theory; Learning systems; Neural networks; Pattern recognition;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IEEE International Conference on Systems Engineering, 1989
Conference_Location :
Fairborn, OH, USA
Type :
conf
DOI :
10.1109/ICSYSE.1989.48672
Filename :
48672