Title :
Techniques for synthesizing piecewise linear and quadratic neural network classifiers
Author_Institution :
PAR Government Systems Corporation, New Hartford, NY, USA
Abstract :
Summary form only given, as follows. Neural network (NN) classifiers have been applied to numerous practical problems of interest. A very common type of NN classifier is the multilayer perceptron, trained with backpropagation. Although this learning procedure has been used successfully in many applications, it has several drawbacks, including susceptibility to local minima and excessive convergence times. The author presents two alternatives to backpropagation for synthesizing NN classifiers. Both procedures generate appropriate network structures and weights in a fast and efficient manner without any gradient descent. The resulting decision rules are optimal under certain conditions; the weights obtained via these procedures can be used "as is" or as a starting point for backpropagation.
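Note: the abstract does not spell out the synthesis procedures themselves. As a rough illustration of the general idea only (constructing a piecewise linear decision rule directly and reusing its weights as an NN initialization, without gradient descent), the sketch below builds hidden-unit hyperplanes from pairwise class means. The nearest-mean construction and all function names are assumptions for illustration, not the paper's method.

import numpy as np

def synthesize_piecewise_linear_classifier(X, y):
    """Hypothetical sketch: one hidden unit per class pair, its weight vector being
    the hyperplane that bisects the segment between the two class means.
    X : (n_samples, n_features) training data; y : (n_samples,) integer labels.
    Returns (W, b, pairs); W and b could seed the first layer of an MLP."""
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    W, b, pairs = [], [], []
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            w = means[ci] - means[cj]          # normal to the separating hyperplane
            m = 0.5 * (means[ci] + means[cj])  # hyperplane passes through the midpoint
            W.append(w)
            b.append(-w @ m)
            pairs.append((ci, cj))
    return np.array(W), np.array(b), pairs

def predict(X, W, b, pairs, classes):
    """Piecewise linear decision rule: vote over the pairwise hyperplane outputs."""
    votes = np.zeros((X.shape[0], len(classes)), dtype=int)
    side = (X @ W.T + b) > 0                   # which side of each hyperplane
    index = {c: k for k, c in enumerate(classes)}
    for u, (ci, cj) in enumerate(pairs):
        votes[side[:, u], index[ci]] += 1
        votes[~side[:, u], index[cj]] += 1
    return classes[votes.argmax(axis=1)]

The weights returned this way can be evaluated "as is" or handed to a gradient-based trainer as a starting point, which is the usage pattern the abstract describes; the specific construction above is only a stand-in.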
Keywords :
learning systems; neural nets; pattern recognition; backpropagation; decision rules; multilayer perceptron; piecewise linear neural network classifiers; quadratic neural network classifiers; synthesis
Conference_Titel :
1989 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Washington, DC, USA
DOI :
10.1109/IJCNN.1989.118510