Title :
Training multilayer perceptron classifiers based on a modified support vector method
Author :
Suykens, J.A.K. ; Vandewalle, J.
Author_Institution :
Dept. of Electr. Eng., Katholieke Univ., Leuven, Belgium
Date :
7/1/1999
Abstract :
In this paper we describe a training method for a one-hidden-layer multilayer perceptron classifier based on the idea of support vector machines (SVMs). An upper bound on the Vapnik-Chervonenkis (VC) dimension is iteratively minimized over the interconnection matrix of the hidden layer and its bias vector. The output weights are determined according to the support vector method, but without making use of the classifier form related to Mercer's condition. The method is illustrated on a two-spiral classification problem.
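The pipeline in the abstract has three parts: a hidden layer parameterized by an interconnection matrix and a bias vector, output weights found by a margin-maximizing (support vector) step, and the two-spiral benchmark. Below is a minimal sketch in Python with NumPy only. It is not the paper's algorithm: the iterative VC-bound minimization over the hidden-layer parameters is paper-specific and is replaced here by a fixed random hidden layer, while the output weights are trained as a primal linear SVM (hinge loss plus L2 penalty) on the hidden activations. All names (`V`, `b`, `H`, `w`) and the data-generation details are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-spiral toy data (the benchmark problem mentioned in the abstract).
n = 200
t = np.linspace(0.5, 3 * np.pi, n)
spiral = np.c_[t * np.cos(t), t * np.sin(t)]
X = np.vstack([spiral, -spiral])          # second spiral is the mirror image
y = np.r_[np.ones(n), -np.ones(n)]        # labels in {-1, +1}

# One hidden layer: interconnection matrix V and bias vector b.
# (The paper iteratively tunes V and b to shrink a VC-dimension upper
# bound; here they are simply drawn at random as a stand-in.)
h = 40
V = rng.normal(size=(2, h))
b = rng.normal(size=h)
H = np.tanh(X @ V + b)                    # hidden-layer activations, shape (400, 40)

# Output weights by a margin-maximizing linear SVM in the primal:
# minimize lam/2 * ||w||^2 + mean(hinge loss), via subgradient descent.
w = np.zeros(h)
w0 = 0.0
lam, lr = 1e-3, 0.1
for epoch in range(500):
    margins = y * (H @ w + w0)
    active = margins < 1.0                # points inside or violating the margin
    grad_w = lam * w - (H[active].T @ y[active]) / len(y)
    grad_w0 = -y[active].sum() / len(y)
    w -= lr * grad_w
    w0 -= lr * grad_w0

train_acc = np.mean(np.sign(H @ w + w0) == y)
print(f"training accuracy: {train_acc:.3f}")
```

With a random hidden layer the two-spiral problem is only partially separable; in the paper, accuracy comes from the additional outer loop that adapts the hidden-layer parameters, which this sketch omits.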
Keywords :
iterative methods; learning (artificial intelligence); minimax techniques; multilayer perceptrons; pattern classification; SVM; VC dimension; Vapnik-Chervonenkis dimension; bias vector; interconnection matrix; iterative minimization; modified support vector method; one-hidden-layer multilayer perceptron classifier training; support vector machines; two-spiral classification problem; upper bound; Backpropagation; Kernel; Multilayer perceptrons; Quadratic programming; Radial basis function networks; Risk management; Support vector machine classification; Support vector machines; Upper bound
Journal_Title :
IEEE Transactions on Neural Networks