Title :
On the initialization and optimization of multilayer perceptrons
Author :
Weymaere, Nico; Martens, Jean-Pierre
Author_Institution :
Dept. of Electron. & Inf. Syst., Gent Univ., Belgium
Date :
9/1/1994
Abstract :
Multilayer perceptrons are now widely used for pattern recognition, although their training remains a time-consuming procedure that often converges toward a local optimum. Moreover, as the optimal network size and topology are usually unknown, searching for this optimum requires training many networks. In this paper, the authors propose a method for properly initializing the parameters (weights) of a two-layer perceptron, and for identifying (without the need for any error-backpropagation training) the most suitable network size and topology for solving the problem under investigation. The initialized network can then be optimized by means of the standard error-backpropagation (EBP) algorithm. The authors' method is applicable to any two-layer perceptron comprising concentric as well as squashing units on its hidden layer. The output units are restricted to squashing units, but direct connections from the input layer to the output layer are also accommodated. To illustrate the power of the method, results obtained for different classification tasks are compared to similar results obtained with a traditional error-backpropagation training starting from a random initial state.
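The abstract does not spell out the initialization procedure itself, so the sketch below only illustrates the network architecture it refers to: a two-layer perceptron with squashing (sigmoid) hidden and output units, direct input-to-output connections, and plain EBP (gradient descent on squared error) as the optimizer. This is a minimal sketch under those assumptions; the random initialization stands in for the authors' data-driven one, which is not reproduced here, and the class and parameter names (TwoLayerPerceptron, W1, W2, D) are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TwoLayerPerceptron:
    """Two-layer perceptron with sigmoid hidden and output units plus
    direct input-to-output connections, trained by standard EBP.
    Hypothetical illustration of the architecture named in the abstract;
    not the authors' initialization method."""

    def __init__(self, n_in, n_hidden, n_out, rng=None):
        rng = rng or np.random.default_rng(0)
        # Random initial state (the baseline the paper compares against);
        # the paper's contribution replaces this step with a data-driven one.
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in + 1))   # hidden weights (+bias)
        self.W2 = rng.normal(0.0, 0.5, (n_out, n_hidden + 1))  # output weights (+bias)
        self.D  = rng.normal(0.0, 0.5, (n_out, n_in))          # direct input->output weights

    def forward(self, x):
        h = sigmoid(self.W1 @ np.append(x, 1.0))
        # Output units see both the hidden layer and the raw inputs.
        y = sigmoid(self.W2 @ np.append(h, 1.0) + self.D @ x)
        return h, y

    def train_step(self, x, t, lr=0.1):
        h, y = self.forward(x)
        # Output delta for squared error through the sigmoid.
        dy = (y - t) * y * (1.0 - y)
        # Hidden delta backpropagated through W2 (bias column excluded).
        dh = (self.W2[:, :-1].T @ dy) * h * (1.0 - h)
        self.W2 -= lr * np.outer(dy, np.append(h, 1.0))
        self.D  -= lr * np.outer(dy, x)
        self.W1 -= lr * np.outer(dh, np.append(x, 1.0))
        return 0.5 * np.sum((y - t) ** 2)

# Usage: XOR, a classification task that needs the hidden layer.
net = TwoLayerPerceptron(n_in=2, n_hidden=2, n_out=1)
data = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]
for epoch in range(20000):
    for x, t in data:
        net.train_step(np.array(x, float), np.array(t, float), lr=0.5)
```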
Keywords :
feedforward neural nets; network topology; optimisation; pattern recognition; classification tasks; error-backpropagation training; initialization; multilayer perceptrons; optimization; optimum network size; topology; training; two-layer perceptron; Clustering algorithms; Feedforward neural networks; Feedforward systems; Information systems; Input variables; Multilayer perceptrons; Network topology; Neural networks; Pattern classification; Pattern recognition;
Journal_Title :
IEEE Transactions on Neural Networks