Title :
Linear-least-squares initialization of multilayer perceptrons through backpropagation of the desired response
Author :
Erdogmus, Deniz ; Fontenla-Romero, Oscar ; Principe, Jose C. ; Alonso-Betanzos, Amparo ; Castillo, Enrique
Author_Institution :
Dept. of Comput. Sci. & Eng., Oregon Health Sci. Univ., Portland, OR, USA
Date :
1 March 2005
Abstract :
Training multilayer neural networks is typically carried out using descent techniques such as gradient-based backpropagation (BP) of error or quasi-Newton approaches, including the Levenberg-Marquardt algorithm. This is mainly because no analytical method exists for finding the optimal weights, so iterative local or global optimization techniques are necessary. The success of iterative optimization procedures depends strongly on the initial conditions; therefore, in this paper, we devise a novel, principled method of backpropagating the desired response through the layers of a multilayer perceptron (MLP), which enables us to accurately initialize these neural networks in the minimum mean-square-error sense using the analytic linear least squares solution. The resulting solution can serve as an initial condition for standard iterative optimization algorithms. However, simulations demonstrate that, in most cases, the performance achieved through the proposed initialization scheme leaves little room for further improvement in the mean-square error (MSE) over the training set. In addition, the performance of a network optimized with the proposed approach also generalizes well to testing data. A rigorous derivation of the initialization algorithm is presented, and its high performance is verified on a number of benchmark training problems, including chaotic time-series prediction, classification, and nonlinear system identification with MLPs.
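Illustrative_Sketch :
The core idea of the abstract admits a compact sketch: pull the desired response backward through each layer's nonlinearity and weights, then fit each weight matrix by linear least squares. The Python/NumPy fragment below is a minimal, illustrative version for a single-hidden-layer tanh MLP with targets scaled into (-1, 1); the function name, the clipping constant, and the single backward/forward pass are our assumptions for exposition, not the authors' exact algorithm (which formulates the per-layer least-squares problems more carefully).

    import numpy as np

    def ls_init_two_layer(x, d, n_hidden, seed=0):
        """Hypothetical sketch: least-squares MLP initialization by
        backpropagating the desired response (illustrative only).
        x: (n_samples, n_in) inputs; d: (n_samples, n_out) targets in (-1, 1)."""
        rng = np.random.default_rng(seed)
        n_samples = x.shape[0]
        n_out = d.shape[1]
        x1 = np.hstack([x, np.ones((n_samples, 1))])      # inputs with bias column

        # Provisional random output-layer weights, (n_hidden + 1, n_out).
        W2 = rng.normal(scale=0.1, size=(n_hidden + 1, n_out))

        # Invert the output nonlinearity: desired pre-activations at the output.
        d2 = np.arctanh(np.clip(d, -0.999, 0.999))

        # Backpropagate the desired response to the hidden layer: find hidden
        # activations h with [h, 1] @ W2 ~= d2, in the least-squares sense.
        h_des, *_ = np.linalg.lstsq(W2[:-1].T, (d2 - W2[-1]).T, rcond=None)
        h_des = np.clip(h_des.T, -0.999, 0.999)

        # Invert the hidden nonlinearity and solve for first-layer weights.
        z_des = np.arctanh(h_des)                          # desired hidden pre-activations
        W1, *_ = np.linalg.lstsq(x1, z_des, rcond=None)

        # Refit the output layer on the actual hidden activations.
        h1 = np.hstack([np.tanh(x1 @ W1), np.ones((n_samples, 1))])
        W2, *_ = np.linalg.lstsq(h1, d2, rcond=None)
        return W1, W2

In use, W1, W2 = ls_init_two_layer(x, d, n_hidden=10) would supply the initial weights to a standard iterative trainer such as BP or Levenberg-Marquardt, matching the paper's intended role for the analytic solution.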
Keywords :
backpropagation (BP); backpropagation of desired response; gradient-based backpropagation; iterative methods; least squares methods; linear-least-squares initialization; approximate least-squares training of multilayer perceptrons (MLPs); minimum mean-square error; multilayer neural networks; multilayer perceptrons; neural network initialization; neural networks; optimization methods; global optimization; quasi-Newton methods; benchmark testing; chaos
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2004.841777