Title :
Objective functions for training new hidden units in constructive neural networks
Author :
Kwok, Tin-Yau ; Yeung, Dit-Yan
Author_Institution :
Dept. of Comput. Sci., Hong Kong Univ. of Sci. & Technol., Kowloon, Hong Kong
Date :
9/1/1997
Abstract :
In this paper, we study a number of objective functions for training new hidden units in constructive algorithms for multilayer feedforward networks. The aim is to derive a class of objective functions for which both the function evaluation and the corresponding weight updates can be computed in O(N) time, where N is the number of training patterns. Moreover, even though input weight freezing is applied during the process for computational efficiency, the convergence property of the constructive algorithms using these objective functions is still preserved. We also propose a few computational tricks that can be used to improve the optimization of these objective functions in practical situations. Their relative performance on a set of two-dimensional regression problems is also discussed.
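Illustration (not from the paper): the sketch below shows how a candidate hidden unit can be trained against a covariance-style objective of the kind discussed in the cascade-correlation literature, with the rest of the network frozen so that each epoch costs O(N). The objective S(w), the function name train_candidate_unit, and the learning-rate and epoch settings are illustrative assumptions, not the specific functions or parameters proposed in the article.

```python
import numpy as np

# Minimal sketch (assumed covariance-style objective, not the paper's exact family):
# maximise S(w) = [ sum_p (h_p - h_bar)(e_p - e_bar) ]^2 by gradient ascent,
# where h_p = tanh(w . x_p) is the candidate unit's output and e_p is the
# residual error of the frozen network on training pattern p.
def train_candidate_unit(X, residuals, lr=0.05, epochs=200, seed=None):
    """X: (N, d) inputs (include a bias column); residuals: (N,) frozen-network errors."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    e = residuals - residuals.mean()               # centre residuals once: O(N)
    for _ in range(epochs):
        h = np.tanh(X @ w)                         # candidate outputs: O(N d)
        c = np.dot(h - h.mean(), e)                # covariance between h and residuals
        grad = 2.0 * c * ((e * (1.0 - h ** 2)) @ X)  # dS/dw; terms in h_bar vanish since e sums to 0
        w += lr * grad                             # ascend to maximise S
    return w

# Hypothetical usage: X = np.hstack([inputs, np.ones((N, 1))])
# w_new = train_candidate_unit(X, targets - frozen_net_outputs)
```

Because only the candidate unit's input weights are adapted while all existing weights stay frozen, each pass over the data touches every pattern exactly once, which is what keeps the per-epoch cost linear in N.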
Keywords :
computational complexity; convergence of numerical methods; correlation methods; feedforward neural nets; learning (artificial intelligence); optimisation; 2D regression problems; cascade correlation; constructive neural networks; convergence; hidden units; input weight freezing; multilayer feedforward networks; objective functions; optimization; time complexity; weight updates; Backpropagation algorithms; Computational efficiency; Computer networks; Convergence; Feedforward neural networks; Intelligent networks; Multi-layer neural network; Neural networks; Pattern classification; Polynomials;
Journal_Title :
Neural Networks, IEEE Transactions on