Title :
Increased generalization through selective decay in a constructive cascade network
Author :
Treadgold, N.K. ; Gedeon, T.D.
Author_Institution :
Dept. of Inf. Eng., New South Wales Univ., Kensington, NSW, Australia
Abstract :
Determining the optimum amount of regularization to obtain the best generalization performance in feedforward neural networks is a difficult problem, and is a form of the bias-variance dilemma. This problem is addressed in the CasPer algorithm, a constructive cascade algorithm that uses weight decay. Previously, the amount of weight decay used by this algorithm was set by a parameter prior to training, often by trial and error. This is overcome through the use of a pool of neurons that are candidates for insertion into the network. Each neuron in the pool has an associated decay level, and the one that produces the best generalization on a validation set is added to the network. This not only removes the need for the user to select a decay value, but also results in better generalization compared to networks with fixed, user-optimized decay values.
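The following is a minimal sketch, not the authors' CasPer implementation, of the candidate-pool idea the abstract describes: each candidate neuron is trained with its own weight-decay level, and the candidate with the lowest validation error is selected. The names `DECAY_POOL` and `train_candidate`, the toy regression task, and the single tanh-unit candidate are illustrative assumptions, not details from the paper.

```python
# Sketch of selective decay via a candidate pool (assumed details, not CasPer itself).
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task split into training and validation sets.
X_train = rng.uniform(-1, 1, (80, 1))
y_train = np.sin(3 * X_train[:, 0]) + 0.1 * rng.normal(size=80)
X_val = rng.uniform(-1, 1, (40, 1))
y_val = np.sin(3 * X_val[:, 0])

DECAY_POOL = [0.0, 1e-4, 1e-3, 1e-2]   # one decay level per candidate neuron

def train_candidate(decay, epochs=500, lr=0.1):
    """Fit a single tanh unit plus linear output by gradient descent,
    penalizing the squared weights by `decay` (L2 weight decay)."""
    w = rng.normal(size=2)          # hidden unit: input weight and bias
    v = rng.normal(size=2)          # output weight and bias
    for _ in range(epochs):
        h = np.tanh(w[0] * X_train[:, 0] + w[1])
        pred = v[0] * h + v[1]
        err = pred - y_train
        # Gradients of mean squared error plus the decay penalty.
        g_v0 = 2 * np.mean(err * h) + 2 * decay * v[0]
        g_v1 = 2 * np.mean(err)
        dh = err * v[0] * (1 - h ** 2)
        g_w0 = 2 * np.mean(dh * X_train[:, 0]) + 2 * decay * w[0]
        g_w1 = 2 * np.mean(dh)
        w -= lr * np.array([g_w0, g_w1])
        v -= lr * np.array([g_v0, g_v1])
    return w, v

def validation_error(w, v):
    h = np.tanh(w[0] * X_val[:, 0] + w[1])
    return np.mean((v[0] * h + v[1] - y_val) ** 2)

# Train every candidate in the pool and keep the one that generalizes
# best on the validation set; its decay level needed no manual tuning.
candidates = [(decay, *train_candidate(decay)) for decay in DECAY_POOL]
best_decay, best_w, best_v = min(candidates,
                                 key=lambda c: validation_error(c[1], c[2]))
print(f"selected decay level: {best_decay}, "
      f"validation MSE: {validation_error(best_w, best_v):.4f}")
```

In a constructive cascade setting this selection step would be repeated each time a new hidden neuron is inserted, so the effective decay level can vary across the network as it grows.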
Keywords :
feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); CasPer algorithm; bias-variance dilemma; constructive cascade algorithm; feedforward neural networks; generalization; selective decay; weight decay; Australia; Computer science; Feedforward neural networks; Intelligent networks; Neural networks; Neurons; Poles and towers
Conference_Title :
1998 IEEE International Conference on Systems, Man, and Cybernetics
Conference_Location :
San Diego, CA
Print_ISBN :
0-7803-4778-1
DOI :
10.1109/ICSMC.1998.727553