Title :
An Information Gain Technique for Acceleration of Convergence of Artificial Neural Networks
Author :
Jearanaitanakij, Kietikul ; Pinngern, Ouen
Author_Institution :
Dept. of Comput. Eng., King Mongkut's Inst. of Technol., Bangkok
Abstract :
This paper presents an application of information gain to accelerate the convergence of artificial neural networks (ANNs). We improve Hagiwara's convergence acceleration algorithm by incorporating information gain into it. The first step of the proposed technique is to calculate the information gains of all features (or attributes) in the training data and propagate those gains to all hidden units in the next layer. During training, the algorithm monitors the sum-squared error at the output layer. When the variation of the sum-squared error becomes small, the worst hidden unit is detected. Next, all the weights connected to the worst hidden unit are reset to random values within appropriate ranges, which are determined by the propagated information gain of the worst hidden unit. Then the network is retrained. When the number of weight-resetting trials reaches a predefined limit, a new hidden unit is added to the network and the whole training process is repeated. Experimental results on standard benchmarks show a remarkable improvement in convergence time.
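A minimal Python/NumPy sketch of the two ingredients the abstract describes: computing the per-feature information gains and re-initialising the incoming weights of the worst hidden unit within gain-determined ranges. All names (feature_gains, reset_worst_hidden_unit, W_in) are illustrative rather than taken from the paper, and the exact mapping from propagated gain to weight range is an assumption, since the abstract only states that the range is determined by the gain.

import numpy as np

def entropy(labels):
    # Shannon entropy (in bits) of a discrete label array.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(feature, labels):
    # Information gain of a single discrete feature with respect to the class
    # labels: H(labels) minus the weighted entropy after splitting on the feature.
    gain = entropy(labels)
    for v in np.unique(feature):
        mask = feature == v
        gain -= mask.mean() * entropy(labels[mask])
    return gain

def feature_gains(X, y):
    # Information gain of every input feature (one column of X per feature).
    return np.array([information_gain(X[:, j], y) for j in range(X.shape[1])])

def reset_worst_hidden_unit(W_in, gains, worst, rng=np.random.default_rng()):
    # Re-initialise the incoming weights of the worst hidden unit.
    # Assumed rule: the reset range for the weight from feature j is
    # [-gains[j]/max(gains), +gains[j]/max(gains)]; the paper only says the
    # range is determined by the propagated information gain.
    scale = gains / gains.max()
    W_in[:, worst] = rng.uniform(-scale, scale)
    return W_in

In the outer Hagiwara-style loop, feature_gains(X, y) would be computed once before training, and reset_worst_hidden_unit would be called whenever the variation of the sum-squared error becomes small; after a fixed number of such resets, a new hidden unit is added and training restarts.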
Keywords :
artificial intelligence; neural nets; ANN; Hagiwara's convergence acceleration algorithm; artificial neural network; convergence time acceleration; information gain technique; standard benchmark; sum-squared error; training process; Acceleration; Application software; Artificial neural networks; Backpropagation; Computer networks; Convergence; Entropy; Neural networks; Testing; Training data; Artificial Neural Network; classification; convergence acceleration; information gain;
Conference_Title :
Information, Communications and Signal Processing, 2005 Fifth International Conference on
Conference_Location :
Bangkok
Print_ISBN :
0-7803-9283-3
DOI :
10.1109/ICICS.2005.1689065