Title :
Optimum block-adaptive learning algorithm for error back-propagation networks
Author :
Du, Li-Min ; Hou, Zi-Qiang ; Li, Qi-Hu
Author_Institution :
Inst. of Acoust., Chinese Acad. of Sci., Beijing, China
Date :
12/1/1992
Abstract :
An optimum block-adaptive learning rate (OBALR) backpropagation (BP) algorithm for training feedforward neural networks with an arbitrary number of neuron layers is described. The algorithm uses the block-smoothed gradient as the descent direction and no momentum term, and it computes an optimum block-adaptive learning rate that is held constant within each block and updated adaptively at the beginning of each block iteration, so that it remains optimum in the sense of minimizing the approximate output mean-square error of the block. The algorithm was tested in several computer simulations on learning a deterministic chaos time-series mapping. The OBALR BP algorithm not only removed the difficulty of choosing good values for the two learning parameters of standard BP, but also gave significant improvements in learning speed and descent capability over the standard BP algorithm.
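The abstract describes a scheme in which the gradient is smoothed over a block of samples and the learning rate is re-chosen at the start of each block to minimize the block's approximate output mean-square error. The Python sketch below illustrates that idea under stated assumptions only: a one-hidden-layer tanh network, a logistic-map time series as toy data, a fixed block size, and a simple grid search over candidate rates standing in for the paper's closed-form optimum learning rate. All names, sizes, and parameter values here are illustrative and are not taken from the paper.

```python
# Minimal sketch of block-adaptive learning-rate backpropagation.
# Assumptions (not from the paper): one hidden tanh layer, grid search for the
# per-block rate instead of the paper's closed-form optimum, toy logistic-map data.
import numpy as np

rng = np.random.default_rng(0)

def init_net(n_in, n_hid, n_out):
    return {
        "W1": rng.normal(0, 0.5, (n_hid, n_in)),  "b1": np.zeros(n_hid),
        "W2": rng.normal(0, 0.5, (n_out, n_hid)), "b2": np.zeros(n_out),
    }

def forward(net, X):
    h = np.tanh(X @ net["W1"].T + net["b1"])      # hidden activations
    y = h @ net["W2"].T + net["b2"]               # linear output layer
    return h, y

def block_gradient(net, X, T):
    """Gradient of the squared error averaged (smoothed) over one block."""
    h, y = forward(net, X)
    e = y - T                                     # output error, shape (B, n_out)
    B = X.shape[0]
    dW2 = e.T @ h / B
    db2 = e.mean(axis=0)
    dh = (e @ net["W2"]) * (1.0 - h ** 2)         # backprop through tanh
    dW1 = dh.T @ X / B
    db1 = dh.mean(axis=0)
    return {"W1": dW1, "b1": db1, "W2": dW2, "b2": db2}

def block_mse(net, X, T):
    _, y = forward(net, X)
    return float(np.mean((y - T) ** 2))

def train_block_adaptive(net, X, T, block_size=32, epochs=50,
                         eta_grid=np.geomspace(1e-3, 1.0, 20)):
    """One learning rate per block, chosen to minimize the block MSE along the
    block-smoothed gradient direction (grid search as a stand-in for the
    optimum-rate formula); no momentum term is used."""
    n = X.shape[0]
    for _ in range(epochs):
        for start in range(0, n, block_size):
            Xb, Tb = X[start:start + block_size], T[start:start + block_size]
            g = block_gradient(net, Xb, Tb)
            best_eta, best_loss = 0.0, block_mse(net, Xb, Tb)
            for eta in eta_grid:
                trial = {k: net[k] - eta * g[k] for k in net}
                loss = block_mse(trial, Xb, Tb)
                if loss < best_loss:
                    best_eta, best_loss = eta, loss
            if best_eta > 0.0:                     # step only if it helps
                for k in net:
                    net[k] -= best_eta * g[k]
    return net

# Toy usage: one-step prediction of a chaotic logistic map, x[t] -> x[t+1].
x = np.empty(600); x[0] = 0.3
for t in range(599):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])
X, T = x[:-1, None], x[1:, None]
net = train_block_adaptive(init_net(1, 10, 1), X, T)
print("final MSE:", block_mse(net, X, T))
```

The grid search above makes the per-block rate selection explicit but crude; the paper's contribution is an adaptive rate that is optimum for the block's approximate output mean-square error, which would replace the search with a direct update.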
Keywords :
backpropagation; feedforward neural nets; OBALR BP algorithm; backpropagation networks; block-smoothed gradient; descent capability; deterministic chaos time-series mapping; feedforward neural networks; learning algorithm; mean-square error; neural network training; optimum block-adaptive learning rate; Algorithm design and analysis; Chaos; Computer simulation; Feedforward neural networks; Helium; Neural networks; Neurons; Shape measurement; Signal processing algorithms; Testing;
Journal_Title :
Signal Processing, IEEE Transactions on