Title :
Acceleration of back propagation through initial weight pre-training with delta rule
Author :
Li, Gang ; Alnuweiri, Hussein ; Wu, Yuejian ; Li, Hongbing
Author_Institution :
University of British Columbia, Vancouver, BC, Canada
Abstract :
A training strategy for backpropagation (BP) neural networks, named delta pre-training (DPT), is proposed. The core of the strategy is to pre-train the initial weights of a BP network using the delta rule, instead of starting from random values. After pre-training, the normal BP training procedure is carried out to complete network training. With the DPT, the convergence rate of BP training can be significantly improved. Since the DPT concerns only the initial weight settings, most variations of the standard BP algorithm can be combined with it to further speed up convergence. With regard to on-chip learning in VLSI implementations, only a small amount of additional circuitry is required for the pre-training phase. Simulation results using the proposed training method show its superiority over previous methods.
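The sketch below illustrates the general idea of delta pre-training under stated assumptions: it is not the paper's exact procedure, but a minimal Python example in which the randomly initialized weights of a hypothetical 2-4-1 network (XOR task, illustrative learning rates) are first adjusted layer-wise with the Widrow-Hoff delta rule and then handed unchanged to an ordinary BP loop.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical 2-4-1 network on XOR; sizes, epochs, and rates are illustrative only.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.uniform(-0.5, 0.5, (2, 4))   # input -> hidden weights
W2 = rng.uniform(-0.5, 0.5, (4, 1))   # hidden -> output weights

# --- Delta pre-training (DPT) phase (a guess at the scheme, not the paper's exact rule):
# each layer is updated with the delta (Widrow-Hoff) rule, driven by the current
# forward activations and the output error.
eta_pre = 0.3
for _ in range(50):
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    W2 += eta_pre * H.T @ (T - Y)            # delta rule on the output layer
    W1 += eta_pre * X.T @ ((T - Y) @ W2.T)   # delta rule on the hidden layer

# --- Standard BP phase: identical to plain BP, but starting from the pre-trained weights.
eta_bp = 0.5
for _ in range(2000):
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    dY = (T - Y) * Y * (1 - Y)               # output-layer delta
    dH = (dY @ W2.T) * H * (1 - H)           # hidden-layer delta
    W2 += eta_bp * H.T @ dY
    W1 += eta_bp * X.T @ dH

print(np.round(Y, 2))                        # outputs approach [0, 1, 1, 0]
```

Note that only the starting point changes; the BP loop itself is untouched, which is why the scheme can be combined with other BP accelerations and, as the abstract notes, needs only modest extra circuitry for on-chip learning.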
Keywords :
VLSI; backpropagation; convergence; learning (artificial intelligence); neural chips; back propagation; convergence rate; delta rule; initial weight pre-training; initial weight settings; on-chip learning; Acceleration; Algebra; Computer science; Convergence; Feedforward systems; Hardware; Neural networks; Neurons; Software algorithms; Very large scale integration;
Conference_Titel :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
DOI :
10.1109/ICNN.1993.298622