Title :
FastProp: a selective training algorithm for fast error propagation
Author_Institution :
Inst. of Syst. Sci., Nat. Univ. of Singapore, Kent Ridge, Singapore
Abstract :
An improved backpropagation algorithm, called FastProp, for training a feedforward neural network is described. The unique feature of the algorithm is selective training, which is based on the instantaneous causal relationship between the input and output signals during the training process. The causal relationship is calculated from the error backpropagated to the input layer. The accumulated errors, referred to as accumulated error indices (AEIs), are used to rank the input signals according to their correlation with the output signals. An entire set of time series data can be clustered into several situations based on the current input signal with the highest AEI, and the neurons can be activated according to the current situation. Experimental results showed that a significant reduction in training time can be achieved with the selective training algorithm compared with the traditional backpropagation algorithm.
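The AEI mechanism described in the abstract can be illustrated with a minimal sketch: during ordinary backpropagation, the error signal reaching each input unit is accumulated over training, and the inputs are then ranked by their accumulated totals. The network shapes, toy data, learning rate, and the exact accumulation rule (sum of absolute backpropagated errors) below are assumptions for illustration, not the paper's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: 4 inputs, 8 sigmoid hidden units, 1 linear output.
n_in, n_hid, n_out = 4, 8, 1
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))
W2 = rng.normal(scale=0.5, size=(n_hid, n_out))

# Toy time-series-like data: the target depends strongly on input 0.
X = rng.normal(size=(200, n_in))
y = np.tanh(3.0 * X[:, :1]) + 0.1 * X[:, 1:2]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

aei = np.zeros(n_in)  # accumulated error indices, one per input signal
lr = 0.1
for epoch in range(50):
    h = sigmoid(X @ W1)         # hidden activations
    out = h @ W2                # linear output layer
    err = out - y               # output error
    # Backpropagate the error toward the input layer.
    delta_h = (err @ W2.T) * h * (1.0 - h)
    delta_in = delta_h @ W1.T   # error signal reaching each input unit
    # Accumulate |backpropagated error| per input -> AEI.
    aei += np.abs(delta_in).sum(axis=0)
    # Standard gradient-descent weight updates.
    W2 -= lr * h.T @ err / len(X)
    W1 -= lr * X.T @ delta_h / len(X)

# Rank inputs by AEI, highest (most strongly coupled to the output) first.
ranking = np.argsort(-aei)
print("AEI:", np.round(aei, 2))
print("input ranking:", ranking)
```

In the paper's scheme, the input with the highest AEI would then be used to cluster the data into situations and to activate only the relevant neurons; the sketch stops at the ranking step.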
Keywords :
error statistics; learning systems; neural nets; parallel algorithms; time series; FastProp; accumulated error indices; backpropagation algorithm; correlation; fast error propagation; feedforward neural network; selective training algorithm; time series data; Adaptive filters; Economic indicators; Feedforward neural networks; Feeds; Neural networks; Neurons; Predictive models; Signal processing algorithms; Testing; Training data;
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN 1991)
Print_ISBN :
0-7803-0227-3
DOI :
10.1109/IJCNN.1991.170635