Title :
An improved algorithm for neural network classification of imbalanced training sets
Author :
Anand, Rangachari ; Mehrotra, Kishan G. ; Mohan, Chilukuri K. ; Ranka, Sanjay
Date :
11/1/1993
Abstract :
The backpropagation algorithm converges very slowly for two-class problems in which most of the exemplars belong to one dominant class. An analysis shows that this occurs because the computed net error gradient vector is so dominated by the larger class that the net error for the exemplars in the smaller class increases significantly in the initial iteration. The subsequent rate of convergence of the net error is very low. A modified technique for calculating a direction in weight space that decreases the error for each class is presented. Using this algorithm, the rate of learning for two-class classification problems is accelerated by an order of magnitude.
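The abstract's central idea is to compute the error gradient separately for each class and to combine the two into a weight-space direction that decreases the error of both classes, so the minority class is not swamped by the majority class. The following is a minimal sketch of that idea on a single sigmoid unit; the toy data, the squared-error loss, and the normalize-and-sum combination rule are illustrative assumptions, not the paper's exact formulation.

# Sketch only: per-class gradients combined into a direction that reduces
# both class errors to first order (unless the gradients are exactly opposed).
# The model, loss, and combination rule are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Imbalanced two-class toy data: 950 majority vs. 50 minority exemplars.
X0 = rng.normal(loc=-1.0, scale=1.0, size=(950, 2))
X1 = rng.normal(loc=+1.0, scale=1.0, size=(50, 2))
X = np.vstack([X0, X1])
y = np.concatenate([np.zeros(950), np.ones(50)])

def class_gradient(w, Xc, yc):
    """Gradient of one-half the mean squared error of a sigmoid unit
    over the exemplars of a single class."""
    z = 1.0 / (1.0 + np.exp(-Xc @ w))            # sigmoid outputs
    err = z - yc                                  # per-exemplar error
    return (Xc.T @ (err * z * (1.0 - z))) / len(yc)

w = np.zeros(2)
lr = 0.5
for epoch in range(200):
    g0 = class_gradient(w, X[y == 0], y[y == 0])  # majority-class gradient
    g1 = class_gradient(w, X[y == 1], y[y == 1])  # minority-class gradient
    # Normalize each class gradient before summing, so neither class
    # dominates the update; the negated sum has a non-positive directional
    # derivative with respect to both class errors.
    d = -(g0 / (np.linalg.norm(g0) + 1e-12) + g1 / (np.linalg.norm(g1) + 1e-12))
    w += lr * d

Under this combination rule the directional derivative of the class-c error along d is -||g_c||(1 + cos θ), where θ is the angle between the two class gradients, so both class errors decrease to first order except in the degenerate antiparallel case.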
Keywords :
backpropagation; convergence; learning (artificial intelligence); neural nets; pattern recognition; backpropagation algorithm; computed net error gradient vector; imbalanced training sets; iteration; neural network classification; two-class classification problems; Acceleration; Algorithm design and analysis; Backpropagation algorithms; Convergence; Information science; Neural networks
Journal_Title :
Neural Networks, IEEE Transactions on