Title :
Noise suppression in training data for improving generalization
Author :
Nakashima, Akiko; Hirabayashi, Akira; Ogawa, Hidemitsu
Author_Institution :
Dept. of Comput. Sci., Tokyo Inst. of Technol., Japan
Abstract :
Multilayer feedforward neural networks are commonly trained with the error backpropagation (BP) algorithm, which minimizes the error between the outputs of a neural network (NN) and the training data. Hence, when the training data are noisy, a trained network memorizes the noisy outputs for the given inputs; such learning is called rote memorization learning (RML). In this paper, we propose error correcting memorization learning (CML), which suppresses noise in the training data. To evaluate the generalization ability of CML, we compare it with the projection learning (PL) criterion. We prove theoretically that although CML merely suppresses noise in the training data, it provides the same generalization ability as PL under a necessary and sufficient condition.
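To make the RML behavior described above concrete, the following is a minimal sketch, not taken from the paper: a one-hidden-layer tanh network, a sin(x) target, the noise level, and all hyperparameters are hypothetical choices. It shows plain backpropagation minimizing the squared error against noisy targets, so the network fits the noise (rote memorization) rather than the underlying function; the paper's CML criterion is not implemented here.

    # Minimal sketch of rote memorization learning (RML): plain BP on noisy
    # targets. Everything below (architecture, target, noise) is illustrative,
    # not the paper's setup.
    import numpy as np

    rng = np.random.default_rng(0)

    # Noisy training data: y = sin(x) + Gaussian noise (hypothetical target).
    x = np.linspace(-np.pi, np.pi, 20).reshape(-1, 1)
    y_clean = np.sin(x)
    y_noisy = y_clean + rng.normal(scale=0.3, size=y_clean.shape)

    # One-hidden-layer feedforward network with tanh units.
    n_hidden = 50
    W1 = rng.normal(scale=0.5, size=(1, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    b2 = np.zeros(1)

    lr = 0.01
    for step in range(20000):
        # Forward pass.
        h = np.tanh(x @ W1 + b1)
        out = h @ W2 + b2
        err = out - y_noisy                    # error w.r.t. the noisy targets

        # Backward pass: gradients of the mean squared error.
        g_out = 2 * err / len(x)
        g_W2 = h.T @ g_out
        g_b2 = g_out.sum(axis=0)
        g_h = (g_out @ W2.T) * (1 - h ** 2)    # tanh derivative
        g_W1 = x.T @ g_h
        g_b1 = g_h.sum(axis=0)

        # Gradient-descent update (this is what BP minimizes: training error).
        W2 -= lr * g_W2; b2 -= lr * g_b2
        W1 -= lr * g_W1; b1 -= lr * g_b1

    # The training error shrinks toward zero (the network memorizes the
    # noise), while the error against the clean target stays larger.
    h = np.tanh(x @ W1 + b1)
    out = h @ W2 + b2
    print("train MSE (noisy targets):", float(np.mean((out - y_noisy) ** 2)))
    print("MSE vs. clean target:     ", float(np.mean((out - y_clean) ** 2)))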
Keywords :
backpropagation; feedforward neural nets; generalisation (artificial intelligence); minimisation; multilayer perceptrons; noise; BP algorithm; CML; PL criterion; RML; error backpropagation; error correcting memorization learning; error minimization; generalization; multilayer feedforward neural networks; necessary and sufficient condition; neural network outputs; noise suppression; projection learning criterion; rote memorization learning; training data; Computer errors; Computer science; Error correction; Feedforward neural networks; Feedforward systems; Intelligent networks; Multi-layer neural network; Neural networks; Training data; Vectors;
Conference_Title :
1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence
Conference_Location :
Anchorage, AK
Print_ISBN :
0-7803-4859-1
DOI :
10.1109/IJCNN.1998.687208