Title :
A robust approach to supervised learning in neural networks
Author_Institution :
Dept. of Electr. & Comput. Eng., Texas Univ., Austin, TX, USA
Date :
27 Jun-2 Jul 1994
Abstract :
Most supervised neural networks (NNs) are trained by minimizing the mean squared error (MSE) over the training set. In the presence of outliers, the resulting NN model can differ significantly from the underlying system that generated the data. To handle outliers, this study proposes minimizing the mean log squared errors (MLSE) instead, an approach that is easily adapted to most supervised learning algorithms. Simulation results indicate that the proposed approach is robust against outliers.
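The abstract does not state the exact form of the MLSE criterion, so the short sketch below is not taken from the paper: it assumes a per-sample loss of log(1 + e^2) as a smooth robust stand-in for "log squared error" and contrasts it with plain MSE on a toy linear fit contaminated with outliers. All names, constants, and the gradient-descent setup are illustrative only; the same loss swap would apply inside a full neural-network training loop.

    # Sketch: MSE fit vs. an assumed log-squared-error (MLSE-style) fit
    # on 1-D data with injected outliers. Illustrative, not the paper's code.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=200)
    y = 2.0 * x + 0.5 + 0.1 * rng.standard_normal(200)
    y[:10] += 8.0                          # inject gross outliers

    def fit(x, y, robust, lr=0.05, steps=2000):
        # Gradient descent on a linear model w*x + b.
        w, b = 0.0, 0.0
        for _ in range(steps):
            e = (w * x + b) - y            # residuals
            if robust:
                g = 2.0 * e / (1.0 + e**2) # d/de of log(1 + e^2): bounded influence
            else:
                g = 2.0 * e                # d/de of e^2: unbounded influence
            w -= lr * np.mean(g * x)
            b -= lr * np.mean(g)
        return w, b

    print("MSE fit :", fit(x, y, robust=False))  # pulled toward the outliers
    print("MLSE fit:", fit(x, y, robust=True))   # stays near w = 2, b = 0.5

Because the robust loss caps each residual's gradient contribution, the ten corrupted points barely shift the fit, whereas under MSE they bias the intercept noticeably.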
Keywords :
error analysis; learning (artificial intelligence); least mean squares methods; minimisation; neural nets; mean log squared errors; mean squared errors; neural network; outliers; supervised learning; Computer architecture; Intelligent networks; Least squares approximation; Maximum likelihood estimation; Neural networks; Proposals; Radio access networks; Robustness; Supervised learning; Training data;
Conference_Title :
1994 IEEE International Conference on Neural Networks (ICNN'94), IEEE World Congress on Computational Intelligence
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374216