Title :
Robust back-propagation: connectionist learning algorithms resistant to various types of noise
Author :
Movellan, Javier R.
Author_Institution :
Dept. of Psychol., California Univ., Berkeley, CA, USA
Abstract :
Summary form only given, as follows. The author explores the concept of influence functions and their application to backpropagation learning. One reason for backpropagation's popularity is that it can fit functions involving high-order interactions. Most real problems not only require detecting nonlinear interactions but also involve fitting relationships that are contaminated by diverse noise conditions. Typically, backpropagation learning is achieved by minimizing the sum of squared errors. It is well known that least-squares estimators are maximally efficient when the noise is Gaussian, but that their performance deteriorates rapidly under non-Gaussian noise. The author's objective is to make backpropagation resistant to various types of noise. The approach uses objective functions which, in other contexts, are known to be more robust than least squares. Simulations show that backpropagation with robust objective functions learns well and generates solutions that are resistant to a wide variety of noise distributions.
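Note: the abstract does not name the specific robust objective functions used in the simulations. The sketch below is only illustrative, substituting a Huber-style loss (a common bounded-influence alternative to least squares) into a plain backpropagation loop to show where the choice of objective enters the gradient computation. The network architecture, data, and hyperparameters are hypothetical and are not taken from the paper.

# A minimal sketch, not the author's code: a Huber-style loss stands in here
# as one illustrative robust alternative to the usual sum of squared errors.
import numpy as np

rng = np.random.default_rng(0)

def huber_grad(residual, delta=1.0):
    """Gradient of the Huber loss w.r.t. the prediction (bounded influence)."""
    return np.where(np.abs(residual) <= delta, residual, delta * np.sign(residual))

def squared_grad(residual):
    """Gradient of 0.5 * residual**2 (unbounded influence)."""
    return residual

def train(x, y, loss_grad, hidden=8, lr=0.05, epochs=2000):
    """Backpropagation for a one-hidden-layer tanh net with a pluggable loss gradient."""
    W1 = rng.normal(0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)            # forward pass
        pred = h @ W2 + b2
        g = loss_grad(pred - y) / len(x)    # gradient of the objective at the output
        dW2 = h.T @ g; db2 = g.sum(0)       # backpropagate to the weights
        dh = (g @ W2.T) * (1 - h**2)
        dW1 = x.T @ dh; db1 = dh.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return lambda xq: np.tanh(xq @ W1 + b1) @ W2 + b2

# Nonlinear target contaminated with heavy-tailed (non-Gaussian) noise.
x = np.linspace(-2, 2, 200).reshape(-1, 1)
y_clean = np.sin(2 * x)
y_noisy = y_clean + 0.1 * rng.standard_t(df=1.5, size=y_clean.shape)

for name, grad in [("squared error", squared_grad), ("Huber", huber_grad)]:
    net = train(x, y_noisy, grad)
    err = np.mean((net(x) - y_clean) ** 2)
    print(f"{name:14s} fit vs. clean target, MSE = {err:.4f}")

The only difference between the two runs is the gradient of the objective at the output layer: the squared-error gradient grows without bound with the residual, while the Huber gradient is clipped, which limits the influence of heavy-tailed outliers on the weight updates.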
Keywords :
learning systems; neural nets; backpropagation learning; connectionist learning algorithms; diverse noise conditions; influence functions; non-Gaussian noise; nonlinear interactions; robust objective functions; squared errors; Learning systems; Neural networks;
Conference_Title :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
DOI :
10.1109/IJCNN.1989.118530