Title :
Back-propagation training using a least mean power error function
Author :
Pimmel, Russell L
Abstract :
Summary form only given, as follows. Like many gradient-descent algorithms, back-propagation can become trapped in a local minimum that corresponds to a non-optimal network configuration. At a typical local minimum, most outputs are essentially correct, with only a few exhibiting gross errors. The authors propose a modified error function in which the output errors are raised to a power larger than the nominal two. This is intended to alleviate the local-minimum problem by focusing the training process on the large output errors. Simulation results were obtained for simple computational networks that are prone to local minima.
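The central idea, replacing the usual quadratic error with one of the form E = (1/p) * sum(|t - y|^p) for p > 2 so that large output errors dominate the weight updates, can be sketched in code. The following is a minimal illustrative NumPy implementation, not the authors' code: the XOR task, 2-2-1 sigmoid network, p = 4, and learning rate are all assumptions chosen only to show where the power-p error and its gradient, dE/dy = -|t - y|^(p-1) * sign(t - y), enter the back-propagation step.

```python
# Illustrative sketch (not the paper's implementation): back-propagation on XOR
# using a least-mean-power error E = (1/p) * sum(|t - y|^p) with p > 2.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: a simple computational problem known to be prone to local minima.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Assumed 2-2-1 network with small random initial weights.
W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

p = 4      # error power > 2 (assumed), emphasizing large output errors
lr = 0.5   # learning rate (assumed)

for epoch in range(20000):
    # Forward pass
    H = sigmoid(X @ W1 + b1)   # hidden activations
    Y = sigmoid(H @ W2 + b2)   # network outputs

    # Least-mean-power error gradient w.r.t. the outputs:
    # dE/dY = -|T - Y|^(p-1) * sign(T - Y)
    err = T - Y
    dE_dY = -np.abs(err) ** (p - 1) * np.sign(err)

    # Back-propagate through the sigmoid nonlinearities
    delta2 = dE_dY * Y * (1 - Y)               # output-layer delta
    delta1 = (delta2 @ W2.T) * H * (1 - H)     # hidden-layer delta

    # Gradient-descent weight updates
    W2 -= lr * H.T @ delta2; b2 -= lr * delta2.sum(axis=0)
    W1 -= lr * X.T @ delta1; b1 -= lr * delta1.sum(axis=0)

print(np.round(Y, 3))  # outputs should approach the targets [0, 1, 1, 0]
```

Setting p = 2 in this sketch recovers the standard sum-squared-error rule; larger p weights gross output errors more heavily in the deltas, which is the effect the abstract describes.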
Keywords :
learning systems; neural nets; back-propagation; computational networks; least mean power error function; local minimum; output errors; training process; Algorithm design and analysis; Computational modeling; Computer errors; Computer networks; Humans; Mathematics; Neural networks;
Conference_Titel :
IJCNN-91-Seattle: International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155558