DocumentCode :
3249444
Title :
Back propagation learning equations from the minimization of recursive error
Author :
Simon, Wayne E. ; Carter, Jeffrey R.
Author_Institution :
Martin Marietta Astronaut. Group, Denver, CO, USA
fYear :
1989
fDate :
0-0 1989
Firstpage :
155
Lastpage :
160
Abstract :
A backpropagation learning technique is developed which ensures that the network is never far from a solution. It is shown how the concept of minimizing recursive mean square error can be applied under the special restriction of the neural network: changes to the value of a connection require only information about the two nodes it connects. An order-of-magnitude argument is used to discard the off-diagonal elements of the second-derivative matrix, and a careful definition of generalized error makes each node independent, requiring only the states of the nodes connected to it and the error and derivative of the nodes to which it is connected. Results for a simple exclusive-OR problem exhibit robust learning at a rate about 100 times faster than conventional backpropagation. All problems tried converge asymptotically in fewer than 25 epochs.
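Illustrative sketch :
The following Python sketch is a hypothetical illustration, not the authors' derivation: it trains a 2-2-1 sigmoid network on exclusive-OR with ordinary back-propagated first derivatives, and scales each weight's step by a damped diagonal (Gauss-Newton style) estimate of the second derivative, discarding off-diagonal terms so that each update uses only quantities local to the two nodes a weight connects. The network size, damping constant mu, step scale eta, and epoch count are assumptions chosen for demonstration.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # XOR inputs
T = np.array([[0.], [1.], [1.], [0.]])                   # XOR targets

def sig(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(0.0, 1.0, (2, 2)); b1 = np.zeros(2)      # input -> hidden weights
W2 = rng.normal(0.0, 1.0, (2, 1)); b2 = np.zeros(1)      # hidden -> output weights
mu, eta = 0.1, 0.5                                       # damping and step scale (assumed)

for epoch in range(100):                                 # fixed training budget (assumed)
    gW1 = np.zeros_like(W1); gb1 = np.zeros_like(b1)     # first-derivative accumulators
    gW2 = np.zeros_like(W2); gb2 = np.zeros_like(b2)
    hW1 = np.zeros_like(W1); hb1 = np.zeros_like(b1)     # diagonal second-derivative accumulators
    hW2 = np.zeros_like(W2); hb2 = np.zeros_like(b2)
    for x, t in zip(X, T):
        h = sig(x @ W1 + b1)                             # forward pass
        y = sig(h @ W2 + b2)
        d2 = (y - t) * y * (1.0 - y)                     # output-layer error term
        d1 = (d2 @ W2.T) * h * (1.0 - h)                 # back-propagated hidden error
        gW2 += np.outer(h, d2); gb2 += d2                # gradient of mean-square error
        gW1 += np.outer(x, d1); gb1 += d1
        s2 = (y * (1.0 - y)) ** 2                        # curvature estimate at the output node
        s1 = ((W2 ** 2) @ s2) * (h * (1.0 - h)) ** 2     # curvature propagated to hidden nodes
        hW2 += np.outer(h ** 2, s2); hb2 += s2           # keep only diagonal Hessian terms
        hW1 += np.outer(x ** 2, s1); hb1 += s1
    W2 -= eta * gW2 / (hW2 + mu); b2 -= eta * gb2 / (hb2 + mu)   # curvature-scaled weight steps
    W1 -= eta * gW1 / (hW1 + mu); b1 -= eta * gb1 / (hb1 + mu)

print(sig(sig(X @ W1 + b1) @ W2 + b2).round(2))          # outputs for the four XOR patterns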
Keywords :
learning systems; neural nets; backpropagation learning technique; generalized error; learning equations; minimization; neural network; recursive error; robust learning; second derivative matrix; Learning systems; Neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1989 IEEE International Conference on Systems Engineering
Conference_Location :
Fairborn, OH, USA
Type :
conf
DOI :
10.1109/ICSYSE.1989.48643
Filename :
48643