Title :
An adaptive step size for backpropagation using linear lower bounding functions
Author :
Yu, Mao ; Chang, Tsu-Shuan
Author_Institution :
Dept. of Electr. & Comput. Eng., California Univ., Davis, CA, USA
Date :
5/1/1995 12:00:00 AM
Abstract :
An adaptive step size is presented for the backpropagation algorithm in feedforward neural nets using linear lower bounding functions. Basically, a linear lower bounding function (LLBF) for a given function over an interval is a linear function that lies below the given function and matches the original function's value at one end point. To search for an adaptive step size, an LLBF for the error function, expressed in terms of the step size, is derived. Since the error in a neural net can never be smaller than zero, it is plausible not to take a step larger than the step size at which the associated LLBF reaches zero. In this paper, an adaptive learning algorithm based on the above idea is given. Numerical examples are used to illustrate its feasibility and to compare it with some previous results.
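The step-size rule described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it assumes the LLBF slope is taken as the directional derivative of the error along the steepest-descent direction (an assumption for illustration), and caps each step at the point where the resulting linear lower bound L(a) = E + slope * a reaches zero.

```python
import numpy as np

def llbf_step_size(error, slope):
    """Step at which the linear lower bound L(a) = error + slope * a hits zero.

    Since the error of a neural net is nonnegative, taking a step beyond
    this point is not warranted by the lower bound.
    """
    assert slope < 0, "slope must be negative along a descent direction"
    return -error / slope

def train(w, err_fn, grad_fn, iters=50):
    """Gradient descent with the LLBF-capped adaptive step (illustrative sketch)."""
    for _ in range(iters):
        g = grad_fn(w)
        d = -g                 # steepest-descent direction
        slope = g @ d          # directional derivative; negative unless at a minimum
        if slope >= 0:
            break              # gradient is (numerically) zero: stop
        alpha = llbf_step_size(err_fn(w), slope)
        w = w + alpha * d
    return w

# Toy quadratic error E(w) = 0.5 * ||w||^2 (hypothetical test function)
err_fn = lambda w: 0.5 * float(w @ w)
grad_fn = lambda w: w
w_final = train(np.array([2.0, -1.0]), err_fn, grad_fn)
```

On this quadratic the LLBF rule gives a constant step alpha = 0.5, so the error shrinks by a factor of 4 per iteration; the appeal of the scheme is that alpha adapts automatically, with no hand-tuned learning rate.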
Keywords :
adaptive signal processing; adaptive learning algorithm; adaptive step size; backpropagation; error function; feedforward neural nets; linear lower bounding functions; search problems; adaptive algorithms; convergence; neural networks; signal processing algorithms
Journal_Title :
IEEE Transactions on Signal Processing