Title :
A global gradient-noise covariance expression for stationary real Gaussian inputs
Author :
An, P. Edgar ; Brown, Martin ; Harris, C.J.
Author_Institution :
Dept. of Electron. & Comput. Sci., Southampton Univ., UK
Date :
11/1/1995
Abstract :
Supervised parameter adaptation in many artificial neural networks is largely based on an instantaneous version of gradient descent called the least-mean-square (LMS) algorithm. This paper considers only neural models that are linear with respect to their adaptable parameters and makes two major contributions. First, it derives an expression for the gradient-noise covariance under the assumption that the input samples are real, stationary, and Gaussian distributed but may be partially correlated. This expression relates the gradient correlation and input correlation matrices to the gradient-noise covariance and explains why the gradient noise generally correlates maximally with the steepest principal axis and minimally with the axis of smallest curvature, regardless of the magnitude of the weight error. Second, a recursive expression for the weight-error correlation matrix is derived in a straightforward manner using the gradient-noise covariance, and comparisons are drawn with the complex LMS algorithm.
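The abstract's central claim can be checked numerically: for a model linear in its parameters driven by correlated real Gaussian inputs, the instantaneous LMS gradient deviates from the expected gradient, and the variance of that deviation is largest along the input-correlation matrix's steepest principal axis. The sketch below (the 2x2 input correlation matrix, the weight values, and the sample count are all illustrative choices, not taken from the paper) estimates the gradient-noise covariance by simulation and compares the noise variance along the largest- and smallest-eigenvalue axes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stationary, zero-mean, correlated Gaussian inputs.
# This R (hypothetical) has eigenvalues 4.0 and 0.25, so the principal
# axes have very different curvatures.
R = np.array([[2.125, 1.875],
              [1.875, 2.125]])
L = np.linalg.cholesky(R)

w_star = np.array([1.0, -0.5])   # "true" weights (illustrative)
w = np.zeros(2)                  # current estimate: weight error is fixed

noise = []
for _ in range(20000):
    x = L @ rng.standard_normal(2)        # draw one correlated input sample
    e = (w_star - w) @ x                  # instantaneous output error
    g_inst = -2.0 * e * x                 # instantaneous (LMS) gradient of e^2
    g_true = -2.0 * R @ (w_star - w)      # expected gradient at this weight error
    noise.append(g_inst - g_true)         # gradient noise sample

C = np.cov(np.array(noise).T)             # empirical gradient-noise covariance

# Project the noise covariance onto the principal axes of R.
evals, evecs = np.linalg.eigh(R)
v_min, v_max = evecs[:, 0], evecs[:, -1]  # smallest / largest eigenvalue axes
var_min = v_min @ C @ v_min
var_max = v_max @ C @ v_max
print(var_max > var_min)                  # noise is strongest on the steep axis
```

For real Gaussian inputs the gradient-noise covariance works out to terms proportional to R itself plus an outer product of R times the weight error, so projecting onto the steepest axis always picks up the largest contribution, consistent with the abstract's statement that the effect holds regardless of the weight-error magnitude.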
Keywords :
Gaussian processes; correlation methods; covariance analysis; least mean squares methods; matrix algebra; neural nets; gradient correlation matrix; gradient-noise covariance; input correlation matrix; least-mean-square; neural networks; partial correlation; stationary real Gaussian inputs; steepest principal axis; supervised parameter adaptation; weight-error correlation matrix; Algorithm design and analysis; Artificial neural networks; Convergence; Covariance matrix; Gaussian noise; Iterative algorithms; Large-scale systems; Least squares approximation; Parameter estimation; Training data;
Journal_Title :
Neural Networks, IEEE Transactions on