Title :
NNGD algorithm for neural adaptive filters
Author_Institution :
Sch. of Inf. Syst., East Anglia Univ., Norwich, UK
Date :
4/27/2000 12:00:00 AM
Abstract :
A novel normalised nonlinear gradient descent (NNGD) algorithm for training neural adaptive feedforward filters is presented. The algorithm is based on minimisation of the instantaneous prediction error for contractive activation functions of a neuron, and provides an adaptive learning rate. Normalisation is performed via calculation of the product of the tap-input power to the filter and the squared first derivative of the activation function of the neuron. The NNGD algorithm outperforms a gradient-based algorithm for use in a neural adaptive filter, as well as the standard least mean squares (LMS) and normalised LMS algorithms. To support the analysis, simulation results on real speech are provided.
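The normalisation described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes a single-neuron feedforward predictor with a tanh activation (a contractive nonlinearity), and forms the adaptive learning rate by dividing a base step size by the product of the tap-input power and the squared first derivative of the activation, plus a small regularisation constant `eps` (an assumed safeguard against division by zero).

```python
import numpy as np

def nngd_predictor(x, order=4, eta=0.3, eps=1e-3, seed=0):
    """One-step-ahead neural adaptive filter trained with an NNGD-style
    rule (illustrative sketch; parameter names are assumptions)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=order)   # filter tap weights
    sq_errors = []
    for k in range(order, len(x)):
        u = x[k - order:k][::-1]            # tap-input vector
        net = u @ w
        y = np.tanh(net)                    # contractive activation
        e = x[k] - y                        # instantaneous prediction error
        phi_prime = 1.0 - y * y             # first derivative of tanh
        # adaptive learning rate: normalise by the product of the
        # squared activation slope and the tap-input power
        lr = eta / (eps + phi_prime**2 * (u @ u))
        w = w + lr * e * phi_prime * u      # gradient-descent weight update
        sq_errors.append(e * e)
    return w, np.array(sq_errors)

# usage: predict a slowly varying signal one step ahead
signal = 0.5 * np.sin(0.1 * np.arange(2000))
w, sq_errors = nngd_predictor(signal)
```

For small net inputs the tanh slope is near 1 and the rule behaves like NLMS; as the neuron saturates, the shrinking derivative enlarges the step, which is the mechanism the abstract credits for the improved convergence.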
Keywords :
adaptive filters; feedforward neural nets; filtering theory; gradient methods; learning (artificial intelligence); nonlinear filters; speech processing; activation function; neural adaptive feedforward filter; normalised nonlinear gradient descent algorithm; prediction error minimisation; training;
Journal_Title :
Electronics Letters
DOI :
10.1049/el:20000631