DocumentCode :
1339967
Title :
NNGD algorithm for neural adaptive filters
Author :
Mandic, D.P.
Author_Institution :
Sch. of Inf. Syst., East Anglia Univ., Norwich, UK
Volume :
36
Issue :
9
fYear :
2000
fDate :
4/27/2000
Firstpage :
845
Lastpage :
846
Abstract :
A novel normalised nonlinear gradient descent (NNGD) algorithm for training neural adaptive feedforward filters is presented. The algorithm is based on minimisation of the instantaneous prediction error for contractive activation functions of a neuron, and provides an adaptive learning rate. Normalisation is performed via calculation of the product of the tap-input power to the filter and the squared first derivative of the activation function of a neuron. The NNGD algorithm outperforms a gradient-based algorithm for use in a neural adaptive filter, as well as the standard least mean squares (LMS) and normalised LMS algorithms. To support the analysis, simulation results on real speech are provided.
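As an illustration of the normalisation step described in the abstract, the following is a minimal Python sketch (not the authors' code) of one plausible NNGD-style update for a single-neuron feedforward adaptive filter; the function name nngd_filter, the logistic slope beta, the step-size mu and the regularisation constant eps are illustrative assumptions rather than values taken from the paper.

import numpy as np

def nngd_filter(x, d, p=10, mu=0.3, eps=1e-3, beta=4.0):
    """One-neuron neural adaptive FIR filter with an NNGD-style adaptive learning rate.

    x    : input signal (1-D array)
    d    : desired signal (same length as x)
    p    : tap-delay line (filter) length
    mu   : step-size constant
    eps  : small constant in the normalisation term
    beta : slope of the logistic activation (a contractive choice)
    """
    phi  = lambda v: 1.0 / (1.0 + np.exp(-beta * v))   # logistic activation
    dphi = lambda v: beta * phi(v) * (1.0 - phi(v))    # its first derivative

    w = np.zeros(p)
    y = np.zeros(len(x))
    e = np.zeros(len(x))

    for k in range(p, len(x)):
        u = x[k - p:k][::-1]        # tap-input vector
        net = w @ u
        y[k] = phi(net)
        e[k] = d[k] - y[k]
        # adaptive learning rate: normalise by the product of the tap-input
        # power and the squared first derivative of the activation function
        eta = mu / (eps + dphi(net) ** 2 * (u @ u))
        w = w + eta * e[k] * dphi(net) * u

    return y, e, w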
Keywords :
adaptive filters; feedforward neural nets; filtering theory; gradient methods; learning (artificial intelligence); nonlinear filters; speech processing; activation function; neural adaptive feedforward filter; normalised nonlinear gradient descent algorithm; prediction error minimisation; speech processing; training;
fLanguage :
English
Journal_Title :
Electronics Letters
Publisher :
IET
ISSN :
0013-5194
Type :
jour
DOI :
10.1049/el:20000631
Filename :
843810