DocumentCode :
2506983
Title :
A neural nonlinear adaptive filter with a trainable activation function
Author :
Goh, Su Lee ; Mandic, Danilo P. ; Bozic, Milorad
Author_Institution :
Dept. of Electr. & Electron. Eng., Imperial Coll. of Sci., Technol. & Med., London, UK
fYear :
2002
fDate :
2002
Firstpage :
7
Lastpage :
10
Abstract :
The normalized nonlinear gradient descent (NNGD) learning algorithm for a class of nonlinear finite impulse response (FIR) adaptive filters (the dynamical perceptron) is extended to the case where the amplitude of the nonlinear activation function is made gradient adaptive. This yields the adaptive amplitude normalized nonlinear gradient descent (AANNGD) algorithm, which is suitable for processing nonlinear and nonstationary signals with a large dynamic range. Experimental results show that AANNGD outperforms the standard LMS, NGD, NNGD, the fully adaptive normalized nonlinear gradient descent (FANNGD) and the sign algorithm on nonlinear inputs with large dynamics.
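The abstract describes a dynamical perceptron (nonlinear FIR filter) trained with a normalized gradient step, where the amplitude of the activation function is itself adapted by gradient descent. The following is a minimal sketch of that idea; it is not the authors' exact algorithm. The activation lam*tanh(net), the amplitude step size rho, the regularization constant C, and the specific form of the normalized learning rate are assumptions made for illustration only.

```python
import numpy as np

def aanngd_filter(x, d, p=4, mu=0.3, rho=0.1, C=1.0, lam0=1.0):
    """Sketch of a nonlinear FIR adaptive filter (dynamical perceptron)
    with a gradient-adaptive activation amplitude (AANNGD-style update).

    x   : input signal (1-D array)
    d   : desired (teaching) signal
    p   : number of filter taps
    mu  : step size for the normalized weight update (assumed)
    rho : step size for the amplitude update (assumed)
    C   : regularization constant in the normalized learning rate (assumed)
    """
    N = len(x)
    w = np.zeros(p)          # FIR weight vector
    lam = lam0               # adaptive amplitude of the activation function
    y = np.zeros(N)
    e = np.zeros(N)

    for k in range(p, N):
        xk = x[k - p:k][::-1]                        # tap-input vector
        net = np.dot(w, xk)                          # linear combiner output
        phi = np.tanh(net)                           # unit-amplitude nonlinearity
        y[k] = lam * phi                             # amplitude-scaled filter output
        e[k] = d[k] - y[k]                           # instantaneous output error

        dphi = lam * (1.0 - phi ** 2)                # d(lam*tanh(net))/d(net)
        eta = mu / (C + dphi ** 2 * np.dot(xk, xk))  # normalized learning rate (NNGD-style, assumed form)

        w += eta * e[k] * dphi * xk                  # normalized gradient weight update
        lam += rho * e[k] * phi                      # gradient descent on the activation amplitude

    return y, e, w, lam
```

In this sketch the amplitude update follows from differentiating the squared error with respect to lam, which is what makes the activation function "trainable"; all other details are placeholders.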
Keywords :
FIR filters; adaptive filters; gradient methods; neural nets; nonlinear filters; adaptive amplitude normalized nonlinear gradient descent algorithm; dynamical perceptron; large dynamical range; neural nonlinear adaptive filter; nonlinear activation function; nonlinear finite impulse response adaptive filters; nonlinear signal processing; nonstationary signal processing; normalized nonlinear gradient descent learning algorithm; trainable activation function; Adaptive estimation; Adaptive filters; Convergence; Educational institutions; Filtering; Finite impulse response filter; Least squares approximation; Nonlinear equations; Signal processing; Taylor series;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2002 6th Seminar on Neural Network Applications in Electrical Engineering (NEUREL '02)
Print_ISBN :
0-7803-7593-9
Type :
conf
DOI :
10.1109/NEUREL.2002.1057957
Filename :
1057957