DocumentCode :
2790045
Title :
Kernel width adaptation in information theoretic cost functions
Author :
Singh, Abhishek ; Príncipe, Jose C.
Author_Institution :
Comput. NeuroEngineering Lab., Univ. of Florida, Gainesville, FL, USA
fYear :
2010
fDate :
14-19 March 2010
Firstpage :
2062
Lastpage :
2065
Abstract :
This paper presents an algorithm for online adaptation of the kernel width parameter in information theoretic cost functions used for adaptive system training. Training algorithms that optimize information theoretic quantities such as entropy involve choosing a kernel size for their sample estimators. The kernel size essentially dictates the shape of the performance surface of the cost function over which the system parameters adapt. This, in turn, governs factors such as the speed of adaptation and the presence of local minima. We show results of using the Minimum Error Entropy (MEE) criterion with the proposed adaptive kernel algorithm for training a time delay neural network. Our simulations show that an adaptive kernel width yields faster parameter convergence than fixed kernel widths.
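For orientation, the sketch below illustrates the kind of quantities the abstract refers to: the MEE criterion is typically evaluated through the quadratic information potential of the errors, estimated with a Gaussian Parzen kernel whose width is the parameter being adapted. The kernel-width update shown here is a generic leave-one-out likelihood surrogate chosen for illustration; it is an assumption, not the authors' exact KL-divergence-based rule, and all function names are hypothetical.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    """Gaussian (Parzen) kernel of width sigma."""
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def information_potential(errors, sigma):
    """Quadratic information potential V(e) of the error samples.
    MEE training maximizes V(e), which is equivalent to minimizing
    Renyi's quadratic entropy of the error."""
    diffs = errors[:, None] - errors[None, :]
    # Pairwise interactions effectively use a kernel of width sigma*sqrt(2).
    return gaussian_kernel(diffs, sigma * np.sqrt(2.0)).mean()

def adapt_kernel_width(errors, sigma, lr=0.05):
    """One gradient-ascent step on the leave-one-out Parzen log-likelihood
    of the errors with respect to sigma -- a stand-in for an online
    kernel-width update (hypothetical, not the paper's exact rule)."""
    n = len(errors)
    diffs = errors[:, None] - errors[None, :]
    k = gaussian_kernel(diffs, sigma)
    np.fill_diagonal(k, 0.0)                    # leave-one-out estimate
    p = k.sum(axis=1) / (n - 1)                 # density at each error sample
    # d/dsigma of the Gaussian kernel: G * (u^2 - sigma^2) / sigma^3
    dk = k * ((diffs**2 - sigma**2) / sigma**3)
    np.fill_diagonal(dk, 0.0)
    dp = dk.sum(axis=1) / (n - 1)
    grad = np.sum(dp / np.maximum(p, 1e-12))    # gradient of the log-likelihood
    return max(sigma + lr * grad / n, 1e-3)     # keep sigma strictly positive
```

In an online setting, each training step would compute the current error samples, update the adaptive system parameters using the gradient of the information potential, and then call a kernel-width update such as the one above so that the estimator tracks the shrinking error distribution.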
Keywords :
adaptive systems; costing; learning (artificial intelligence); minimum entropy methods; neural nets; adaptive system training; information theoretic cost functions; kernel width adaptation; minimum error entropy; time delay neural network; training algorithms; Adaptive filters; Adaptive systems; Convergence; Cost function; Delay effects; Entropy; Kernel; Neural networks; Signal processing algorithms; Statistics; Information Theoretic Learning; Kernel Width Selection; Kullback-Leibler Divergence; Time Delay Neural Network;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2010 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Dallas, TX
ISSN :
1520-6149
Print_ISBN :
978-1-4244-4295-9
Electronic_ISBN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2010.5495035
Filename :
5495035