Title :
Kernel adaptive filtering with maximum correntropy criterion
Author :
Zhao, Songlin ; Chen, Badong ; Príncipe, José C.
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Florida, Gainesville, FL, USA
Date :
July 31 2011-Aug. 5 2011
Abstract :
Kernel adaptive filters have drawn increasing attention due to advantages such as universal nonlinear approximation with universal kernels and linearity and convexity in a Reproducing Kernel Hilbert Space (RKHS). Among them, the kernel least mean square (KLMS) algorithm deserves particular attention because of its simplicity and sequential learning approach. Like most conventional adaptive filtering algorithms, the KLMS adopts the mean square error (MSE) as its adaptation cost. However, second-order statistics alone are often not suitable for nonlinear and non-Gaussian situations. Therefore, various non-MSE criteria, which involve higher-order statistics, have received increasing interest. Recently, correntropy, as an alternative to MSE, has been successfully applied in nonlinear and non-Gaussian signal processing and machine learning. This motivates us to develop a new kernel adaptive algorithm, called kernel maximum correntropy (KMC), which combines the advantages of the KLMS with the maximum correntropy criterion (MCC). We also study its convergence and self-regularization properties using the energy conservation relation. The superior performance of the new algorithm is demonstrated by simulation experiments on a noisy frequency-doubling problem.
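Illustration (not from the record itself): the abstract describes replacing the MSE cost of KLMS with a correntropy cost. A minimal sketch of the resulting update, assuming a Gaussian kernel and a stochastic-gradient ascent on the correntropy of the error, is below; the function names, step size, and bandwidths are all illustrative choices, not values from the paper.

```python
import numpy as np

def gaussian_kernel(x, y, h=1.0):
    """Gaussian (universal) kernel with bandwidth h."""
    return np.exp(-np.sum((np.asarray(x) - np.asarray(y)) ** 2) / (2 * h ** 2))

def kmc_train(X, d, eta=0.5, h=1.0, sigma=1.0):
    """Sketch of a KMC-style sequential update: a KLMS-type filter whose
    coefficient for each new center is scaled by the correntropy-induced
    factor exp(-e^2 / (2 sigma^2)), which down-weights large (outlier) errors."""
    centers, coeffs = [], []
    for x, target in zip(X, d):
        # current prediction from the growing kernel expansion
        y = sum(a * gaussian_kernel(c, x, h) for a, c in zip(coeffs, centers))
        e = target - y
        # MCC weighting: near-Gaussian errors get ~full LMS step,
        # impulsive errors are attenuated exponentially
        centers.append(x)
        coeffs.append(eta * np.exp(-e ** 2 / (2 * sigma ** 2)) * e)
    return centers, coeffs

def kmc_predict(x, centers, coeffs, h=1.0):
    """Evaluate the learned kernel expansion at a new input x."""
    return sum(a * gaussian_kernel(c, x, h) for a, c in zip(coeffs, centers))
```

With sigma large, the exponential factor approaches 1 and the update reduces to the ordinary KLMS step, which matches the abstract's framing of MCC as a robust generalization of the MSE cost.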
Keywords :
Hilbert spaces; adaptive filters; least mean squares methods; maximum entropy methods; higher-order statistics; kernel adaptive filtering; kernel least mean square algorithm; kernel maximum correntropy; maximum correntropy criterion; mean square error; reproducing kernel Hilbert space; second-order statistics; sequential learning approach; universal kernels; universal nonlinear approximation; Approximation algorithms; Approximation methods; Cost function; Kernel; Noise; Random variables; Robustness;
Conference_Titel :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA
Print_ISBN :
978-1-4244-9635-8
DOI :
10.1109/IJCNN.2011.6033473