DocumentCode :
1748944
Title :
Independent components analysis using Renyi's mutual information and Legendre density estimation
Author :
Erdogmus, Deniz ; Hild, Kenneth E., II ; Principe, Jose C.
Author_Institution :
Comput. Neuroeng. Lab., Florida Univ., Gainesville, FL, USA
Volume :
4
fYear :
2001
fDate :
2001
Firstpage :
2762
Abstract :
We have previously proposed the use of quadratic Renyi's mutual information (1970), estimated using Parzen windowing, as an ICA criterion and showed that it utilizes data more efficiently than classical algorithms like InfoMax and FastICA. We suggested the use of Renyi's definition of information theoretic quantities rather than Shannon's definitions, since Shannon's definitions are already included in Renyi's as special cases. In the estimation of probability densities using kernel methods, the choice of the kernel width is an important issue that affects the overall performance of the system, and there is no known way of determining its optimal value. Legendre polynomial expansion of a probability distribution, on the other hand, has two advantages: hardware implementation is trivial, and it requires no parameter choice except the truncation point of the series. The rule for this assignment is simple: the longer the series, the more accurate the density estimate becomes. Thus, we combine these two schemes, namely Renyi's entropy and Legendre polynomial expansion for probability density function estimation, to obtain a simple ICA algorithm. This algorithm is then tested on blind source separation, time-series analysis, and data reduction.
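(Editorial illustration, not part of the original record.) The abstract combines two standard pieces: Renyi's quadratic entropy, H_2(X) = -log ∫ p(x)^2 dx, and a Legendre-series estimate of p(x) truncated at some order K. A minimal sketch of that combination follows; it is not the authors' implementation, and the moment-based coefficient formula, the truncation order K, and the rescaling of data to [-1, 1] are illustrative assumptions.

```python
# Sketch: Renyi's quadratic entropy from a truncated Legendre-series density
# estimate on [-1, 1]. Assumes samples have already been rescaled to [-1, 1].
import numpy as np
from numpy.polynomial.legendre import legval

def legendre_coeffs(x, K):
    """Moment-based coefficients c_k of p(x) ~ sum_k c_k P_k(x) on [-1, 1]:
    c_k = (2k+1)/2 * E[P_k(X)], with the expectation replaced by a sample mean."""
    coeffs = []
    for k in range(K + 1):
        e_k = np.zeros(k + 1)
        e_k[k] = 1.0  # coefficient vector selecting the k-th Legendre polynomial
        coeffs.append((2 * k + 1) / 2.0 * legval(x, e_k).mean())
    return np.array(coeffs)

def renyi_quadratic_entropy(x, K=8):
    """H_2 = -log integral p(x)^2 dx. Orthogonality of Legendre polynomials,
    integral P_j P_k dx = 2/(2k+1) * delta_jk, gives a closed form in the c_k."""
    c = legendre_coeffs(x, K)
    k = np.arange(K + 1)
    integral_p2 = np.sum(c ** 2 * 2.0 / (2 * k + 1))
    return -np.log(integral_p2)

# Usage: for U(-1, 1) the true value is -log(1/2) = log 2 ~ 0.693.
samples = np.random.uniform(-1.0, 1.0, size=2000)
print(renyi_quadratic_entropy(samples, K=8))
```

An ICA criterion along the lines described in the abstract would apply such an entropy estimate to the separator outputs and adapt the demixing matrix by gradient descent; only the truncation order K needs to be chosen.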
Keywords :
principal component analysis; ICA criterion; Legendre density estimation; Legendre polynomial expansion; Parzen windowing; blind source separation; data reduction; independent components analysis; kernel width; probability density estimation; probability distribution; quadratic Renyi mutual information; time-series analysis; Blind source separation; Entropy; Hardware; Independent component analysis; Kernel; Mutual information; Polynomials; Probability density function; Probability distribution; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2001. Proceedings. IJCNN '01. International Joint Conference on
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.938810
Filename :
938810