Title :
The e-PCA and m-PCA: dimension reduction of parameters by information geometry
Author_Institution :
Neurosci. Res. Inst., Nat. Inst. of Adv. Ind. Sci. & Technol., Tsukuba, Japan
Abstract :
We propose a method for extracting a low-dimensional structure from a set of parameters of probability distributions. Through an information-geometric interpretation, we show that there are two kinds of flat structures that can be fitted: e-PCA and m-PCA. We derive alternating procedures for finding these low-dimensional structures. Each alternating step reduces to a nonlinear equation, which can be solved analytically in some special cases; otherwise, gradient-type methods, which we also derive, must be applied. Since the overall algorithm may converge to a local optimum, we also propose a method for finding a good initial solution using metric information.
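To illustrate the kind of alternating procedure described in the abstract, here is a minimal sketch of an e-PCA-style fit for product-Bernoulli distributions: each distribution is represented by its natural parameters (logits), an affine k-dimensional subspace is fitted in that e-flat coordinate system, and the total KL divergence is reduced by alternating gradient steps on the scores and on the basis. The function names (`e_pca`, `kl_bernoulli`), the toy data, and the step sizes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def kl_bernoulli(p, q):
    # Total KL divergence between product-Bernoulli parameter arrays.
    eps = 1e-12
    return np.sum(p * np.log((p + eps) / (q + eps))
                  + (1 - p) * np.log((1 - p + eps) / (1 - q + eps)))

def e_pca(P, k, iters=1000, lr=0.1, seed=0):
    """Fit a k-dim affine subspace in natural (logit) coordinates to
    Bernoulli parameter vectors P (n x d), minimising sum_i KL(p_i || q_i)
    by alternating gradient steps on scores W and basis (mu, V).
    Illustrative sketch, not the paper's exact algorithm."""
    rng = np.random.default_rng(seed)
    n, d = P.shape
    m = P.mean(0)
    mu = np.log(m / (1 - m))                  # initialise offset at mean logits
    V = rng.normal(scale=0.1, size=(d, k))    # subspace basis
    W = rng.normal(scale=0.1, size=(n, k))    # per-distribution scores
    for _ in range(iters):
        Q = sigmoid(mu + W @ V.T)             # model Bernoulli means, n x d
        R = Q - P                             # dKL/dtheta for Bernoulli natural params
        W -= lr * (R @ V)                     # score step, basis held fixed
        Q = sigmoid(mu + W @ V.T)
        R = Q - P
        V -= lr * (R.T @ W) / n               # basis step, scores held fixed
        mu -= lr * R.mean(0)
    return mu, V, W

# Toy data: parameters lying on a 1-d e-flat subspace (exact in logit space).
rng = np.random.default_rng(1)
t = rng.normal(size=(40, 1))
P = sigmoid(t @ np.array([[1.0, -0.5, 2.0]]) + 0.2)

kl_before = kl_bernoulli(P, np.tile(P.mean(0), (P.shape[0], 1)))  # fit to mean only
mu, V, W = e_pca(P, k=1)
kl_after = kl_bernoulli(P, sigmoid(mu + W @ V.T))
print(kl_before, kl_after)
```

Because the gradient of KL(p || q(θ)) with respect to the natural parameter of a Bernoulli is simply q − p, each alternating step is a plain gradient step; in special cases (e.g. Gaussians with fixed covariance) the corresponding nonlinear equation is solvable in closed form and the procedure reduces to ordinary PCA.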
Keywords :
geometry; gradient methods; nonlinear equations; principal component analysis; probability; information geometry; metric information; parameter dimension reduction; probability distributions; data mining; fitting; Gaussian distribution; kernel; neuroscience; stochastic processes;
Conference_Titel :
2004 IEEE International Joint Conference on Neural Networks (IJCNN), Proceedings
Print_ISBN :
0-7803-8359-1
DOI :
10.1109/IJCNN.2004.1379884