DocumentCode :
2172752
Title :
Local linear approximation of principal curve projections
Author :
Zhang, Peng ; Ataer-Cansizoglu, Esra ; Erdogmus, Deniz
Author_Institution :
Cognitive Syst. Lab., Northeastern Univ., Boston, MA, USA
fYear :
2012
fDate :
23-26 Sept. 2012
Firstpage :
1
Lastpage :
6
Abstract :
In previous work we introduced principal surfaces as hyperridges of probability distributions in a differential geometrical sense. Specifically, given an n-dimensional probability distribution over real-valued random vectors, a point on the d-dimensional principal surface is a local maximizer of the distribution in the subspace orthogonal to the principal surface at that point. For twice continuously differentiable distributions, the surface is characterized by the gradient and the Hessian of the distribution. Furthermore, the nonlinear projections of data points onto the principal surface for dimension reduction are ideally given by the solution trajectories of differential equations that are initialized at the data point and whose tangent vectors are determined by the Hessian eigenvectors. In practice, data dimension reduction using numerical-integration-based differential equation solvers is found to be computationally expensive for most machine learning applications. Consequently, in this paper, we propose a local linear approximation that achieves this dimension reduction without significant loss of accuracy while reducing computational complexity. The proposed method is demonstrated on synthetic datasets.
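The projection described in the abstract can be illustrated with a minimal sketch (not the authors' code): for a kernel density estimate, the gradient and Hessian are evaluated at the current point, the Hessian eigenvectors with the smallest eigenvalues span the subspace orthogonal to the d-dimensional principal surface, and a Newton-style step restricted to that subspace replaces the numerical integration of the projection ODE. All names (`kde_density`, `project_local_linear`) and the specific damping-free Newton step are illustrative assumptions, not the paper's exact algorithm.

```python
# Hedged sketch of a local linear principal-curve projection step.
# Assumptions: Gaussian KDE density, finite-difference derivatives, and a
# single Newton-like step in the Hessian's orthogonal eigen-subspace.
import numpy as np

def kde_density(x, samples, h=0.3):
    """Unnormalized Gaussian KDE of p(x); the ridge location is
    invariant to the missing normalization constant."""
    d2 = np.sum((samples - x) ** 2, axis=1)
    return np.mean(np.exp(-d2 / (2.0 * h * h)))

def grad_hess(f, x, eps=1e-3):
    """Central finite-difference gradient and Hessian of scalar f at x."""
    n = x.size
    g = np.zeros(n)
    H = np.zeros((n, n))
    f0 = f(x)
    for i in range(n):
        ei = np.zeros(n); ei[i] = eps
        g[i] = (f(x + ei) - f(x - ei)) / (2.0 * eps)
        H[i, i] = (f(x + ei) - 2.0 * f0 + f(x - ei)) / eps**2
        for j in range(i):
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = H[j, i] = (
                f(x + ei + ej) - f(x + ei - ej)
                - f(x - ei + ej) + f(x - ei - ej)
            ) / (4.0 * eps**2)
    return g, H

def project_local_linear(x, f, d=1):
    """One Newton-like step toward the d-dim principal surface of density f,
    restricted to the span of the n-d smallest-eigenvalue Hessian eigenvectors
    (the directions orthogonal to the surface near a ridge)."""
    g, H = grad_hess(f, x)
    w, V = np.linalg.eigh(H)           # eigenvalues in ascending order
    V_perp = V[:, : x.size - d]        # subspace orthogonal to the surface
    step = V_perp @ np.linalg.solve(V_perp.T @ H @ V_perp, V_perp.T @ g)
    return x - step
```

On a synthetic ring dataset (samples on the unit circle), iterating `project_local_linear` from a point just outside the ring pulls it onto the density ridge near radius 1, without integrating the Hessian-eigenvector ODE.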
Keywords :
Hessian matrices; approximation theory; computational complexity; differential equations; eigenvalues and eigenfunctions; integration; learning (artificial intelligence); random processes; vectors; Hessian eigenvectors; computational complexity reduction; continuously differentiable distributions; d-dimensional principal surface; data dimension reduction; data point nonlinear projections; local linear approximation; local maximizer; machine learning; n-dimensional probability distribution; numerical integration-based differential equation solvers; principal curve projections; real-valued random vectors; solution trajectories; subspace orthogonal; synthetic datasets; tangent vectors; Kernel; Linear approximation; Machine learning; Manifolds; Principal component analysis; Vectors; Manifold Learning; Nonlinear Dimension Reduction; Principal Curve Projection;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2012 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
Conference_Location :
Santander
ISSN :
1551-2541
Print_ISBN :
978-1-4673-1024-6
Electronic_ISBN :
1551-2541
Type :
conf
DOI :
10.1109/MLSP.2012.6349764
Filename :
6349764