Title :
Learning Isometric Separation Maps
Author :
Vasiloglou, Nikolaos; Gray, Alexander G.; Anderson, David V.
Author_Institution :
Georgia Inst. of Technol., Atlanta, GA, USA
Abstract :
Maximum Variance Unfolding (MVU) and its variants have been very successful in embedding data manifolds in lower-dimensional spaces, often revealing the true intrinsic dimension. In this paper we show how to incorporate supervised class information into an MVU-like method without breaking its convexity. We call this method the Isometric Separation Map, and we show that the resulting kernel matrix can be used in a binary/multiclass Support Vector Machine-like method in a semi-supervised (transductive) framework. We also show that the method always finds a kernel matrix that linearly separates the training data exactly, without projecting them into infinite-dimensional spaces. In traditional SVMs we choose a kernel and hope that the data become linearly separable in the kernel space; here, instead, the hyperplane is chosen ad hoc and the kernel is trained so that the data are always linearly separable. Comparisons with large-margin SVMs show comparable performance.
Keywords :
convex programming; learning (artificial intelligence); operating system kernels; pattern classification; support vector machines; convexity; data-manifolds; hyperplane; infinite dimensional space; isometric separation map; kernel matrix; maximum variance unfolding; supervised class information; support vector machine; Bandwidth; Hilbert space; Kernel; Laplace equations; Learning systems; Space technology; Support vector machine classification; Support vector machines; Testing; Training data;
Conference_Titel :
2009 IEEE International Workshop on Machine Learning for Signal Processing (MLSP 2009)
Conference_Location :
Grenoble, France
Print_ISBN :
978-1-4244-4947-7
Electronic_ISBN :
978-1-4244-4948-4
DOI :
10.1109/MLSP.2009.5306212