DocumentCode
1246087
Title
Optimizing the kernel in the empirical feature space
Author
Xiong, Huilin; Swamy, M.N.S.; Ahmad, M. Omair
Author_Institution
Dept. of Electr. & Comput. Eng., Concordia Univ., Montreal, Que., Canada
Volume
16
Issue
2
fYear
2005
fDate
3/1/2005
Firstpage
460
Lastpage
474
Abstract
In this paper, we present a method of kernel optimization by maximizing a measure of class separability in the empirical feature space, a Euclidean space in which the training data are embedded in such a way that the geometrical structure of the data in the feature space is preserved. Employing a data-dependent kernel, we derive an effective kernel optimization algorithm that maximizes the class separability of the data in the empirical feature space. It is shown that there exists a close relationship between the class separability measure introduced here and the alignment measure defined recently by Cristianini. Extensive simulations are carried out, which show that the optimized kernel is more adaptive to the input data and leads to a substantial, sometimes significant, improvement in the performance of various data classification algorithms.
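Illustration (not part of the original record): the class separability criterion mentioned in the abstract can be evaluated directly from the kernel matrix, since scatter traces in the empirical feature space reduce to sums of kernel entries. The sketch below is a minimal NumPy illustration of that idea only; the Gaussian base kernel, the helper names, and the toy data are assumptions for demonstration, not the paper's data-dependent kernel or its optimization algorithm.

# Minimal sketch: a Fisher-style separability measure tr(S_b)/tr(S_w)
# computed from a kernel matrix K, i.e., evaluated in the empirical
# feature space without forming the feature map explicitly.
import numpy as np

def gaussian_kernel(X, gamma=1.0):
    # RBF kernel matrix from pairwise squared Euclidean distances (assumed base kernel).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def class_separability(K, y):
    # tr(S_t) = sum_i K_ii - (1/n) * sum_{i,j} K_ij   (total scatter)
    # tr(S_w) = sum_c [ sum_{i in c} K_ii - (1/n_c) * sum_{i,j in c} K_ij ]
    # tr(S_b) = tr(S_t) - tr(S_w)
    n = K.shape[0]
    total = np.trace(K) - K.sum() / n
    within = 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[np.ix_(idx, idx)]
        within += np.trace(Kc) - Kc.sum() / len(idx)
    between = total - within
    return between / max(within, 1e-12)

# Toy usage: two Gaussian blobs; compare separability across kernel widths.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
for gamma in (0.01, 0.1, 1.0):
    J = class_separability(gaussian_kernel(X, gamma), y)
    print(f"gamma={gamma:<5} separability J={J:.3f}")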
Keywords
feature extraction; learning (artificial intelligence); optimisation; Euclidean space; class separability; data-dependent kernel; empirical feature space; kernel optimization; Classification algorithms; Kernel; Machine learning; Machine learning algorithms; Optimization methods; Pattern recognition; Principal component analysis; Signal processing algorithms; Support vector machines; Training data; data classification; feature space; kernel machines; Empirical Research; Neural Networks (Computer)
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/TNN.2004.841784
Filename
1402506