DocumentCode :
3688661
Title :
Maximizing margin quality and quantity
Author :
Yuanzhe Bei;Pengyu Hong
Author_Institution :
Computer Science Department, Brandeis University, 415 South St., Waltham, MA 02453, USA
fYear :
2015
Firstpage :
1
Lastpage :
6
Abstract :
The large-margin principle has been widely applied to learn classifiers with good generalization power. While tremendous effort has been devoted to developing machine learning techniques that maximize margin quantity, little attention has been paid to ensuring margin quality. In this paper, we propose a new framework that aims to achieve superior generalizability by considering not only margin quantity but also margin quality. An instantiation of the framework is derived by deploying a max-min entropy principle to maximize margin quality, in addition to a traditional means of maximizing margin quantity. We develop an iterative learning algorithm to solve this instantiation. We compared the algorithm with several widely used machine learning techniques (e.g., Support Vector Machines, decision trees, naive Bayes classifiers, and k-nearest neighbors) and several other large-margin learners (e.g., RELIEF, Simba, G-flip, and LOGO) on a number of UCI machine learning datasets and gene expression datasets. The results demonstrate the effectiveness of the new framework and algorithm.
Keywords :
"Classification algorithms","Training","Cost function","Kernel","Gene expression","Entropy","Support vector machines"
Publisher :
ieee
Conference_Title :
2015 IEEE 25th International Workshop on Machine Learning for Signal Processing (MLSP)
Type :
conf
DOI :
10.1109/MLSP.2015.7324382
Filename :
7324382