Title :
Similarity Learning in Nearest Neighbor and Relief Algorithm
Author :
Qamar, Ali Mustafa ; Gaussier, Eric
Author_Institution :
Lab. d'Inf. de Grenoble, Univ. de Grenoble, Grenoble, France
Abstract :
In this paper, we study the links between RELIEF, a well-known feature re-weighting algorithm, and SiLA, a similarity learning algorithm. On the one hand, SiLA aims to directly reduce the leave-one-out error, or 0-1 loss, by reducing the number of mistakes on unseen examples. On the other hand, RELIEF has been shown to act as a distance learning algorithm that optimizes a linear utility function with maximum margin. We first propose a version of this algorithm for similarity learning, called RBS (RELIEF-Based Similarity learning). Like RELIEF, and unlike SiLA, RBS does not try to optimize the leave-one-out error or 0-1 loss, and it does not perform very well in practice, as we illustrate on two UCI collections. We therefore introduce a stricter version of RBS, called sRBS, which relies on a cost function closer to the 0-1 loss. Experiments conducted on several datasets illustrate the different behaviors of these algorithms when learning similarities for kNN classification. In particular, the results indicate that the 0-1 loss is a more appropriate cost function than the one implicitly used by RELIEF.
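The abstract describes RELIEF as a feature re-weighting scheme in which, for each sampled instance, the weight vector is pulled toward features that separate the instance from its nearest miss (other class) and away from features that separate it from its nearest hit (same class). A minimal sketch of this classic RELIEF update, in NumPy, is given below; the paper's own SiLA, RBS, and sRBS algorithms are not reproduced here, and the function name and parameters are illustrative.

```python
import numpy as np

def relief_weights(X, y, n_iters=None, seed=None):
    """Sketch of the classic RELIEF feature re-weighting update.

    For each sampled instance x_i, find its nearest hit (same class)
    and nearest miss (other class) under the L1 distance, then update
    the weight vector so that features which separate the classes
    (large miss difference, small hit difference) gain weight.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if n_iters is None:
        n_iters = n
    w = np.zeros(d)
    for _ in range(n_iters):
        i = rng.integers(n)
        diffs = np.abs(X - X[i])            # per-feature distances to x_i
        dists = diffs.sum(axis=1)           # L1 distance to x_i
        dists[i] = np.inf                   # exclude the instance itself
        same = y == y[i]
        hit = np.argmin(np.where(same, dists, np.inf))
        miss = np.argmin(np.where(~same, dists, np.inf))
        w += diffs[miss] - diffs[hit]       # reward class-separating features
    return w
```

On a toy dataset where one feature tracks the class label and another is pure noise, the learned weight for the discriminative feature ends up clearly larger, which is the behavior a weighted kNN classifier would then exploit.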
Keywords :
learning (artificial intelligence); pattern classification; 0-1 loss; RELIEF-based similarity learning; SiLA; cost function; distance learning; feature re-weighting algorithm; kNN classification; learning similarities; leave-one-out error; linear utility function; maximum margin; nearest neighbor; relief algorithm; Equations; Machine learning algorithms; Nearest neighbor searches; Support vector machine classification; Symmetric matrices; Training; RELIEF algorithm; SiLA algorithm; machine learning; similarity learning;
Conference_Titel :
Machine Learning and Applications (ICMLA), 2010 Ninth International Conference on
Conference_Location :
Washington, DC
Print_ISBN :
978-1-4244-9211-4
DOI :
10.1109/ICMLA.2010.34