Title :
Fast Kernel Sparse Representation Approaches for Classification
Author :
Yifeng Li; Alioune Ngom
Author_Institution :
Sch. of Comput. Sci., Univ. of Windsor, Windsor, ON, Canada
Abstract :
Sparse representation involves two related procedures: sparse coding and dictionary learning. Learning a dictionary from data provides a concise knowledge representation, and learning it in a higher-dimensional feature space may represent a signal better. However, with existing algorithms it is usually computationally expensive to learn a dictionary when the number of training samples and/or dimensions is very large. In this paper, we propose a kernel dictionary learning framework for three models. We show that the optimization has dimension-free and parallel properties, and we devise fast active-set algorithms for this framework. We investigate their performance on classification tasks. Experimental results show that our kernel sparse representation approaches obtain better accuracy than their linear counterparts, and that our active-set algorithms are faster than existing interior-point and proximal algorithms.
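The "dimension-free" property the abstract mentions means sparse coding in feature space needs only kernel evaluations, never the (possibly infinite-dimensional) feature map itself. A minimal sketch of that idea for the non-negative least squares model, assuming an RBF kernel and using SciPy's Lawson–Hanson active-set NNLS solver (the function names, `gamma`, and `ridge` values here are illustrative, not the paper's):

```python
import numpy as np
from scipy.optimize import nnls
from scipy.linalg import cholesky, solve_triangular

def rbf(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_nnls_code(x, D, gamma=0.5, ridge=1e-8):
    """Non-negative sparse code of x against dictionary atoms D (rows),
    computed entirely through kernel evaluations: only K(D, D) and
    k(D, x) are needed, never the feature map phi explicitly."""
    K = rbf(D, D, gamma) + ridge * np.eye(len(D))  # Gram matrix in feature space
    kappa = rbf(D, x[None, :], gamma).ravel()      # k(d_i, x) for each atom
    # ||phi(x) - Phi(D) w||^2 = k(x,x) - 2 w.kappa + w^T K w.
    # With K = L L^T this equals ||L^T w - L^{-1} kappa||^2 + const,
    # a standard NNLS problem handled by an active-set method.
    L = cholesky(K, lower=True)
    b = solve_triangular(L, kappa, lower=True)
    w, _ = nnls(L.T, b)  # Lawson-Hanson active-set NNLS
    return w

rng = np.random.default_rng(0)
D = rng.normal(size=(10, 5))            # 10 dictionary atoms in R^5
x = D[3] + 0.01 * rng.normal(size=5)    # a signal close to atom 3
w = kernel_nnls_code(x, D)
print(np.argmax(w))  # the code should concentrate its weight on atom 3
```

Because the objective depends on the atoms only through `K` and `kappa`, each signal's code can be computed independently, which is the parallel property the abstract refers to.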
Keywords :
data structures; knowledge representation; learning (artificial intelligence); set theory; signal classification; signal representation; classification performance; dictionary learning procedure; fast active-set algorithm; interior-point algorithm; kernel dictionary learning framework; kernel sparse representation approach; knowledge representation; proximal algorithm; signal representation; sparse coding procedure; Accuracy; Dictionaries; Equations; Kernel; Mathematical model; Optimization; Training; $l_1$ regularization; dictionary learning; kernel sparse representation; non-negative least squares; sparse coding
Conference_Titel :
2012 IEEE 12th International Conference on Data Mining (ICDM)
Conference_Location :
Brussels, Belgium
Print_ISBN :
978-1-4673-4649-8
DOI :
10.1109/ICDM.2012.133