DocumentCode :
699866
Title :
Regularized dictionary learning for sparse approximation
Author :
Yaghoobi, M. ; Blumensath, T. ; Davies, M.
Author_Institution :
Inst. for Digital Commun., Univ. of Edinburgh, Edinburgh, UK
fYear :
2008
fDate :
25-29 Aug. 2008
Firstpage :
1
Lastpage :
5
Abstract :
Sparse signal models approximate signals using a small number of elements from a large set of vectors, called a dictionary. The success of such methods relies on the dictionary fitting the signal structure. Therefore, the dictionary has to be designed to fit the signal class of interest. This paper uses a general formulation that allows the dictionary to be learned from the data with some a priori information about the dictionary. In this formulation, a universal cost function is proposed and practical algorithms are presented to minimize this cost under different constraints on the dictionary. The proposed methods are compared with previous approaches using synthetic and real data. Simulations highlight the advantages of the proposed methods over other currently available dictionary learning strategies.
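For orientation, the following is a minimal sketch of a regularized dictionary learning loop of the kind the abstract describes: alternating between sparse coding of the coefficients and a penalized update of the dictionary. The specific cost (a Frobenius-norm penalty on the dictionary plus an l1 penalty on the coefficients), the optimization steps, and all function and variable names are illustrative assumptions, not the paper's universal cost function or its algorithms.

    import numpy as np

    def regularized_dictionary_learning(Y, n_atoms, lam=0.1, gamma=0.05,
                                        n_iter=50, seed=0):
        """Illustrative alternating minimization of
        0.5*||Y - D X||_F^2 + lam*||X||_1 + 0.5*gamma*||D||_F^2
        (an assumed regularized cost, not the paper's formulation)."""
        rng = np.random.default_rng(seed)
        d, n = Y.shape
        D = rng.standard_normal((d, n_atoms))
        D /= np.linalg.norm(D, axis=0, keepdims=True)  # unit-norm initial atoms
        X = np.zeros((n_atoms, n))

        for _ in range(n_iter):
            # Sparse coding: one proximal-gradient (ISTA) step on X.
            step = 1.0 / (np.linalg.norm(D, 2) ** 2 + 1e-12)
            Z = X - step * (D.T @ (D @ X - Y))
            X = np.sign(Z) * np.maximum(np.abs(Z) - step * lam, 0.0)

            # Dictionary update: gradient step on the Frobenius-regularized fit.
            grad_D = (D @ X - Y) @ X.T + gamma * D
            D -= grad_D / (np.linalg.norm(X, 2) ** 2 + gamma + 1e-12)

        return D, X

    # Toy usage: learn 20 atoms from 200 random 10-dimensional signals.
    Y = np.random.default_rng(1).standard_normal((10, 200))
    D, X = regularized_dictionary_learning(Y, n_atoms=20)
    print(D.shape, X.shape)  # (10, 20) (20, 200)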
Keywords :
approximation theory; learning (artificial intelligence); signal representation; cost minimization; regularized dictionary learning strategy; signal representations; signal structure; sparse approximation; sparse signal models; universal cost function; Approximation methods; Cost function; Dictionaries; Linear programming; Signal processing algorithms; Vectors;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2008 16th European Signal Processing Conference
Conference_Location :
Lausanne
ISSN :
2219-5491
Type :
conf
Filename :
7080398