DocumentCode :
2061178
Title :
New operators for fixed-point theory: The sparsity-aware learning case
Author :
Slavakis, Konstantinos ; Kopsinis, Yannis ; Theodoridis, S.
Author_Institution :
Digital Technology Center, University of Minnesota, Minneapolis, MN, USA
fYear :
2013
fDate :
9-13 Sept. 2013
Firstpage :
1
Lastpage :
5
Abstract :
The present paper offers a link between fixed-point theory and thresholding, one of the key enablers in sparsity-promoting algorithms, associated mostly with non-convex penalizing functions. A novel family of operators, the partially quasi-nonexpansive mappings, is introduced to provide the necessary theoretical foundations. On this fixed-point theoretic ground, and motivated by hard thresholding, the generalized thresholding (GT) mapping is proposed, which encompasses hard and soft thresholding as well as recent advances in thresholding rules. GT is incorporated into an online/time-adaptive algorithm of linear complexity that demonstrates competitive performance with respect to computationally more demanding, state-of-the-art RLS- and proportionate-type sparsity-aware methods.
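For orientation, the following is a minimal sketch (not taken from the paper) of the two classical thresholding rules that the GT mapping is described as encompassing: hard and soft thresholding of a vector's entries. The NumPy-based implementation and the threshold level `lam` are illustrative assumptions; the paper's GT operator itself is not reproduced here.

```python
import numpy as np

def hard_threshold(x, lam):
    # Hard thresholding: keep entries whose magnitude exceeds lam, zero the rest.
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) > lam, x, 0.0)

def soft_threshold(x, lam):
    # Soft thresholding: shrink entries toward zero by lam
    # (the proximal mapping of the l1 norm).
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Both rules applied to the same vector with an illustrative threshold.
v = np.array([-2.0, -0.3, 0.0, 0.4, 1.5])
print(hard_threshold(v, 0.5))  # [-2.   0.   0.   0.   1.5]
print(soft_threshold(v, 0.5))  # [-1.5  0.   0.   0.   1. ]
```

Hard thresholding preserves the magnitude of surviving entries, while soft thresholding biases them toward zero; the GT mapping proposed in the paper is presented as a family covering these and more recent rules within the introduced fixed-point framework.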
Keywords :
adaptive filters; compressed sensing; concave programming; learning (artificial intelligence); fixed-point theory; generalized thresholding mapping; hard thresholding rules; linear complexity; non-convex penalizing functions; online/time-adaptive algorithm; partially quasi-nonexpansive mappings; proportionate-type sparsity-aware methods; soft thresholding rules; sparsity-aware learning; sparsity-promoting algorithms; thresholding; adaptive filtering; fixed-point theory; sparsity;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2013 Proceedings of the 21st European Signal Processing Conference (EUSIPCO)
Conference_Location :
Marrakech
Type :
conf
Filename :
6811737