DocumentCode :
155684
Title :
Learning and storing the parts of objects: IMF
Author :
de Frein, Ruairi
Author_Institution :
Telecommunications Software & Systems Group, Ireland
fYear :
2014
fDate :
21-24 Sept. 2014
Firstpage :
1
Lastpage :
6
Abstract :
A central concern for many learning algorithms is how to efficiently store what the algorithm has learned. An algorithm for the compression of Nonnegative Matrix Factorizations is presented. Compression is achieved by embedding the factorization in an encoding routine. Its performance is investigated using two standard test images, Peppers and Barbara. The compression ratio (18:1) achieved by the proposed Matrix Factorization improves the storability of Nonnegative Matrix Factorizations without significantly degrading accuracy (only ≈1–3 dB of degradation is introduced). We learn as before, but storage is cheaper.
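To make the idea concrete, the following is a minimal sketch of the underlying pipeline the abstract describes: factorize nonnegative data with standard multiplicative-update NMF (Lee & Seung), then quantize the learned factors so they are cheaper to store, at the cost of a few dB of reconstruction accuracy. The rank, bit depth, and uniform quantizer below are illustrative assumptions, not the paper's actual IMF encoding routine.

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((64, 48))            # nonnegative stand-in for image data
r = 8                               # factorization rank (assumed)

# Multiplicative updates minimising the Frobenius error ||V - WH||_F
W = rng.random((64, r))
H = rng.random((r, 48))
eps = 1e-9                          # guards against division by zero
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

def quantize(A, bits=4):
    """Uniform scalar quantization of A to 2**bits levels (illustrative)."""
    levels = 2 ** bits - 1
    scale = A.max() / levels
    return np.round(A / scale) * scale

Wq, Hq = quantize(W), quantize(H)   # store these instead of W, H

def snr(X, Y):
    """Reconstruction SNR in dB of approximation Y to reference X."""
    return 10 * np.log10((X ** 2).sum() / ((X - Y) ** 2).sum())

print(f"SNR, full-precision factors: {snr(V, W @ H):.1f} dB")
print(f"SNR, quantized factors:      {snr(V, Wq @ Hq):.1f} dB")
```

Storing `Wq` and `Hq` at 4 bits per entry instead of two 64-bit float matrices is where the storage saving comes from; the gap between the two printed SNRs is the accuracy price paid for it.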
Keywords :
matrix decomposition; signal processing; Barbara; IMF; Peppers; compression ratio; learning algorithm; nonnegative matrix factorization; Approximation methods; Dictionaries; Encoding; Quantization (signal); Signal processing algorithms; Signal to noise ratio; Vectors; compression; matrix factorization;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
Conference_Location :
Reims
Type :
conf
DOI :
10.1109/MLSP.2014.6958926
Filename :
6958926