Title :
Comparing measures of sparsity
Author :
Hurley, Niall ; Rickard, Scott
Author_Institution :
CASL, Univ. Coll. Dublin, Dublin
Abstract :
Sparsity is a recurrent theme in machine learning and is used to improve the performance of algorithms such as non-negative matrix factorization and the LOST algorithm. In fields such as blind source separation, compression, sampling and signal analysis, sparse representations of signals have proved not merely useful but a key factor in the success of algorithms. Intuitively, a sparse representation is one in which a small number of coefficients contain a large proportion of the energy. Our aim in this paper is to compare several commonly used sparsity measures according to intuitive attributes that a sparsity measure should have. We discuss six properties (Robin Hood, Scaling, Rising Tide, Cloning, Bill Gates and Babies) that we believe a sparsity measure should satisfy. The main contribution of this paper is a table that classifies commonly used sparsity measures according to whether or not they satisfy these six properties. Only one of the measures considered satisfies all six: the Gini index.
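The abstract's intuition, that a representation is sparse when a few coefficients carry most of the energy, can be illustrated with the Gini index it singles out. A minimal sketch of the normalized Gini formulation used in the sparsity literature (the function name and example vectors are ours, not from the paper):

```python
def gini_index(c):
    """Gini index of a coefficient vector: 0 for a uniform (least
    sparse) vector, rising to 1 - 1/N when all energy sits in a
    single coefficient (most sparse for length N)."""
    c = sorted(abs(x) for x in c)          # ascending magnitudes
    N = len(c)
    total = sum(c)
    if total == 0:
        return 0.0
    # Gini = 1 - 2 * sum_{k=1}^{N} (c_(k) / ||c||_1) * ((N - k + 1/2) / N)
    return 1.0 - 2.0 * sum(ck / total * (N - k + 0.5) / N
                           for k, ck in enumerate(c, start=1))

print(gini_index([1, 1, 1, 1]))   # 0.0  (uniform: least sparse)
print(gini_index([0, 0, 0, 10]))  # 0.75 (= 1 - 1/4: all energy in one coefficient)
```

The measure is scale-invariant (coefficients are normalized by the l1 norm), which is one reason it behaves well under the properties the paper examines.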
Keywords :
learning (artificial intelligence); matrix decomposition; signal representation; statistical distributions; Gini index; LOST algorithm; babies property; bill gates property; blind source separation; cloning property; machine learning; nonnegative matrix factorization; rising tide property; robin hood property; scaling property; signal analysis; signal compression; signal representation; signal sampling; sparsity measure; statistical distribution; Blind source separation; Educational institutions; Loss measurement; Machine learning; Machine learning algorithms; Sampling methods; Signal analysis; Source separation; Tides; User centered design;
Conference_Title :
2008 IEEE Workshop on Machine Learning for Signal Processing (MLSP 2008)
Conference_Location :
Cancun
Print_ISBN :
978-1-4244-2375-0
Electronic_ISSN :
1551-2541
DOI :
10.1109/MLSP.2008.4685455