DocumentCode
16685
Title
Sketching Sparse Matrices, Covariances, and Graphs via Tensor Products
Author
Dasarathy, Gautam ; Shah, Parikshit ; Bhaskar, Badri Narayan ; Nowak, Robert D.
Author_Institution
Dept. of Electr. & Comput. Eng., Univ. of Wisconsin, Madison, WI, USA
Volume
61
Issue
3
fYear
2015
fDate
March 2015
Firstpage
1373
Lastpage
1388
Abstract
This paper considers the problem of recovering an unknown sparse p×p matrix X from an m×m matrix Y = AXB^T, where A and B are known m×p matrices with m ≪ p. The main result shows that there exist constructions of the sketching matrices A and B so that even if X has O(p) nonzeros, it can be recovered exactly and efficiently using a convex program as long as these nonzeros are not concentrated in any single row/column of X. Furthermore, it suffices for the size of Y (the sketch dimension) to scale as m = O(√(# nonzeros in X) × log p). The results also show that the recovery is robust and stable in the sense that if X is equal to a sparse matrix plus a perturbation, then the convex program we propose produces an approximation with accuracy proportional to the size of the perturbation. Unlike traditional results on sparse recovery, where the sensing matrix produces independent measurements, our sensing operator is highly constrained (it assumes a tensor product structure). Therefore, proving recovery guarantees requires nonstandard techniques. Indeed, our approach relies on a novel result concerning tensor products of bipartite graphs, which may be of independent interest. This problem is motivated by the following application, among others. Consider a p×n data matrix D, consisting of n observations of p variables. Assume that the correlation matrix X := DD^T is (approximately) sparse in the sense that each of the p variables is significantly correlated with only a few others. Our results show that these significant correlations can be detected even if we have access to only a sketch of the data S = AD with A ∈ R^{m×p}.
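The following Python sketch is a rough, hypothetical illustration of the measurement model described in the abstract: it forms a sketch Y = AXB^T of a sparse X whose nonzeros are spread across rows and columns, and verifies the covariance-sketching identity SS^T = AXA^T for S = AD. The random binary A and B used here are stand-ins; the paper's actual constructions are based on adjacency matrices of bipartite graphs, and the recovery step (an ℓ1-minimization convex program) is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

p, m, k = 200, 40, 100   # ambient dimension, sketch dimension, nonzeros in X

def sparse_binary(m, p, d=4):
    """Hypothetical stand-in sketching matrix: d ones per column."""
    A = np.zeros((m, p))
    for j in range(p):
        A[rng.choice(m, size=d, replace=False), j] = 1.0
    return A

A = sparse_binary(m, p)
B = sparse_binary(m, p)

# A "distributed" sparse X: k nonzeros spread uniformly, so no single
# row or column carries a concentration of nonzeros.
X = np.zeros((p, p))
support = rng.choice(p * p, size=k, replace=False)
X.ravel()[support] = rng.standard_normal(k)

# The sketch: an m x m matrix, much smaller than the p x p unknown.
Y = A @ X @ B.T

# Covariance-sketching view: with data D (p x n) and X = D D^T,
# sketching the data as S = A D yields S S^T = A X A^T,
# i.e. the same measurement model with B = A.
n = 500
D = rng.standard_normal((p, n))
S = A @ D
assert np.allclose(S @ S.T, A @ (D @ D.T) @ A.T)
```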
Keywords
approximation theory; compressed sensing; graph theory; matrix algebra; tensors; approximation; bipartite graph; convex program; covariance sketching; graph sketching; sparse matrix recovery; sparse matrix sketching; tensor products; Approximation methods; Correlation; Covariance matrices; Optimization; Sparse matrices; Tensile stress; Vectors; ℓ1 minimization; sketching; distributed sparsity; multi-dimensional signal processing
fLanguage
English
Journal_Title
Information Theory, IEEE Transactions on
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.2015.2391251
Filename
7008533
Link To Document