Title :
Sparse covariance estimation under Kronecker product structure
Author :
Tsiligkaridis, Theodoros ; Hero, Alfred O.
Author_Institution :
EECS Dept., Univ. of Michigan, Ann Arbor, MI, USA
Abstract :
We introduce a sparse covariance estimation method for the high-dimensional setting in which the covariance matrix decomposes as a Kronecker product, i.e., Σ0 = A0 ⊗ B0, and the observations are Gaussian. We propose an ℓ1-penalized maximum-likelihood approach to solve this problem. The dual formulation motivates an iterative algorithm (penalized flip-flop, FFP) based on block coordinate descent. Although the ℓ1-penalized log-likelihood objective is in general non-convex and non-smooth, we show that FFP converges to a local maximum under relatively mild assumptions. For the fixed-dimension case, large-sample statistical consistency is proved and a rate-of-convergence bound is derived. Simulations show that FFP outperforms its non-penalized counterpart and the naive Glasso algorithm for sparse Kronecker-decomposable covariance matrices.
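Algorithm_Sketch :
The penalized flip-flop (FFP) described in the abstract alternates block coordinate-descent updates over the two Kronecker factors. The sketch below illustrates one way such an iteration can look, using scikit-learn's graphical lasso for each ℓ1-penalized factor update; the data layout (n samples of p×q matrices), the normalizations, and where the penalty is applied are assumptions made for illustration and are not taken from the paper.

import numpy as np
from sklearn.covariance import graphical_lasso

def penalized_flip_flop(X, lam_a, lam_b, n_iter=10):
    # X: (n, p, q) array of n i.i.d. matrix-variate Gaussian samples,
    # assumed to have row covariance A (p x p) and column covariance B (q x q),
    # so the vectorized samples have covariance Sigma = A kron B
    # (up to the vec convention used).
    n, p, q = X.shape
    A = np.eye(p)  # initial row-factor estimate
    B = np.eye(q)  # initial column-factor estimate
    for _ in range(n_iter):
        # Conditional sample covariance of the row factor given current B
        B_inv = np.linalg.inv(B)
        S_A = sum(Xi @ B_inv @ Xi.T for Xi in X) / (n * q)
        # l1-penalized (graphical lasso) update of the A factor
        A, _ = graphical_lasso(S_A, alpha=lam_a)
        # Conditional sample covariance of the column factor given current A
        A_inv = np.linalg.inv(A)
        S_B = sum(Xi.T @ A_inv @ Xi for Xi in X) / (n * p)
        # l1-penalized update of the B factor
        B, _ = graphical_lasso(S_B, alpha=lam_b)
    # Note: A kron B is only identified up to a scale swap between the factors.
    return A, B

The returned factors give an estimate of the full covariance via np.kron(A, B); in practice one would also fix the scale ambiguity between the factors and monitor convergence of the objective rather than running a fixed number of iterations.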
Keywords :
Gaussian processes; covariance matrices; iterative methods; maximum likelihood estimation; sparse matrices; ℓ1-penalized log-likelihood function; FFP; Gaussian observation; Kronecker product structure; block coordinate-descent approach; covariance matrix decomposition; iterative algorithm; large-sample statistical consistency; maximum-likelihood approach; naive Glasso algorithm; sparse Kronecker-decomposable covariance matrix; sparse covariance estimation method; Biological system modeling; Convergence; Covariance matrix; Educational institutions; Maximum likelihood estimation; Sparse matrices; Glasso; direct product; dual optimization; high dimensional inference; penalized maximum likelihood;
Conference_Title :
2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Kyoto, Japan
Print_ISBN :
978-1-4673-0045-2
ISSN :
1520-6149
DOI :
10.1109/ICASSP.2012.6288703