Title :
On model misspecification and KL separation for Gaussian graphical models
Author :
Varun Jog; Po-Ling Loh
Author_Institution :
Department of EECS, University of California at Berkeley, 94720, USA
Date :
6/1/2015
Abstract :
We establish bounds on the KL divergence between two multivariate Gaussian distributions in terms of the Hamming distance between the edge sets of the corresponding graphical models. We show that the KL divergence is bounded below by a constant whenever the graphs differ in at least one edge; this is essentially the tightest possible bound, since there exist classes of graphs for which the edge discrepancy grows while the KL divergence remains bounded above by a constant. As a natural corollary to our KL lower bound, we also establish a sample size requirement for correct model selection via maximum likelihood estimation. Our results formalize the intuition that accurately estimating the edge structure of a Gaussian graphical model is essential for approximating the true distribution closely.
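The abstract compares zero-mean multivariate Gaussians whose graphical models differ in their edge sets. A minimal sketch of that comparison, using the standard closed-form KL divergence between Gaussians (the matrices, edge weights, and the helper `gaussian_kl` below are illustrative choices, not the paper's construction):

```python
import numpy as np

def gaussian_kl(prec_p, prec_q):
    """D(P || Q) between zero-mean Gaussians given by precision
    (inverse covariance) matrices Theta_p, Theta_q, via the
    closed form D(P||Q) = 0.5 * (tr(Sigma_q^{-1} Sigma_p) - d
    + log det Sigma_q - log det Sigma_p), rewritten in precisions."""
    d = prec_p.shape[0]
    cov_p = np.linalg.inv(prec_p)          # Sigma_p = Theta_p^{-1}
    trace_term = np.trace(prec_q @ cov_p)  # tr(Theta_q Sigma_p)
    _, logdet_p = np.linalg.slogdet(prec_p)
    _, logdet_q = np.linalg.slogdet(prec_q)
    # log det Sigma_q - log det Sigma_p = log det Theta_p - log det Theta_q
    return 0.5 * (trace_term - d + logdet_p - logdet_q)

# Two 3-node models whose graphs differ in exactly one edge, (1, 2):
theta_p = np.array([[1.0, 0.3, 0.0],
                    [0.3, 1.0, 0.0],
                    [0.0, 0.0, 1.0]])
theta_q = theta_p.copy()
theta_q[1, 2] = theta_q[2, 1] = 0.3  # add the edge (1, 2)

kl = gaussian_kl(theta_p, theta_q)   # strictly positive: distributions differ
```

Because the nonzero pattern of the precision matrix is exactly the edge set of the graphical model, flipping a single off-diagonal entry is the single-edge Hamming perturbation the abstract refers to, and the resulting KL divergence is bounded away from zero.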
Keywords :
"Graphical models","Covariance matrices","Estimation","Gaussian distribution","Hamming distance","Mutual information","Parameter estimation"
Conference_Titel :
2015 IEEE International Symposium on Information Theory (ISIT)
Electronic_ISSN :
2157-8117
DOI :
10.1109/ISIT.2015.7282640