Title :
Learning Markov graphs up to edit distance
Author :
Das, Abhik Kumar ; Netrapalli, Praneeth ; Sanghavi, Sujay ; Vishwanath, Sriram
Author_Institution :
Dept. of ECE, Univ. of Texas at Austin, Austin, TX, USA
Abstract :
This paper presents a rate-distortion approach to Markov graph learning. It provides lower bounds on the number of samples required for any algorithm to learn the Markov graph structure of a probability distribution up to a given edit distance. We first prove a general result for arbitrary probability distributions and then specialize it to Ising and Gaussian models. In particular, for both Ising and Gaussian models on p variables with degree at most d, we show that at least Ω((d - s/p) log p) samples are required for any algorithm to learn the graph structure up to edit distance s. Our bounds represent a strong converse: with fewer samples, the probability of error goes to 1 as the problem size increases. These results show that substantial gains in sample complexity may not be possible without paying a significant price in edit-distance error.
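A hedged restatement of the abstract's strong-converse claim in display math; the sample count n and the constant c are notational assumptions introduced here for illustration, not taken from the record:
\[
  n \;\le\; c \left( d - \frac{s}{p} \right) \log p
  \quad\Longrightarrow\quad
  \Pr\big[\text{the estimated graph differs from the true graph by more than } s \text{ edge edits}\big] \;\to\; 1
  \quad \text{as } p \to \infty,
\]
where p is the number of variables, d the maximum node degree, s the allowed edit-distance error, and c > 0 some constant. The precise constants and the exact form of the converse are given in the paper itself.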
Keywords :
Gaussian processes; Markov processes; graph theory; probability; Gaussian model; Ising model; Markov graph; edit distance; error probability; rate-distortion approach; Complexity theory; Computational modeling; Covariance matrix; Graphical models; Markov random fields; Vectors; Gaussian Markov model; Markov networks; graphical models; strong converse
Conference_Titel :
Information Theory Proceedings (ISIT), 2012 IEEE International Symposium on
Conference_Location :
Cambridge, MA
Print_ISBN :
978-1-4673-2580-6
Electronic_ISSN :
2157-8095
DOI :
10.1109/ISIT.2012.6284018