Title :
Information for inference
Author :
Xu, Ge; Chen, Biao
Author_Institution :
Dept. of EECS, Syracuse Univ., Syracuse, NY, USA
Abstract :
Wyner defined the notion of common information of two discrete random variables as the minimum of I(W; X, Y) over all W that render X and Y conditionally independent. Its generalization to multiple dependent random variables revealed a surprising monotonicity property in the number of variables. Motivated by this monotonicity property, this paper explores the application of Wyner's common information to inference problems and its connection with other performance metrics. A central question is under what conditions Wyner's common information captures the entire information that the observations contain about the inference object under a simple Bayesian model. For infinitely exchangeable random variables, it is shown using the de Finetti-Hewitt-Savage theorem that the common information is asymptotically equal to the information of the inference object. For finitely exchangeable random variables, this conclusion no longer holds, even for infinitely extendable sequences. However, in some special cases, including both the binary and the Gaussian cases, a concrete connection between common information and inference performance metrics can be established even for finite samples.
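As a reference for the quantity discussed above, a minimal LaTeX sketch of the definition follows; the n-variable form and the closed-form expressions for the binary and Gaussian cases are standard results from Wyner's 1975 paper and its later extensions, quoted here for context rather than taken from this abstract:

  C(X; Y) = \min_{W \,:\, X - W - Y} I(W; X, Y)

  C(X_1, \ldots, X_n) = \min_{W \,:\, X_1, \ldots, X_n \ \text{conditionally independent given}\ W} I(W; X_1, \ldots, X_n)

For the binary case mentioned in the abstract, Wyner's closed form for a doubly symmetric binary source with \Pr(X \neq Y) = a_0 \le 1/2 is

  C(X; Y) = 1 + h(a_0) - 2\, h(a_1), \qquad a_1 = \tfrac{1}{2}\bigl(1 - \sqrt{1 - 2 a_0}\bigr),

where h(\cdot) denotes the binary entropy function. For a bivariate Gaussian pair with correlation coefficient \rho, the common information is known to be C(X; Y) = \tfrac{1}{2} \log \frac{1 + |\rho|}{1 - |\rho|}.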
Keywords :
Bayes methods; Gaussian processes; information theory; Bayesian model; de Finetti-Hewitt-Savage theorem; Gaussian case; Wyner common information; binary case; dependent random variables; discrete random variables; inference problems; monotonicity property; performance metrics; Additives; Bayesian methods; Joints; Measurement; Mutual information; Random variables
Conference_Title :
2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
Conference_Location :
Monticello, IL
Print_ISBN :
978-1-4577-1817-5
DOI :
10.1109/Allerton.2011.6120347