Title :
Bounds on inference
Author :
Calmon, Flavio P. ; Varia, Mayank ; Médard, Muriel ; Christiansen, Mark M. ; Duffy, Ken R. ; Tessaro, Stefano
Author_Institution :
Res. Lab. of Electron., Massachusetts Inst. of Technol., Cambridge, MA, USA
Abstract :
Lower bounds for the average probability of error of estimating a hidden variable X given an observation of a correlated random variable Y, and Fano's inequality in particular, play a central role in information theory. In this paper, we present a lower bound for the average estimation error based on the marginal distribution of X and the principal inertias of the joint distribution matrix of X and Y. Furthermore, we discuss an information measure based on the sum of the largest principal inertias, called k-correlation, which generalizes maximal correlation. We show that k-correlation satisfies the Data Processing Inequality and is convex in the conditional distribution of Y given X. Finally, we investigate how to answer a fundamental question in inference and privacy: given an observation Y, can we estimate a function f(X) of the hidden random variable X with an average error below a certain threshold? We provide a general method for answering this question using an approach based on rate-distortion theory.
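The principal inertias mentioned in the abstract are, in the standard correspondence-analysis formulation, the squared singular values (excluding the trivial value 1) of the joint distribution matrix normalized by the marginals. The sketch below, which is an illustration and not the paper's own code, computes them this way and sums the k largest to obtain the k-correlation; the function names `principal_inertias` and `k_correlation` are this sketch's own.

```python
import numpy as np

def principal_inertias(P):
    """Principal inertias of a joint pmf matrix P (shape |X| x |Y|).

    Following the correspondence-analysis decomposition: they are the
    squared singular values, excluding the trivial singular value 1, of
    D_X^{-1/2} P D_Y^{-1/2}, where D_X and D_Y are diagonal matrices of
    the marginals. The largest principal inertia equals the squared
    maximal correlation of X and Y.
    """
    px = P.sum(axis=1)                      # marginal distribution of X
    py = P.sum(axis=0)                      # marginal distribution of Y
    Q = P / np.sqrt(np.outer(px, py))       # D_X^{-1/2} P D_Y^{-1/2}
    s = np.linalg.svd(Q, compute_uv=False)  # singular values, descending
    return s[1:] ** 2                       # drop the trivial value 1

def k_correlation(P, k):
    """Sum of the k largest principal inertias of P."""
    return principal_inertias(P)[:k].sum()
```

For example, if X = Y uniform on two symbols (P = diag(0.5, 0.5)), the single principal inertia is 1, matching a maximal correlation of 1; for independent X and Y all principal inertias vanish.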
Keywords :
error statistics; estimation theory; matrix algebra; rate distortion theory; Fano inequality; average estimation error; average probability of error; data processing inequality; information theory; joint distribution matrix; k-correlation; marginal distribution; maximal correlation; principal inertias; rate-distortion theory; Correlation; Data processing; Estimation error; Joints; Matrix decomposition; Random variables; Security;
Conference_Title :
2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton)
Conference_Location :
Monticello, IL
Print_ISBN :
978-1-4799-3409-6
DOI :
10.1109/Allerton.2013.6736575