Title :
Average-case interactive communication
Author_Institution :
AT&T Bell Labs., Murray Hill, NJ, USA
Date :
9/1/1992
Abstract :
X and Y are random variables. Person Px knows X, person Py knows Y, and both know the joint probability distribution of the pair (X,Y). Using a predetermined protocol, they communicate over a binary error-free channel so that Py learns X; Px may or may not learn Y. The authors determine how many information bits must be transmitted (by both persons) on average. The results show that, when the arithmetic average number of bits is considered, there is no asymptotic advantage to Px knowing Y in advance, and four messages are asymptotically optimum. By contrast, for the worst-case number of bits, communication can be significantly reduced if Px knows Y in advance, and it is not known whether a constant number of messages is asymptotically optimum.
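The information-theoretic benchmark in this setting (per the Slepian-Wolf theorem cited in the keywords) is the conditional entropy H(X|Y), the minimum average number of bits needed for Py to learn X asymptotically. Below is a minimal sketch, not taken from the paper, that computes H(X|Y) for an illustrative joint distribution; the function name and example probabilities are assumptions chosen only to show the calculation.

```python
# Minimal sketch: conditional entropy H(X|Y) of a joint pmf, the asymptotic
# benchmark for the average number of bits Py needs to learn X.
# The example distribution below is illustrative, not from the paper.
from math import log2


def conditional_entropy(joint):
    """H(X|Y) in bits for a joint pmf given as {(x, y): p}."""
    # Marginal distribution of Y.
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    # H(X|Y) = -sum_{x,y} p(x,y) * log2 p(x|y).
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            h -= p * log2(p / p_y[y])
    return h


# Hypothetical example: X uniform on {0,1,2,3}; Y equals X with prob 3/4,
# otherwise Y is uniform over the three other values.
joint = {}
for x in range(4):
    for y in range(4):
        joint[(x, y)] = 0.25 * (0.75 if x == y else 0.25 / 3)

print(f"H(X|Y) = {conditional_entropy(joint):.3f} bits")
```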
Keywords :
information theory; protocols; Slepian-Wolf theorem; arithmetic average number of bits; binary error-free channel; communication complexity; information bits; interactive communication; joint probability distribution; predetermined protocol; random variables; Arithmetic; Data compression; Entropy; Probability distribution; Protocols; Random variables; Transmitters;
Journal_Title :
IEEE Transactions on Information Theory