DocumentCode :
1017995
Title :
Three results on interactive communication
Author :
Naor, Moni ; Orlitsky, Alon
Author_Institution :
Dept. of Appl. Math. & Comput. Sci., Weizmann Inst. of Sci., Rehovot
Volume :
39
Issue :
5
fYear :
1993
fDate :
9/1/1993
Firstpage :
1608
Lastpage :
1615
Abstract :
X and Y are random variables. Person Px knows X, person Py knows Y, and both know the underlying probability distribution of the random pair (X, Y). Using a predetermined protocol, they exchange messages over a binary, error-free channel so that Py learns X; Px may or may not learn Y. Ĉm is the number of information bits that must be transmitted (by both persons) in the worst case when only m messages are allowed, and Ĉ∞ is the corresponding number of bits when there is no restriction on the number of messages exchanged. We consider three aspects of this problem.

Ĉ4. It is known that one-message communication may require exponentially more bits than the minimum possible: for some random pairs, Ĉ1 = 2^(Ĉ∞−1). Yet just two messages suffice to reduce communication to almost the minimum: for all random pairs, Ĉ2 ≤ 4Ĉ∞ + 3. We show that, asymptotically, four messages require at most three times the minimum number of bits: for all random pairs, Ĉ4 ≤ 3Ĉ∞ + o(Ĉ∞).

Balanced pairs. Let μ̂ be the maximum number of X values possible with a given Y value, and let η̂ be the maximum number of Y values possible with a given X value. A random pair is balanced if μ̂ = η̂. It is known that for all balanced pairs, three messages require at most log μ̂ + o(log μ̂) bits, hence are asymptotically optimum. It was not known whether two messages are asymptotically optimum. We show that for every c and every positive ε there is a balanced pair such that Ĉ2 ≥ (2−ε)Ĉ∞ ≥ c. Asymptotically, this is the largest possible discrepancy.

Amortized complexity. The amortized complexity of (X, Y) is the limit, as k grows, of the number of bits required in the worst case for k independent repetitions of (X, Y), normalized by k. We show that the four-message amortized complexity of all random pairs is exactly log μ̂. Hence, when a random pair is repeated many times, no bits can be saved even if Px knows Y in advance.
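To make the quantities μ̂ and η̂ concrete, the following Python sketch (not from the paper; the function ambiguity_stats and the toy support set are invented for illustration) computes them from an assumed support of (X, Y) and prints the log μ̂ value that appears both in the per-instance lower bound and in the amortized-complexity result.

import math
from collections import defaultdict

def ambiguity_stats(support):
    # support: iterable of (x, y) pairs that have positive probability.
    x_given_y = defaultdict(set)   # y -> set of X values compatible with y
    y_given_x = defaultdict(set)   # x -> set of Y values compatible with x
    for x, y in support:
        x_given_y[y].add(x)
        y_given_x[x].add(y)
    mu_hat = max(len(s) for s in x_given_y.values())    # μ̂
    eta_hat = max(len(s) for s in y_given_x.values())   # η̂
    return mu_hat, eta_hat

# Hypothetical support, invented for illustration: X is an ordered pair of
# distinct "teams" (a, b) and Y is one of the two teams in the pair.
teams = range(8)
support = [((a, b), y) for a in teams for b in teams if a != b for y in (a, b)]

mu_hat, eta_hat = ambiguity_stats(support)
print("mu_hat =", mu_hat, "| eta_hat =", eta_hat,
      "| balanced" if mu_hat == eta_hat else "| unbalanced")

# Per-instance lower bound: for some Y value, Py must distinguish up to mu_hat
# candidate X values, so at least ceil(log2(mu_hat)) bits are needed even if
# Px knew Y in advance.
print("ceil(log2 mu_hat) =", math.ceil(math.log2(mu_hat)))

# The paper's amortized result: with four messages, the worst-case bits per
# repetition over many independent repetitions approach exactly log2(mu_hat).
print("log2 mu_hat =", math.log2(mu_hat))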
Keywords :
communication complexity; information theory; probability; protocols; telecommunication channels; amortized complexity; asymptotically optimum messages; balanced pairs; binary error-free channel; interactive communication; predetermined protocol; probability distribution; random pair; random variables; Computer errors; Computer science; Data compression; Information theory; Probability distribution; Protocols; Random variables
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.259644
Filename :
259644