Title :
Communication requirements for generating correlated random variables
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., Stanford, CA
Abstract :
Two familiar notions of correlation are rediscovered as extreme operating points for simulating a discrete memoryless channel, in which a channel output is generated based only on a description of the channel input. Wyner's “common information” coincides with the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description.
Keywords :
information theory; Shannon's mutual information; communication requirements; correlated random variables; discrete memoryless channel; Data mining; Decoding; Entropy; Estimation error; Memoryless systems; Mutual information; Random number generation; Random variables; Source coding; Testing;
Conference_Title :
2008 IEEE International Symposium on Information Theory (ISIT 2008)
Conference_Location :
Toronto, ON
Print_ISBN :
978-1-4244-2256-2
Electronic_ISBN :
978-1-4244-2257-9
DOI :
10.1109/ISIT.2008.4595216