DocumentCode :
2517578
Title :
Communication requirements for generating correlated random variables
Author :
Cuff, Paul
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., Stanford, CA
fYear :
2008
fDate :
6-11 July 2008
Firstpage :
1393
Lastpage :
1397
Abstract :
Two familiar notions of correlation are re-discovered as extreme operating points for simulating a discrete memoryless channel, in which a channel output is generated based only on a description of the channel input. Wyner's "common information" coincides with the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description.
Keywords :
information theory; Shannon's mutual information; communication requirements; correlated random variables; discrete memoryless channel; Data mining; Decoding; Entropy; Estimation error; Memoryless systems; Mutual information; Random number generation; Random variables; Source coding; Testing;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2008 IEEE International Symposium on Information Theory (ISIT 2008)
Conference_Location :
Toronto, ON
Print_ISBN :
978-1-4244-2256-2
Electronic_ISBN :
978-1-4244-2257-9
Type :
conf
DOI :
10.1109/ISIT.2008.4595216
Filename :
4595216