DocumentCode :
1761933
Title :
Network Compression: Worst Case Analysis
Author :
Asnani, Himanshu ; Shomorony, Ilan ; Avestimehr, A. Salman ; Weissman, Tsachy
Author_Institution :
Ericsson R&D Silicon Valley, San Jose, CA, USA
Volume :
61
Issue :
7
fYear :
2015
fDate :
July 2015
Firstpage :
3980
Lastpage :
3995
Abstract :
We study the problem of communicating a distributed correlated memoryless source over a memoryless network, from source nodes to destination nodes, under quadratic distortion constraints. We establish the following two complementary results: 1) for an arbitrary memoryless network, among all distributed memoryless sources of a given correlation, Gaussian sources are the least compressible, that is, they admit the smallest set of achievable distortion tuples; and 2) for any memoryless source to be communicated over a memoryless additive-noise network, among all noise processes of a given correlation, Gaussian noise admits the smallest achievable set of distortion tuples. We establish these results constructively by showing how schemes for the corresponding Gaussian problems can be applied to achieve similar performance for (source or noise) distributions that are not necessarily Gaussian but have the same covariance.
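As a hedged sketch of result 1 (the set notation below is my own shorthand, not taken from the paper): writing $\mathcal{D}_{\mathsf{N}}(P_S)$ for the set of achievable distortion tuples when the source has joint law $P_S$ and the memoryless network $\mathsf{N}$ is fixed, the worst-case claim can be read as
% Sketch of result 1: among all source laws with covariance K, the Gaussian
% source N(0, K) admits the smallest achievable distortion region.
\[
  \mathcal{D}_{\mathsf{N}}\bigl(\mathcal{N}(0, K)\bigr)
  \;\subseteq\;
  \mathcal{D}_{\mathsf{N}}(P_S)
  \qquad \text{for every source law } P_S \text{ with } \operatorname{Cov}(P_S) = K .
\]
% Result 2 is the symmetric statement: for a fixed source and an additive-noise
% network, Gaussian noise is worst among all noise laws of the same covariance.
Result 2 is the symmetric statement with the roles of source and additive noise exchanged.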
Keywords :
AWGN channels; Gaussian distribution; combined source-channel coding; correlation theory; data compression; memoryless systems; network coding; Gaussian noise; Gaussian source; arbitrary memoryless network; distortion tuples; distributed correlated memoryless source; memoryless additive noise network; memoryless network; network compression; source node; worst case analysis; Covariance matrices; Decoding; Distortion; Joints; Noise; Source coding; Worst-case source; joint source-channel coding; worst-case noise
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2015.2434829
Filename :
7122879