Author:
Erez, Uri ; Shamai, Shlomo ; Zamir, Ram
Abstract:
We consider the generalized dirty-paper channel Y = X + S + N, E{X²} ≤ P_X, where N is not necessarily Gaussian, and the interference S is known causally or noncausally to the transmitter. We derive worst case capacity formulas and strategies for "strong" or arbitrarily varying interference. In the causal side information (SI) case, we develop a capacity formula based on minimum noise entropy strategies. We then show that strategies associated with entropy-constrained quantizers provide lower and upper bounds on the capacity. At high signal-to-noise ratio (SNR) conditions, i.e., if N is weak relative to the power constraint P_X, these bounds coincide, the optimum strategies take the form of scalar lattice quantizers, and the capacity loss due to not having S at the receiver is shown to be exactly the "shaping gain" (1/2)log(2πe/12) ≈ 0.254 bit. We extend the schemes to obtain achievable rates at any SNR and to noncausal SI, by incorporating minimum mean-squared error (MMSE) scaling, and by using k-dimensional lattices. For Gaussian N, the capacity loss of this scheme is upper-bounded by (1/2)log(2πeG(Λ)), where G(Λ) is the normalized second moment of the lattice. With a proper choice of lattice, the loss goes to zero as the dimension k goes to infinity, in agreement with the results of Costa. These results provide an information-theoretic framework for the study of common communication problems such as precoding for intersymbol interference (ISI) channels and broadcast channels.
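The abstract describes a scalar modulo-lattice precoding scheme with MMSE scaling and dither: the transmitter pre-subtracts the scaled known interference modulo a lattice cell, and the receiver scales and folds back modulo the same cell. The following is a minimal NumPy sketch of that idea for a one-dimensional (scalar) lattice; the cell size `Delta`, noise and interference powers, and variable names are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scalar lattice: step Delta, with mod mapping into [-Delta/2, Delta/2)
Delta = 2.0
def mod_lattice(x):
    return (x + Delta / 2) % Delta - Delta / 2

n = 10_000
Px = Delta**2 / 12            # power of a signal uniform over one lattice cell
Pn = 1e-4                     # weak noise: the high-SNR regime of the abstract
alpha = Px / (Px + Pn)        # MMSE scaling coefficient

v = rng.uniform(-Delta / 2, Delta / 2, n)   # message points inside the cell
s = rng.normal(0.0, 10.0, n)                # strong interference, known at the Tx
d = rng.uniform(-Delta / 2, Delta / 2, n)   # dither shared by Tx and Rx

x = mod_lattice(v - alpha * s - d)          # transmitted signal, power ~ Px
y = x + s + rng.normal(0.0, np.sqrt(Pn), n) # channel Y = X + S + N
v_hat = mod_lattice(alpha * y + d)          # receiver estimate of v

err = mod_lattice(v_hat - v)                # effective noise: alpha*N - (1-alpha)*X
print("residual MSE:", np.mean(err**2))
print("interference power:", np.var(s))
```

Despite the interference power being orders of magnitude above the transmit power, the residual error after the modulo operation is on the order of the MMSE noise variance P_X·P_N/(P_X+P_N), illustrating why the strong interference S costs (almost) nothing in this scheme.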
Keywords:
broadcast channels; channel capacity; channel coding; entropy codes; interference suppression; intersymbol interference; mean square error methods; quantisation (signal); random codes; ISI channel; MMSE scaling; broadcast channel; capacity formula; causal SI; entropy-constrained quantizer; generalized dirty-paper channel; information-theoretic framework; known interference canceling; minimum mean-squared error; minimum noise entropy strategy; noncausal SI; power constraint; precoding; randomized code; scalar lattice quantizer; side information; additive noise; entropy; Gaussian noise; information theory; interference cancellation; lattices; signal to noise ratio; transmitters; upper bound; causal side information (SI); common randomness; dirty-paper channel; dither; interference; minimum mean-squared error (MMSE) estimation