DocumentCode
65053
Title
Achieving AWGN Channel Capacity With Lattice Gaussian Coding
Author
Ling, Cong ; Belfiore, Jean-Claude
Author_Institution
Dept. of Electr. & Electron. Eng., Imperial Coll. London, London, U.K.
Volume
60
Issue
10
fYear
2014
fDate
Oct. 2014
Firstpage
5918
Lastpage
5929
Abstract
We propose a new coding scheme that uses only one lattice and achieves the 1/2 log(1 + SNR) capacity of the additive white Gaussian noise (AWGN) channel with lattice decoding, which is currently provable for signal-to-noise ratio SNR > e. The scheme applies a discrete Gaussian distribution over an AWGN-good lattice and otherwise requires neither a shaping lattice nor a dither. It thus significantly simplifies the default lattice coding scheme of Erez and Zamir, which involves a quantization-good lattice in addition to an AWGN-good lattice. Using the flatness factor, we show that the error probability of the proposed scheme under minimum mean-square error (MMSE) lattice decoding is almost the same as that of Erez and Zamir, for any rate up to the AWGN channel capacity. We introduce the notion of good constellations, which carry almost the same mutual information as continuous Gaussian inputs. We also address the implementation of Gaussian shaping for the proposed lattice Gaussian coding scheme.
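For reference, here is a sketch of the two definitions central to the abstract, stated in the standard form used in the lattice Gaussian coding literature; the notation (a lattice Λ with fundamental region R(Λ) of volume V(Λ), and Gaussian parameter σ) is assumed rather than quoted from the paper. The discrete Gaussian distribution over Λ centered at c is

\[ D_{\Lambda,\sigma,\mathbf{c}}(\mathbf{x}) = \frac{\rho_\sigma(\mathbf{x}-\mathbf{c})}{\rho_\sigma(\Lambda-\mathbf{c})}, \quad \mathbf{x}\in\Lambda, \qquad \rho_\sigma(\mathbf{x}) = e^{-\|\mathbf{x}\|^2/(2\sigma^2)}, \]

and the flatness factor, which bounds how far the lattice-periodic Gaussian density deviates from the uniform density on R(Λ), is

\[ \epsilon_\Lambda(\sigma) = \max_{\mathbf{u}\in\mathcal{R}(\Lambda)} \left| V(\Lambda) \sum_{\boldsymbol{\lambda}\in\Lambda} f_\sigma(\mathbf{u}+\boldsymbol{\lambda}) - 1 \right|, \qquad f_\sigma(\mathbf{x}) = \frac{e^{-\|\mathbf{x}\|^2/(2\sigma^2)}}{(\sqrt{2\pi}\sigma)^n}. \]

As an illustration of the Gaussian shaping mentioned in the last sentence of the abstract, the following is a minimal Python sketch of discrete Gaussian sampling for the simplest lattice Λ = Z^n, where the distribution factorizes into independent one-dimensional samplers. The function names are hypothetical, and this does not reproduce the paper's shaping scheme for general lattices.

import math
import random

def sample_dgauss_1d(sigma, center=0.0, tail=12):
    # Draw from D_{Z, sigma, center} by enumerating a truncated
    # support of +/- tail*sigma around the center; the tail mass
    # beyond ~12 sigma is negligible at double precision.
    lo = int(math.floor(center - tail * sigma))
    hi = int(math.ceil(center + tail * sigma))
    support = range(lo, hi + 1)
    weights = [math.exp(-(x - center) ** 2 / (2 * sigma ** 2)) for x in support]
    r = random.random() * sum(weights)
    acc = 0.0
    for x, w in zip(support, weights):
        acc += w
        if r <= acc:
            return x
    return hi  # guard against floating-point round-off

def sample_dgauss_zn(n, sigma):
    # D_{Z^n, sigma} is a product distribution, so sample coordinate-wise.
    return [sample_dgauss_1d(sigma) for _ in range(n)]

# Example: a length-8 shaping vector with sigma = 3.0
print(sample_dgauss_zn(8, 3.0))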
Keywords
AWGN channels; Gaussian channels; channel coding; mean square error methods; signal processing; Gaussian coding scheme; Gaussian shaping; achieving AWGN channel capacity; additive white Gaussian noise; continuous Gaussian inputs; discrete Gaussian distribution; error probability; lattice Gaussian coding; lattice decoding; mean-square error lattice decoding; approximation methods; decoding; encoding; Gaussian distribution; lattices; signal-to-noise ratio; channel capacity; MMSE; flatness factor; lattice Gaussian distribution; lattice coding
fLanguage
English
Journal_Title
IEEE Transactions on Information Theory
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.2014.2332343
Filename
6841610
Link To Document