  • DocumentCode
    44180
  • Title
    Least Squares Superposition Codes With Bernoulli Dictionary Are Still Reliable at Rates up to Capacity

  • Author
    Takeishi, Yoshinari; Kawakita, Masanori; Takeuchi, Jun'ichi

  • Author_Institution
    Grad. Sch. of Inf. Sci. & Electr. Eng., Kyushu Univ., Fukuoka, Japan
  • Volume
    60
  • Issue
    5
  • fYear
    2014
  • fDate
May 2014
  • Firstpage
    2737
  • Lastpage
    2750
  • Abstract
    For the additive white Gaussian noise channel with an average power constraint, sparse superposition codes with least squares decoding were proposed by Barron and Joseph in 2010. The codewords are designed using a dictionary whose entries are drawn independently from a Gaussian distribution, and the error probability is shown to be exponentially small for all rates up to the capacity. This paper proves that when each entry of the dictionary is instead drawn from a Bernoulli distribution, the error probability is also exponentially small for all rates up to the capacity. The proof relies on a central limit theorem-type inequality that we establish for this analysis.
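    As a rough illustration of the scheme described above (a sketch of the standard Barron-Joseph setup, not the authors' exact construction; all parameter values and names below are our own assumptions), the following Python snippet builds a toy Bernoulli ±1 dictionary, forms a sparse superposition codeword as the sum of one column per section, and decodes by brute-force least squares over all candidate codewords.

    import numpy as np
    from itertools import product

    # Toy sketch of a sparse superposition code with a Bernoulli (+/-1)
    # dictionary and brute-force least squares decoding.  The sectioned
    # codeword structure and all parameter values are assumptions in the
    # spirit of the Barron-Joseph construction, not taken from the paper.

    rng = np.random.default_rng(0)

    n = 64      # block length
    L = 4       # number of sections (nonzero coefficients per codeword)
    M = 8       # columns per section; rate R = L * log2(M) / n
    snr = 15.0  # signal power P over noise variance (sigma^2 taken as 1)

    # Each dictionary entry is +1 or -1 with probability 1/2, scaled so that
    # a sum of L selected columns has average power P per symbol.
    scale = np.sqrt(snr / L)
    X = scale * rng.choice([-1.0, 1.0], size=(n, L * M))

    def encode(message):
        """message: sequence of L indices, each in range(M).  The codeword
        is the sum of one selected column from each dictionary section."""
        return sum(X[:, s * M + j] for s, j in enumerate(message))

    def decode(y):
        """Least squares (maximum likelihood) decoding by exhaustive search
        over all M**L codewords -- feasible only at toy sizes."""
        return min(product(range(M), repeat=L),
                   key=lambda msg: np.sum((y - encode(msg)) ** 2))

    msg = tuple(int(i) for i in rng.integers(M, size=L))
    y = encode(msg) + rng.standard_normal(n)  # AWGN with unit variance
    print("sent:   ", msg)
    print("decoded:", decode(y))

    With these toy parameters the rate is well below capacity, so the exhaustive least squares decoder recovers the message with high probability; the paper's contribution concerns the exponential decay of the error probability for all rates up to capacity when the dictionary is Bernoulli rather than Gaussian.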
  • Keywords
    AWGN channels; Gaussian distribution; decoding; error statistics; least squares approximations; Bernoulli dictionary; Bernoulli distribution; additive white Gaussian noise channel; average power constraint; codewords; error probability; least squares decoding; least squares superposition codes; sparse superposition codes; dictionaries; random variables; vectors; central limit theorem; Gaussian channel; channel coding theorem; exponential error bounds
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour

  • DOI
    10.1109/TIT.2014.2312728
  • Filename
    6776455